Neural Network
Definition
A computational model inspired by biological neural systems, composed of interconnected layers of nodes that learn patterns from data.
Neural networks are the foundation of modern machine learning. They consist of layers of interconnected nodes (neurons) that transform input data through learned weights and nonlinear activation functions. By adjusting these weights during training to minimize a loss function, the network learns to recognize patterns, make predictions, and generate outputs.
Deep neural networks (those with many layers) can learn hierarchical representations, with early layers capturing low-level features and deeper layers capturing abstract concepts. In speech recognition, early layers might detect acoustic features like formants, while deeper layers capture phonemes, words, and linguistic patterns.
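As a concrete sketch of the idea above, here is a minimal two-layer network trained by gradient descent to learn XOR, a classic pattern that no single-layer (linear) model can represent. This is an illustrative toy implemented in NumPy; the hidden width, learning rate, and iteration count are arbitrary choices, not part of the definition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR. Inputs and target outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of learned weights: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass: each layer is a weighted sum plus a nonlinearity.
    h = np.tanh(X @ W1 + b1)          # hidden representation
    p = sigmoid(h @ W2 + b2)          # predicted probability

    # Backward pass: gradient of binary cross-entropy w.r.t. the output logits.
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits;  db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T * (1 - h ** 2)  # backprop through tanh
    dW1 = X.T @ dh;       db1 = dh.sum(axis=0)

    # Adjust the weights to reduce the loss.
    W2 -= lr * dW2;  b2 -= lr * db2
    W1 -= lr * dW1;  b1 -= lr * db1

preds = (p > 0.5).astype(int)
print(preds.ravel())  # after training, matches the XOR targets 0, 1, 1, 0
```

The hidden tanh layer is what makes XOR learnable: it maps the inputs into a space where the two classes become linearly separable, a small instance of the hierarchical feature learning described above.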