The Bidirectional Associative Memory (BAM) is extended to a multi-layer architecture. In this model, the interconnection weights between adjacent layers are determined by the Hebbian learning rule, which permits bidirectionality and an inner-product implementation. An adaptive learning algorithm, based on gradient-descent error minimization with error back-propagation, is used to optimize the hidden-layer activations for each input-output pair. The error for gradient-descent minimization is defined at both the input and the output. Bidirectional recall improves the error-correction rate. Computer simulations show better performance than the Perceptron in both classifier and hetero-associative memory applications.
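To make the underlying mechanism concrete, the following is a minimal sketch of the standard single-layer BAM building block referred to above: Hebbian weights between two layers and bidirectional recall by iterated inner products. It is an illustration of the classical BAM, not the multi-layer adaptive algorithm itself; the bipolar patterns and function names are illustrative assumptions.

```python
import numpy as np

def hebbian_weights(pairs):
    """Hebbian learning rule: W = sum over pairs of y_k x_k^T.

    pairs: list of (x, y) bipolar (+1/-1) pattern vectors.
    """
    return sum(np.outer(y, x) for x, y in pairs)

def bipolar_sign(v):
    """Threshold activation mapping to bipolar values (ties go to +1)."""
    return np.where(v >= 0, 1, -1)

def recall(W, x, steps=5):
    """Bidirectional recall: alternate x -> y -> x via inner products
    until the pair stabilizes (a fixed number of sweeps suffices here)."""
    for _ in range(steps):
        y = bipolar_sign(W @ x)       # forward pass
        x = bipolar_sign(W.T @ y)     # backward pass
    return x, y

# Store two hypothetical associations.
x1 = np.array([ 1, -1,  1, -1,  1, -1])
y1 = np.array([ 1,  1, -1, -1])
x2 = np.array([-1, -1,  1,  1, -1,  1])
y2 = np.array([-1,  1,  1, -1])
W = hebbian_weights([(x1, y1), (x2, y2)])

# Recall from a corrupted input: bidirectional iteration corrects the error.
noisy = x1.copy()
noisy[0] = -noisy[0]                  # flip one bit
x_rec, y_rec = recall(W, noisy)
```

Here bidirectional recall recovers both the stored output `y1` and the clean input `x1` from the corrupted probe, illustrating the error-correction property the abstract refers to.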