In this thesis, a new constructive approach to building neural networks is proposed. The proposed algorithm constructs not only hidden neurons but also layers, using a quasi-Newton method as the learning algorithm. It determines the network structure by growing a small initial network into a larger one in a self-organizing way. The initial network is trained until it reaches a local minimum. When the network is trapped in a local minimum, an extra hidden neuron is added in such a way that the hyperplane defined by the new neuron passes through the datum causing the greatest error in the original network. The enlarged network is then retrained and its error decreases, since the added neuron approximates the residual error of the previous network and thereby allows training to escape the local minimum. A hidden neuron is added in the same way whenever the network reaches such a saturated state. A new layer is constructed when the addition of successive hidden neurons no longer improves performance. Because the network produced in this way may contain some redundant neurons, its size serves as an upper bound on the complexity needed to learn the task. Pruning the final trained network can therefore reduce its size and yield a more appropriate structure. A simple pruning method is proposed and hybridized with the construction algorithm. By pruning, both a reduction in network size and an improvement in generalization performance are achieved.
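The construction-and-pruning loop described above can be summarized in a short sketch. The Python code below is a minimal illustration under stated assumptions, not the thesis implementation: plain gradient descent stands in for the quasi-Newton update, layer construction is omitted, the stagnation test is a simple loss-improvement threshold, and all names (`GrowingNet`, `add_neuron`, `construct`, `prune`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class GrowingNet:
    """Single-hidden-layer sketch of the constructive scheme."""
    def __init__(self, n_in, n_hidden=1):
        self.W = rng.normal(scale=0.5, size=(n_hidden, n_in))  # hidden weights
        self.b = np.zeros(n_hidden)                            # hidden biases
        self.v = rng.normal(scale=0.5, size=n_hidden)          # output weights

    def forward(self, X):
        self.H = sigmoid(X @ self.W.T + self.b)  # hidden activations
        return self.H @ self.v                   # linear output

    def train_step(self, X, y, lr=0.1):
        # Gradient descent here stands in for the quasi-Newton step.
        err = self.forward(X) - y
        dH = np.outer(err, self.v) * self.H * (1.0 - self.H)
        self.v -= lr * self.H.T @ err / len(y)
        self.W -= lr * dH.T @ X / len(y)
        self.b -= lr * dH.mean(axis=0)
        return 0.5 * np.mean(err ** 2)

    def add_neuron(self, X, y):
        # Place the new neuron's hyperplane w.x + b = 0 through the
        # datum with the largest residual error.
        worst = np.argmax(np.abs(self.forward(X) - y))
        w = rng.normal(scale=0.5, size=X.shape[1])
        b = -w @ X[worst]                 # ensures w.x_worst + b = 0
        self.W = np.vstack([self.W, w])
        self.b = np.append(self.b, b)
        self.v = np.append(self.v, 0.0)   # no output influence until retrained

def construct(X, y, tol=1e-4, max_hidden=10, epochs=2000):
    net = GrowingNet(X.shape[1])
    prev = np.inf
    while net.W.shape[0] <= max_hidden:
        for _ in range(epochs):
            loss = net.train_step(X, y)
        if loss < tol:
            break
        if prev - loss < tol:             # stagnation: presumed local minimum
            net.add_neuron(X, y)
        prev = loss
    return net

def prune(net, X, y, tol=1e-3):
    # Drop hidden neurons whose removal barely changes the training error;
    # a crude stand-in for the simple pruning method of the thesis.
    base = 0.5 * np.mean((net.forward(X) - y) ** 2)
    keep = []
    for j in range(net.W.shape[0]):
        saved, net.v[j] = net.v[j], 0.0
        if 0.5 * np.mean((net.forward(X) - y) ** 2) - base > tol:
            keep.append(j)                # removing neuron j hurts: keep it
        net.v[j] = saved
    net.W, net.b, net.v = net.W[keep], net.b[keep], net.v[keep]
```

In this sketch, initializing the new bias as b = -w·x_worst places the hyperplane exactly through the worst-fit datum, while the zero output weight leaves the existing mapping unchanged until retraining assigns the new neuron a role.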