This thesis develops a scheme for operating the Hopfield neural network model in parallel by means of a new state-updating rule, called the majority protocol. While the original Hopfield network may oscillate when operated in parallel, the stable-state convergence of the proposed parallel Hopfield network is guaranteed by the majority protocol. Under the majority protocol, a neuron in the parallel Hopfield model changes its state only when its net input is considerably larger (or smaller) than its threshold, regardless of the states of the other neurons that operate simultaneously. The stable-state convergence of the proposed network is proved theoretically by showing that the energy of the network always increases monotonically whenever a state transition under the majority protocol is issued. To demonstrate the usefulness of the majority protocol, the parallel Hopfield model with the majority protocol is applied to well-known combinatorial optimization problems. In addition, we simulate the parallel Hopfield network with respect to stable-state convergence, speed-up, and solution quality in two ways: one is a simulation on a sequential computer that ignores inter-processor communication overhead; the other is a simulation on the Super Cluster, a distributed-memory message-passing multiprocessor, that accounts for communication between processors. The two simulations show that the new Hopfield model operates in parallel with stable-state convergence and solution quality comparable to that of the original sequential Hopfield model.
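As one way to picture the updating rule described above, the following Python sketch updates all neurons simultaneously and flips a neuron only when its net input clears its threshold by a margin. The margin parameter `delta`, the binary 0/1 states, and the exact comparison form are illustrative assumptions, not details taken from the thesis.

```python
def majority_protocol_step(weights, thresholds, states, delta):
    """One synchronous (parallel) update step under a majority-protocol-style
    rule: a neuron changes state only when its net input is well above or
    well below its threshold; otherwise it keeps its current state."""
    # Net input of each neuron is computed from the *current* states of all
    # neurons, so every neuron can be updated simultaneously (in parallel).
    nets = [sum(w * s for w, s in zip(row, states)) for row in weights]
    new_states = []
    for net, theta, s in zip(nets, thresholds, states):
        if net > theta + delta:      # clearly above threshold: fire
            new_states.append(1)
        elif net < theta - delta:    # clearly below threshold: rest
            new_states.append(0)
        else:                        # within the margin band: keep state
            new_states.append(s)
    return new_states

# Tiny usage example with a symmetric weight matrix and binary states.
weights = [[0.0, 1.0],
           [1.0, 0.0]]
thresholds = [0.5, 0.5]
state = [1, 0]
state = majority_protocol_step(weights, thresholds, state, delta=0.2)
```

The margin `delta` is what distinguishes this rule from the ordinary synchronous update: a neuron whose net input lies within the band around its threshold keeps its state, which is the mechanism the thesis credits with preventing oscillation.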