For large NP-complete optimization problems, exact optimization algorithms are impractical because they take exponential time to reach a global optimum. Approximation algorithms such as Boltzmann machines and Genetic Algorithms suit this kind of problem because they can obtain a near-optimal solution in polynomial time.
Motivated by these two optimization paradigms, we propose a hybrid optimization method that runs the Genetic Algorithm with the energy function of Boltzmann machines. The energy function serves as a general form of the cost functions of Genetic Algorithms, which narrows the large gap between problems and their Genetic Algorithm representations. In addition, we strengthen the Genetic Algorithm's short-distance search with betterment operators, which improve an individual by local search or simulated annealing instead of the mutation operator's blind search.
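The hybrid can be sketched as follows. This is a minimal illustration under assumptions of our own, not the paper's exact implementation: binary unit states, the standard Boltzmann energy E(s) = -1/2 sᵀWs - θᵀs as the cost function, greedy bit-flip local search as the betterment operator, and a toy independent-set instance (a three-vertex path graph) with heuristically chosen weights.

```python
import random

def energy(state, W, theta):
    # Boltzmann-machine energy E(s) = -1/2 * s^T W s - theta^T s
    # (W symmetric with zero diagonal), used as the GA cost function.
    n = len(state)
    e = -sum(theta[i] * state[i] for i in range(n))
    e -= 0.5 * sum(W[i][j] * state[i] * state[j]
                   for i in range(n) for j in range(n))
    return e

def betterment(state, W, theta):
    # Betterment operator (here: greedy local search): flip any bit
    # that lowers the energy, and repeat until no flip helps.
    s = list(state)
    improved = True
    while improved:
        improved = False
        for i in range(len(s)):
            cand = s[:]
            cand[i] = 1 - cand[i]
            if energy(cand, W, theta) < energy(s, W, theta):
                s = cand
                improved = True
    return s

def hybrid_ga(W, theta, pop_size=20, generations=50, seed=0):
    # GA whose fitness is the Boltzmann energy; each child is
    # "bettered" by local search instead of being blindly mutated.
    rng = random.Random(seed)
    n = len(theta)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: energy(s, W, theta))   # lower energy = fitter
        parents = pop[: pop_size // 2]                # keep the better half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]                 # one-point crossover
            children.append(betterment(child, W, theta))
        pop = parents + children
    return min(pop, key=lambda s: energy(s, W, theta))

# Toy maximum-independent-set instance: path graph 0-1-2.
# theta rewards selecting a vertex; W penalizes selecting both ends of an edge.
theta = [1.0, 1.0, 1.0]
W = [[0.0, -4.0, 0.0],
     [-4.0, 0.0, -4.0],
     [0.0, -4.0, 0.0]]
best = hybrid_ga(W, theta)
```

Because the betterment operator returns only local minima of the energy, every child in the final population encodes a conflict-free vertex set for this weighting; the GA's role is to supply diverse starting points so that the deepest minimum is found.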
Our method is applied to traveling salesman problems and independent set problems represented as neural networks whose weights are set heuristically. We compare its performance with that of the Boltzmann machine and of a pure Genetic Algorithm in speed and solution quality. In both respects our method is better than or comparable to the Boltzmann machine and worse than the Genetic Algorithm.