Bibliographic Information
Title / Author: Performance analysis of optical neural network based on fractional Fourier transform, log-likelihood, and parallelism / Sang-Gil Shin (신상길).
Publication: [Daejeon : Korea Advanced Institute of Science and Technology (KAIST), 1998].

Holdings Information

Registration No.: 8009220
Location / Call No.: Academic Cultural Complex, Preservation Stacks / DEE 98042
Status: Available (not for loan)

Abstract

The optical neural network based on the fractional Fourier transform (FRT) has a simple optical architecture and is suitable for large-scale optical implementation. The FRT neural network with the mean square error has been proposed, but its performance has not been examined in detail. In this dissertation, the performance of this neural network is systematically analyzed on a pattern classification problem, and it is improved by the following methods. The mean square error is replaced with the log-likelihood in the FRT neural network, and parallelism is introduced, for a significant improvement in performance. Then, an FRT neural network using two cylindrical lenses is proposed and its performance is analyzed. Finally, the optimization of the fractional order, which is important in pattern classification with the FRT neural network, is solved by evolutionary programming.

First, the performance of the FRT neural network is analyzed and improved as follows. Seven alphabet patterns (A, B, C, D, E, F, and G) with 16 x 16 pixels are classified by the FRT neural network, and its recall rate is then tested with noisy patterns. However, it is found that the classification performance of the FRT neural network with the mean square error is limited for practical application. To improve both the learning convergence and the recall rate of the neural network, the mean square error is replaced with the log-likelihood. To study the effect of parallelism on the FRT neural network, parallelism is introduced into the FRT neural network with the mean square error; it initially improves the learning convergence but slightly diminishes the recall rate. It is found that the combination of the FRT, the log-likelihood, and parallelism significantly improves both the learning convergence and the recall rate of the neural network.

Second, the fractional orders associated with the horizontal and vertical axes are controlled independently to classify various types of patterns. The proposed FRT neural network uses two cylindrical lenses instead of a spherical lens, and it classifies elongated patterns better than the FRT neural network with a spherical lens.

Finally, the optimization of the fractional order of the FRT neural network, which is important for good classification performance, is achieved by employing evolutionary programming. Namely, the fractional orders are successfully optimized in numeral pattern classification using the FRT neural network with a spherical lens and the FRT neural network with two cylindrical lenses, respectively.
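For reference, the quantities named in the abstract can be sketched under common conventions; the dissertation's own normalization and notation may differ. The fractional Fourier transform of order $a$, with rotation angle $\alpha = a\pi/2$ ($\alpha \neq n\pi$), is usually written as

\[
\mathcal{F}^{a}[f](u) = \int_{-\infty}^{\infty} K_a(u,x)\, f(x)\, dx,
\qquad
K_a(u,x) = \sqrt{1 - i\cot\alpha}\;
\exp\!\left[\, i\pi \left( u^{2}\cot\alpha - 2ux\csc\alpha + x^{2}\cot\alpha \right) \right].
\]

For output activations $y_k$ and target values $t_k$, the mean square error and the log-likelihood (cross-entropy) costs that the abstract contrasts are typically

\[
E_{\mathrm{MSE}} = \frac{1}{2}\sum_k \left( t_k - y_k \right)^{2},
\qquad
E_{\mathrm{LL}} = -\sum_k \left[ t_k \ln y_k + (1 - t_k)\ln(1 - y_k) \right].
\]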
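The last step, tuning the fractional orders by evolutionary programming, can be illustrated with a short mutation-and-selection sketch. Everything below is an illustrative assumption rather than the dissertation's procedure: the population size, mutation scale, the (mu + mu) truncation selection, and especially the placeholder fitness recall_rate, which in the dissertation would be the recall rate of the trained FRT neural network for a candidate pair of fractional orders.

import random

def recall_rate(px, py):
    # Placeholder fitness: stands in for the recall rate of the FRT neural
    # network trained with fractional orders (px, py). Illustrative only.
    return -((px - 0.8) ** 2 + (py - 1.2) ** 2)

def evolve(generations=50, pop_size=20, sigma=0.1, order_range=(0.0, 2.0)):
    lo, hi = order_range
    # Initial population of candidate fractional-order pairs (px, py).
    pop = [(random.uniform(lo, hi), random.uniform(lo, hi)) for _ in range(pop_size)]
    for _ in range(generations):
        # Each parent produces one offspring by Gaussian mutation (no crossover).
        offspring = [(min(hi, max(lo, px + random.gauss(0, sigma))),
                      min(hi, max(lo, py + random.gauss(0, sigma))))
                     for px, py in pop]
        # Keep the fittest half of parents plus offspring.
        pop = sorted(pop + offspring, key=lambda o: recall_rate(*o), reverse=True)[:pop_size]
    return pop[0]

best_px, best_py = evolve()
print(best_px, best_py)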

Other Bibliographic Information
Call Number: DEE 98042
Physical Description: iv, 84 p. : illustrations ; 26 cm
Language: Korean
General Note: Author's name in English: Sang-Gil Shin
Advisor's name in Korean: 신상영
Advisor's name in English: Sang-Yung Shin
Thesis Note: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Department of Electrical and Electronic Engineering,
Bibliographical Note: References : p. 80-84