
Title: Multiple image sensor data fusion through artificial neural networks

Authors
Huang, Yanbo
Lan, Yubin
Hoffmann, Wesley
Lacey, Ron - Texas A&M University

Submitted to: Advances in Natural Science
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 3/21/2011
Publication Date: 6/30/2011
Citation: Huang, Y., Lan, Y., Hoffmann, W.C., Lacey, R. 2011. Multiple image sensor data fusion through artificial neural networks. Advances in Natural Science. 4:1-13.

Interpretive Summary: The fusion of measurements from multiple sensors can exceed the accuracy of any individual measurement. A data fusion scheme was developed for input from different image sensors observing the same object, either simultaneously or at different times. The method classifies the image data with artificial neural networks, which mimic the interaction of neurons in the human brain, and then merges the output from each individual image sensor. The method was applied to non-destructive inspection of corroded spots on aircraft panel specimens. The results indicate that data fusion consistently improved on the classification obtained through artificial neural networks from any individual image sensor. This method is expected to solve non-destructive inspection problems in which a number of image sensors are available for measuring the same specimen.
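The per-sensor classification step described above can be pictured with a minimal sketch. The paper does not specify the software or network architecture, so the scikit-learn MLPClassifier, the 5x5 pixel-window features, and all variable names below are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch (assumptions, not the authors' code): train a per-pixel
# neural-network classifier for a single image sensor.
import numpy as np
from sklearn.neural_network import MLPClassifier

def window_features(image, size=5):
    """Flatten a sliding window around each pixel into a feature vector."""
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    feats = np.empty((h * w, size * size))
    for i in range(h):
        for j in range(w):
            feats[i * w + j] = padded[i:i + size, j:j + size].ravel()
    return feats

def train_sensor_classifier(image, labels):
    """image: 2-D array of sensor readings for one specimen;
    labels: same-shape array, 1 = corroded pixel, 0 = sound metal."""
    X = window_features(image)
    y = labels.ravel()
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500)
    clf.fit(X, y)
    return clf
```

One such classifier would be trained for each image sensor; the fusion step then combines their outputs.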

Technical Abstract: With multisensor data fusion technology, the data from multiple sensors are fused to produce a more accurate estimate of the environment through measurement, processing and analysis. Artificial neural networks are computational models that mimic biological neural networks. With high-performance pattern recognition through artificial neural networks, multisensor data fusion can bring machine intelligence to a new level. This research developed a method of data fusion for multiple image sensors. The image data were processed and classified using artificial neural networks, and the classified data were then fused to produce a result that performed better than any of the individual image classifications. With the established concept and method, we conducted an application of non-destructive inspection to identify corroded spots on aircraft panel specimens, which is discussed in this paper. In this application, ultrasonic and eddy current image data were run through artificial neural network classifiers to identify the corroded spots on the same aircraft panel specimen. The results indicated that data fusion consistently enhanced artificial neural network corrosion detection compared with the ultrasonic and eddy current image data used individually. This method of multiple image sensor data fusion is expected to solve problems of non-destructive inspection in various areas.
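The abstract does not state exactly how the classified data were combined, so the sketch below assumes a common decision-level scheme: each sensor's neural network produces a per-pixel corrosion probability, and a small fusion network is trained on the concatenated probabilities. The function names, the use of scikit-learn, and the two-input fusion network are assumptions made for illustration.

```python
# Minimal sketch (assumptions, not the paper's exact scheme): decision-level
# fusion of ultrasonic and eddy-current classifier outputs for one specimen.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_fusion_network(p_ultrasonic, p_eddy, labels):
    """p_ultrasonic, p_eddy: per-pixel corrosion probabilities from each
    sensor's ANN classifier; labels: ground-truth corrosion map."""
    X = np.column_stack([p_ultrasonic.ravel(), p_eddy.ravel()])
    y = labels.ravel()
    fusion_net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500)
    fusion_net.fit(X, y)
    return fusion_net

# Fused per-pixel decision for a new panel specimen:
# fused = fusion_net.predict(
#     np.column_stack([p_us_new.ravel(), p_ec_new.ravel()]))
```

Under this reading, the fused decision can outperform either single-sensor classifier because the fusion network learns where the ultrasonic and eddy current measurements are individually reliable.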