One of the most active disciplines in autonomous perception is sensor fusion. This research area aims at combining sensory data from diverse origins (and sometimes also other information sources) to achieve a "better perception" of the environment.
Various definitions of sensor fusion can be found in the literature, differing slightly in meaning. One, with which we fully agree, states that sensor fusion is "the combining of sensory data or data derived from sensory data in order to produce enhanced data in form of an internal representation of the process environment. The achievements of sensor fusion are robustness, extended spatial and temporal coverage, increased confidence, reduced ambiguity and uncertainty, and improved resolution." [3]
Sensor data fusion is a relatively recent and dynamic field, and a standard terminology has not yet been adopted. The terms "sensor fusion", "sensor integration", "data fusion", "information fusion", "multisensor data fusion", and "multisensor integration" have been widely used in technical literature to refer to a variety of techniques, technologies, systems, and applications, which use data derived from multiple information sources [4–6].
Data for sensor fusion can come from a single sensor taking multiple measurements at different instants of time, from multiple sensors of identical type, or from sensors of different types. In the following, concepts, models, methods, and applications for sensor fusion will be summarized, mainly following the ideas of [7, 8].
Concepts for Fusion
Sensor fusion is generally based on the combination of redundant or complementary information. Among others, the works in [3, 5, 8] distinguish three types of sensor data fusion, which are not mutually exclusive: complementary fusion, competitive fusion, and cooperative fusion.
Complementary fusion is the fusion of incomplete sensor measurements from several disparate sources. Sensor data do not directly depend on each other, but are combined to give a more complete image of a phenomenon under observation.
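As an illustration, consider the following minimal sketch of complementary fusion, in which two range sensors each observe a different part of a one-dimensional occupancy grid; the function name, the sentinel value, and all numbers are illustrative assumptions, not taken from the cited works:

    import numpy as np

    UNKNOWN = -1.0  # sentinel for cells no sensor has observed

    def fuse_complementary(grid_a, grid_b):
        """Merge two partial occupancy grids into one more complete grid.

        Cells observed by only one sensor take that sensor's value; cells
        seen by neither remain UNKNOWN. Values are occupancy probabilities.
        """
        fused = np.full_like(grid_a, UNKNOWN)
        for i, (a, b) in enumerate(zip(grid_a, grid_b)):
            if a != UNKNOWN and b != UNKNOWN:
                fused[i] = 0.5 * (a + b)   # rare overlap: average
            elif a != UNKNOWN:
                fused[i] = a
            elif b != UNKNOWN:
                fused[i] = b
        return fused

    # Sensor A covers the left half, sensor B the right half of the grid.
    left  = np.array([0.9, 0.1, 0.2, UNKNOWN, UNKNOWN, UNKNOWN])
    right = np.array([UNKNOWN, UNKNOWN, UNKNOWN, 0.8, 0.7, 0.0])
    print(fuse_complementary(left, right))

Neither sensor alone covers the whole grid; the fused grid is more complete, which corresponds to the "extended spatial coverage" named in the definition of [3].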
Competitive fusion is the fusion of redundant sensor measurements from several sources. Each sensor delivers independent measurements of the same property. Competitive sensor configurations are also called redundant configurations.
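A minimal sketch of competitive fusion, assuming independent, unbiased sensors with known error variances, is the inverse-variance weighted average below (the function name and all numbers are illustrative):

    import numpy as np

    def fuse_competitive(measurements, variances):
        """Inverse-variance weighted average of redundant measurements.

        Each sensor measures the same quantity; weighting by 1/variance
        gives the minimum-variance linear combination under the stated
        assumptions (independent, unbiased errors).
        """
        measurements = np.asarray(measurements, dtype=float)
        weights = 1.0 / np.asarray(variances, dtype=float)
        fused_value = np.sum(weights * measurements) / np.sum(weights)
        fused_variance = 1.0 / np.sum(weights)
        return fused_value, fused_variance

    # Three sensors redundantly measure the same distance (in metres).
    value, var = fuse_competitive([10.2, 9.8, 10.5], [0.04, 0.09, 0.25])
    print(f"fused: {value:.2f} m, variance: {var:.4f}")

The fused variance is never larger than the smallest individual variance, reflecting the increased confidence that a redundant configuration provides.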
Cooperative fusion uses the information provided by independent sensors to derive information that would not be available from the single sensors. An example of cooperative sensor fusion is stereovision, where two camera images are combined to obtain depth information. In contrast to complementary and competitive fusion, cooperative fusion generally decreases accuracy and reliability, since the derived result is sensitive to inaccuracies in all participating sensors.
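To make the stereovision example concrete, the sketch below recovers depth from a rectified stereo pair using the standard relation Z = fB/d; the function name and all numbers are illustrative:

    def stereo_depth(x_left, x_right, focal_length_px, baseline_m):
        """Depth of a point from a rectified stereo pair: Z = f * B / d.

        Neither camera alone yields depth; only the combination, via the
        disparity d = x_left - x_right, does. Note that an error in either
        image coordinate corrupts the result, illustrating the reduced
        accuracy and reliability of cooperative fusion.
        """
        disparity = x_left - x_right  # in pixels
        if disparity <= 0:
            raise ValueError("disparity must be positive for a visible point")
        return focal_length_px * baseline_m / disparity

    # Illustrative numbers: 700 px focal length, 12 cm baseline, 35 px disparity.
    print(f"depth: {stereo_depth(400.0, 365.0, 700.0, 0.12):.2f} m")  # 2.40 m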
Models for Fusion
Models for sensor fusion depend heavily on the application they are used in. So far, no generally accepted model for sensor fusion exists, and it seems unlikely that one technique or architecture will provide a uniformly superior solution [3]. Consequently, numerous models for sensor fusion can be found in the literature, to mention only a few: the JDL fusion model architecture, the Waterfall model, the Intelligence cycle, the Boyd loop, the LAAS architecture, the Omnibus model, Mr. Fusion, the DFuse framework, and the Time-Triggered Sensor Fusion Model.
Methods for Fusion
Various methods for sensor fusion have been suggested. In principle, sensor fusion methods can be divided into grid-based (geometric) and parameter-based (numerical) approaches; in the case of numerical approaches, a further distinction is made between feature-based approaches (weighted average, Kalman filter), probabilistic approaches (classical statistics, Bayesian statistics), fuzzy methods, and neural approaches. In contrast, the work in [9] classifies fusion algorithms into estimation methods (weighted average, Kalman filter), classification methods (cluster analysis, unsupervised or self-organized learning algorithms), inference methods (Bayesian inference, Dempster-Shafer evidential reasoning), and artificial intelligence methods (neural networks, fuzzy logic). As with the models of sensor fusion, there is no single fusion method suitable for all applications. Hence, new hierarchical approaches are sought that combine the advantages of the basic mathematical methods.
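As a minimal sketch of an estimation method from the above classification, the scalar Kalman filter measurement update below fuses a predicted value with a new measurement; all names and numbers are illustrative:

    def kalman_update(x_pred, p_pred, z, r):
        """One scalar Kalman filter measurement update.

        Fuses a predicted state (x_pred with variance p_pred) with a new
        measurement (z with variance r); the gain k weights the innovation
        by the relative confidence in prediction versus measurement.
        """
        k = p_pred / (p_pred + r)      # Kalman gain
        x = x_pred + k * (z - x_pred)  # fused estimate
        p = (1.0 - k) * p_pred         # fused variance (always <= p_pred)
        return x, p

    # Track a roughly constant temperature from a noisy sensor.
    x, p = 20.0, 4.0                    # initial guess and its variance
    for z in [22.1, 21.7, 22.4, 21.9]:  # illustrative readings
        p += 0.1                        # prediction step: add process noise
        x, p = kalman_update(x, p, z, r=0.5)
    print(f"estimate: {x:.2f}, variance: {p:.3f}")

With each update the estimate's variance shrinks, which is why the Kalman filter is the classic example of an estimation-based fusion method.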
Application Areas
Application areas of sensor fusion are broad, ranging from measurement and production engineering through robotics and navigation to medical technology and military applications. Examples of applications can be found in [4, 8, 9].
Biological Sensor Fusion
It is well appreciated that sensor fusion in the perceptual system of the human brain is far superior to sensor fusion achieved with existing mathematical methods [10, 11]. It therefore seems particularly worthwhile to study the biological principles of sensor fusion.
Such studies can, on the one hand, lead to better technical models for sensor fusion and, on the other hand, to a better understanding of how perception is performed in the brain. Sensor fusion based on models derived from biology is called biological sensor fusion. Approaches to biological sensor fusion made so far can be found in [12–18].
Although a number of models for biological sensor fusion have already been introduced, the success of research efforts incorporating lessons learned from biology into "smart algorithms" has been limited [10]. One reason for this might be that the use of biological models in actual machines is often only metaphorical, using the biological architecture as a general guideline [19].