Overview
ABSTRACT
AUTHOR
Jean-François GRANDIN: engineer, Institut national des télécommunications; expert in information processing, Technical Division - Electronic Warfare Systems, Thales Systèmes Aéroportés
INTRODUCTION
Data fusion consists essentially of comparing and integrating multiple pieces of information in order to reduce the uncertainty of the resulting information. This article focuses on the application of probability theory to all functions in the perception chain: detection, identification, estimation, tracking and situation analysis. Within this framework, we explain how to model and account for the reliability of sources, how to estimate the quality of a distribution over hypothesized events, how to fuse in the presence of correlation, and how to integrate expert knowledge with observations. To be effective, fusion must take into account the degree of uncertainty in the incoming information: to this end, it combines the pieces of information weighted by their respective uncertainties. Furthermore, to fill in missing information, fusion relies on hypothetical reasoning and evaluates the various candidate options; Bayesian inference is designed for exactly this type of evaluation and lends itself to elaborate hypothesis propagation (multiple hypothesis tracking, MHT; Bayesian networks) that integrates knowledge of heterogeneous nature and level.
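The two ideas above (Bayesian evaluation of competing hypotheses, and weighting each source by its reliability) can be sketched together in a small example. This is a minimal illustration, not the article's own method: the names `discount` and `bayes_fuse` are hypothetical, and the reliability model assumed here is a common discounting scheme in which an unreliable source's likelihood is blended toward a uniform (uninformative) one.

```python
def discount(likelihood, r):
    """Blend a likelihood vector toward uniform by a reliability r in [0, 1].

    r = 1 trusts the source fully; r = 0 makes it uninformative.
    (Assumption: linear discounting toward uniform.)
    """
    n = len(likelihood)
    return [r * p + (1 - r) / n for p in likelihood]

def bayes_fuse(prior, sources):
    """Sequentially update a prior over discrete hypotheses.

    `sources` is a list of (likelihood_vector, reliability) pairs;
    each update multiplies the current belief by the discounted
    likelihood and renormalizes (Bayes' rule).
    """
    belief = list(prior)
    for likelihood, r in sources:
        disc = discount(likelihood, r)
        belief = [b * l for b, l in zip(belief, disc)]
        total = sum(belief)
        belief = [b / total for b in belief]
    return belief

# Three candidate identities, uniform prior.
prior = [1 / 3, 1 / 3, 1 / 3]
# Source A strongly favours hypothesis 0 and is reliable (r = 0.9);
# source B mildly favours hypothesis 1 but is unreliable (r = 0.3).
posterior = bayes_fuse(prior, [([0.8, 0.1, 0.1], 0.9),
                               ([0.2, 0.6, 0.2], 0.3)])
print([round(p, 3) for p in posterior])
```

Because source B is heavily discounted, its dissenting evidence only slightly erodes the belief in hypothesis 0; with r = 1.0 for both sources the same code reduces to a plain Bayesian product of likelihoods.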