Sensor fusion explained

Sensor fusion is the process of combining sensor data or data derived from disparate sources so that the resulting information has less uncertainty than would be possible if these sources were used individually. For instance, one could obtain a more accurate location estimate of an indoor object by combining multiple data sources such as video cameras and WiFi localization signals. "Uncertainty reduction" in this case can mean more accurate, more complete, or more dependable, or it can refer to the result of an emerging view, such as stereoscopic vision (the calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints).[1] [2]

The data sources for a fusion process are not required to originate from identical sensors. One can distinguish direct fusion, indirect fusion, and fusion of the outputs of the former two. Direct fusion is the fusion of sensor data from a set of heterogeneous or homogeneous sensors, soft sensors, and historical values of sensor data, while indirect fusion uses information sources like a priori knowledge about the environment and human input.

Sensor fusion is also known as (multi-sensor) data fusion and is a subset of information fusion.

Examples of sensors

Algorithms

Sensor fusion is a term that covers a number of methods and algorithms, including:

Example calculations

Two example sensor fusion calculations are illustrated below.

Let $\mathbf{x}_1$ and $\mathbf{x}_2$ denote two sensor measurements with noise variances $\sigma_1^2$ and $\sigma_2^2$, respectively. One way of obtaining a combined measurement $\mathbf{x}_3$ is to apply inverse-variance weighting, which is also employed within the Fraser-Potter fixed-interval smoother, namely[6]

$$\mathbf{x}_3 = \sigma_3^2 \left( \sigma_1^{-2}\mathbf{x}_1 + \sigma_2^{-2}\mathbf{x}_2 \right),$$

where $\sigma_3^2 = \left( \sigma_1^{-2} + \sigma_2^{-2} \right)^{-1}$ is the variance of the combined estimate. It can be seen that the fused result is simply a linear combination of the two measurements weighted by their respective inverse noise variances.
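To make the weighting concrete, the following is a minimal sketch in plain Python (all variable names are hypothetical) of inverse-variance fusion of two scalar measurements; it follows the formula above and assumes the two noise sources are independent.

```python
def fuse_inverse_variance(x1, var1, x2, var2):
    """Fuse two independent measurements by inverse-variance weighting.

    Returns the combined measurement x3 and its variance var3, following
    x3 = var3 * (x1/var1 + x2/var2) with var3 = 1 / (1/var1 + 1/var2).
    """
    var3 = 1.0 / (1.0 / var1 + 1.0 / var2)
    x3 = var3 * (x1 / var1 + x2 / var2)
    return x3, var3

# Example: a precise sensor (variance 0.1) dominates a noisy one (variance 1.0).
x3, var3 = fuse_inverse_variance(x1=2.0, var1=0.1, x2=2.5, var2=1.0)
print(x3, var3)  # fused value lies close to 2.0; var3 is smaller than either input variance
```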

Another (equivalent) method to fuse two measurements is to use the optimal Kalman filter. Suppose that the data is generated by a first-order system and let $\mathbf{P}_k$ denote the solution of the filter's Riccati equation. By applying Cramer's rule within the gain calculation it can be found that the filter gain is given by:

$$\mathbf{L}_k = \begin{bmatrix} \tfrac{\sigma_2^2 \mathbf{P}_k}{\sigma_2^2 \mathbf{P}_k + \sigma_1^2 \mathbf{P}_k + \sigma_1^2 \sigma_2^2} & \tfrac{\sigma_1^2 \mathbf{P}_k}{\sigma_2^2 \mathbf{P}_k + \sigma_1^2 \mathbf{P}_k + \sigma_1^2 \sigma_2^2} \end{bmatrix}.$$

By inspection, when the first measurement is noise free, the filter ignores the second measurement and vice versa. That is, the combined estimate is weighted by the quality of the measurements.
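As an illustration only (not the matrix gain expression above), here is a small sketch of a scalar Kalman-style update that fuses two noisy measurements of the same quantity; the diffuse prior, sequential updates, and all names are assumptions made for this example, not taken from the source.

```python
def kalman_update(x_est, p_est, z, r):
    """One scalar Kalman measurement update.

    x_est, p_est: prior state estimate and its variance
    z, r: measurement and its noise variance
    """
    k = p_est / (p_est + r)          # Kalman gain
    x_new = x_est + k * (z - x_est)  # corrected estimate
    p_new = (1.0 - k) * p_est        # reduced uncertainty
    return x_new, p_new

# Fuse two measurements of the same quantity by two sequential updates.
# A diffuse prior (very large variance) lets the measurements dominate.
x, p = 0.0, 1e6
x, p = kalman_update(x, p, z=2.0, r=0.1)   # first sensor, low noise
x, p = kalman_update(x, p, z=2.5, r=1.0)   # second sensor, higher noise
print(x, p)  # matches the inverse-variance weighted result above
```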

Centralized versus decentralized

In sensor fusion, centralized versus decentralized refers to where the fusion of the data occurs. In centralized fusion, the clients simply forward all of the data to a central location, and some entity at the central location is responsible for correlating and fusing the data. In decentralized fusion, the clients take full responsibility for fusing the data. "In this case, every sensor or platform can be viewed as an intelligent asset having some degree of autonomy in decision-making."[7]

Multiple combinations of centralized and decentralized systems exist.
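As a rough sketch of the architectural difference (the class and function names are hypothetical, not from the source): in a centralized scheme the clients only forward raw measurements and one fusion node combines them, while in a decentralized scheme each platform fuses locally and only exchanges its resulting estimate.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    value: float
    variance: float

def fuse(measurements):
    """Inverse-variance fusion of any number of independent measurements."""
    inv_var = sum(1.0 / m.variance for m in measurements)
    value = sum(m.value / m.variance for m in measurements) / inv_var
    return Measurement(value, 1.0 / inv_var)

# Centralized: clients forward raw measurements to a single fusion node.
central = fuse([Measurement(2.0, 0.1), Measurement(2.5, 1.0), Measurement(1.8, 0.5)])

# Decentralized: each platform fuses its own sensors first and only
# exchanges the resulting local estimates (fusion of fused estimates).
local_a = fuse([Measurement(2.0, 0.1), Measurement(2.5, 1.0)])
local_b = fuse([Measurement(1.8, 0.5)])
combined = fuse([local_a, local_b])
print(central, combined)  # identical here because the measurements are independent
```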

Another classification of sensor configuration refers to the coordination of information flow between sensors.[8] [9] These mechanisms provide a way to resolve conflicts or disagreements and to allow the development of dynamic sensing strategies.

Sensors are in a redundant (or competitive) configuration if each node delivers independent measurements of the same properties. This configuration can be used for error correction when comparing information from multiple nodes. Redundant strategies are often used with high-level fusion in voting procedures.[10] [11]

A complementary configuration occurs when multiple information sources supply different information about the same features. This strategy is used for fusing information at the raw-data level within decision-making algorithms. Complementary features are typically applied in motion recognition tasks with neural networks,[12] [13] hidden Markov models,[14] [15] support-vector machines,[16] clustering methods and other techniques.[16] [15]

Cooperative sensor fusion uses the information extracted by multiple independent sensors to provide information that would not be available from any single sensor. For example, sensors attached to adjacent body segments can be fused to estimate the angle between them. Cooperative information fusion can be used in motion recognition,[17] gait analysis and motion analysis.[18] [19] [20]
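As a minimal sketch of the redundant (competitive) case, a simple majority vote over per-node decisions (the labels below are hypothetical) can mask a single faulty node:

```python
from collections import Counter

def majority_vote(decisions):
    """High-level fusion of redundant nodes: pick the most common decision."""
    counts = Counter(decisions)
    label, _ = counts.most_common(1)[0]
    return label

# Three nodes observe the same posture; one node is faulty.
print(majority_vote(["sitting", "sitting", "standing"]))  # -> "sitting"
```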

Levels

There are several categories or levels of sensor fusion that are commonly used.[21] [22] [23] [24] [25] [26]

The level of sensor fusion can also be defined based on the kind of information used to feed the fusion algorithm.[27] More precisely, sensor fusion can be performed by fusing raw data coming from different sources, extracted features, or even decisions made by single nodes.
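A compact way to picture the three options is as three different inputs to the fusion step; the function names below are hypothetical and only indicate where in the pipeline the fusion happens.

```python
import numpy as np

def raw_level_fusion(signal_a, signal_b):
    """Raw-data level: combine time-aligned samples before any processing."""
    return np.stack([signal_a, signal_b], axis=-1)  # joint observation vectors

def feature_level_fusion(features_a, features_b):
    """Feature level: each node extracts features, which are then concatenated."""
    return np.concatenate([features_a, features_b])

def decision_level_fusion(score_a, score_b, weight_a=0.5, weight_b=0.5):
    """Decision level: each node decides on its own; only the decisions/scores are combined."""
    return weight_a * score_a + weight_b * score_b
```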

Applications

One application of sensor fusion is GPS/INS, where Global Positioning System and inertial navigation system data are fused using various methods, e.g. the extended Kalman filter. This is useful, for example, in determining the attitude of an aircraft using low-cost sensors.[32] Another example is using the data fusion approach to determine the traffic state (low traffic, traffic jam, medium flow) using roadside-collected acoustic, image and sensor data.[33] In the field of autonomous driving, sensor fusion is used to combine the redundant information from complementary sensors in order to obtain a more accurate and reliable representation of the environment.[34]
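A heavily simplified, one-dimensional sketch of the GPS/INS idea follows (this is not the extended Kalman filter of [32]; all noise values and names are assumptions made for illustration): the INS acceleration drives the prediction step and the GPS position fix drives the correction step.

```python
import numpy as np

def predict(x, P, accel, dt, q):
    """INS step: propagate the state [position, velocity] with measured acceleration."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)   # process noise grows the uncertainty
    return x, P

def correct(x, P, gps_pos, r):
    """GPS step: correct the predicted position with a noisy GPS fix."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r               # innovation covariance (1x1)
    K = P @ H.T / S                   # Kalman gain
    x = x + (K * (gps_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# One predict/correct cycle with made-up numbers.
x, P = np.zeros(2), np.eye(2)
x, P = predict(x, P, accel=0.2, dt=0.1, q=1e-3)
x, P = correct(x, P, gps_pos=0.05, r=4.0)
print(x)
```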

Although technically not a dedicated sensor fusion method, modern convolutional neural network based methods can simultaneously process many channels of sensor data (such as hyperspectral imaging with hundreds of bands[35]) and fuse relevant information to produce classification results.
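As a sketch of this idea only (it assumes PyTorch is available; the layer sizes and the 200-band input are made up), a convolutional layer can consume many sensor channels at once, so the "fusion" happens implicitly inside the learned filters:

```python
import torch
import torch.nn as nn

# A tiny network whose first convolution mixes all input bands/channels.
model = nn.Sequential(
    nn.Conv2d(in_channels=200, out_channels=32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 5),                 # e.g. five land-cover classes
)

patch = torch.randn(1, 200, 9, 9)     # one 9x9 hyperspectral patch with 200 bands
logits = model(patch)                 # shape (1, 5)
print(logits.shape)
```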

See also

External links

Notes and References

  1. Book: Elmenreich, W.. Sensor Fusion in Time-Triggered Systems, PhD Thesis. Vienna University of Technology. Vienna, Austria. 2002. 173.
  2. 10.1016/j.compeleceng.2011.04.016. Multi-focus image fusion for visual sensor networks in DCT domain. Computers & Electrical Engineering. 37. 5. 789–797. 2011. Haghighat. Mohammad Bagher Akbari. Aghagolzadeh. Ali. Seyedarabi. Hadi. 38131177 .
  3. Li. Wangyan. Wang. Zidong. Wei. Guoliang. Ma. Lifeng. Hu. Jun. Ding. Derui. 2015. A Survey on Multisensor Fusion and Consensus Filtering for Sensor Networks. Discrete Dynamics in Nature and Society. en. 2015. 1–12. 10.1155/2015/683701. 1026-0226. free.
  4. Badeli. Vahid. Ranftl. Sascha. Melito. Gian Marco. Reinbacher-Köstinger. Alice. Von Der Linden. Wolfgang. Ellermann. Katrin. Biro. Oszkar. 2021-01-01. Bayesian inference of multi-sensors impedance cardiography for detection of aortic dissection. COMPEL - the International Journal for Computation and Mathematics in Electrical and Electronic Engineering. 41 . 3 . 824–839 . 10.1108/COMPEL-03-2021-0072. 245299500 . 0332-1649.
  5. Ranftl. Sascha. Melito. Gian Marco. Badeli. Vahid. Reinbacher-Köstinger. Alice. Ellermann. Katrin. von der Linden. Wolfgang. 2019-12-31. Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection. Entropy. 22. 1. 58. 10.3390/e22010058. 33285833 . 7516489 . 1099-4300. free .
  6. Book: Maybeck, S. . 1982 . Stochastic Models, Estimating, and Control . Academic Press . River Edge, NJ .
  7. Web site: Multi-sensor management for information fusion: issues and approaches. N. Xiong . P. Svensson . Information Fusion. 2002. 3(2):163–186.
  8. Durrant-Whyte. Hugh F.. Sensor Models and Multisensor Integration. The International Journal of Robotics Research. 7. 6. 2016. 97–113. 0278-3649. 10.1177/027836498800700608. 35656213.
  9. Book: eMaintenance: Essential Electronic Tools for Efficiency. Diego. Galar. Uday . Kumar. 9780128111543. 26 . Academic Press. 2017.
  10. Book: Li. Wenfeng. 2012 12th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (ccgrid 2012). Bao. Junrong. Fu. Xiuwen. Fortino. Giancarlo. Galzarano. Stefano. Human Postures Recognition Based on D-S Evidence Theory and Multi-sensor Data Fusion. 2012. 912–917. 10.1109/CCGrid.2012.144. 978-1-4673-1395-7. 1571720.
  11. Book: Fortino. Giancarlo. Proceedings of the 10th EAI International Conference on Body Area Networks. Gravina. Raffaele. Fall-MobileGuard: a Smart Real-Time Fall Detection System. 2015. 10.4108/eai.28-9-2015.2261462. 978-1-63190-084-6. 38913107.
  12. Tao. Shuai. Zhang. Xiaowei. Cai. Huaying. Lv. Zeping. Hu. Caiyou. Xie. Haiqun. Gait based biometric personal authentication by using MEMS inertial sensors. Journal of Ambient Intelligence and Humanized Computing. 9. 5. 2018. 1705–1712. 1868-5137. 10.1007/s12652-018-0880-6. 52304214.
  13. Dehzangi. Omid. Taherisadr. Mojtaba. ChangalVala. Raghvendar. IMU-Based Gait Recognition Using Convolutional Neural Networks and Multi-Sensor Fusion. Sensors. 17. 12. 2017. 2735. 1424-8220. 10.3390/s17122735. 29186887. 5750784. 2017Senso..17.2735D. free.
  14. Guenterberg. E.. Yang. A.Y.. Ghasemzadeh. H.. Jafari. R.. Bajcsy. R.. Sastry. S.S.. A Method for Extracting Temporal Parameters Based on Hidden Markov Models in Body Sensor Networks With Inertial Sensors. IEEE Transactions on Information Technology in Biomedicine. 13. 6. 2009. 1019–1030. 1089-7771. 10.1109/TITB.2009.2028421. 19726268. 1829011.
  15. Parisi. Federico. Ferrari. Gianluigi. Giuberti. Matteo. Contin. Laura. Cimolin. Veronica. Azzaro. Corrado. Albani. Giovanni. Mauro. Alessandro. Inertial BSN-Based Characterization and Automatic UPDRS Evaluation of the Gait Task of Parkinsonians. IEEE Transactions on Affective Computing. 7. 3. 2016. 258–271. 1949-3045. 10.1109/TAFFC.2016.2549533. 16866555.
  16. Gao. Lei. Bourke. A.K.. Nelson. John. Evaluation of accelerometer based multi-sensor versus single-sensor activity recognition systems. Medical Engineering & Physics. 36. 6. 2014. 779–785. 1350-4533. 10.1016/j.medengphy.2014.02.012. 24636448.
  17. Xu. James Y.. Wang. Yan. Barrett. Mick. Dobkin. Bruce. Pottie. Greg J.. Kaiser. William J.. Personalized Multilayer Daily Life Profiling Through Context Enabled Activity Classification and Motion Reconstruction: An Integrated System Approach. IEEE Journal of Biomedical and Health Informatics. 20. 1. 2016. 177–188. 2168-2194. 10.1109/JBHI.2014.2385694. 25546868. 16785375. free.
  18. Chia Bejarano. Noelia. Ambrosini. Emilia. Pedrocchi. Alessandra. Ferrigno. Giancarlo. Monticone. Marco. Ferrante. Simona. A Novel Adaptive, Real-Time Algorithm to Detect Gait Events From Wearable Sensors. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 23. 3. 2015. 413–422. 1534-4320. 10.1109/TNSRE.2014.2337914. 25069118. 25828466. 11311/865739. free.
  19. Wang. Zhelong. Qiu. Sen. Cao. Zhongkai. Jiang. Ming. Quantitative assessment of dual gait analysis based on inertial sensors with body sensor network. Sensor Review. 33. 1. 2013. 48–56. 0260-2288. 10.1108/02602281311294342.
  20. Kong. Weisheng. Wanning. Lauren. Sessa. Salvatore. Zecca. Massimiliano. Magistro. Daniele. Takeuchi. Hikaru. Kawashima. Ryuta. Takanishi. Atsuo. Step Sequence and Direction Detection of Four Square Step Test. IEEE Robotics and Automation Letters. 2. 4. 2017. 2194–2200. 2377-3766. 10.1109/LRA.2017.2723929. 23410874.
  21. http://www.infofusion.buffalo.edu/tm/Dr.Llinas'stuff/Rethinking%20JDL%20Data%20Fusion%20Levels_BowmanSteinberg.pdf Rethinking JDL Data Fusion Levels
  22. Blasch, E., Plano, S. (2003) “Level 5: User Refinement to aid the Fusion Process”, Proceedings of the SPIE, Vol. 5099.
  23. J. Llinas . C. Bowman . G. Rogova . A. Steinberg . E. Waltz . F. White . 10.1.1.58.2996 . Revisiting the JDL data fusion model II . International Conference on Information Fusion . 2004 .
  24. Blasch, E. (2006) "Sensor, user, mission (SUM) resource management and their interaction with level 2/3 fusion" International Conference on Information Fusion.
  25. Web site: Harnessing the full power of sensor fusion -. 3 April 2024 .
  26. Blasch, E., Steinberg, A., Das, S., Llinas, J., Chong, C.-Y., Kessler, O., Waltz, E., White, F. (2013) "Revisiting the JDL model for information Exploitation," International Conference on Information Fusion.
  27. Gravina. Raffaele. Alinia. Parastoo. Ghasemzadeh. Hassan. Fortino. Giancarlo. Multi-sensor fusion in body sensor networks: State-of-the-art and research challenges. Information Fusion. 35. 2017. 68–80. 1566-2535. 10.1016/j.inffus.2016.09.005. 40608207 .
  28. Gao. Teng. Song. Jin-Yan. Zou. Ji-Yan. Ding. Jin-Hua. Wang. De-Quan. Jin. Ren-Cheng. An overview of performance trade-off mechanisms in routing protocol for green wireless sensor networks. Wireless Networks. 22. 1. 2015. 135–157. 1022-0038. 10.1007/s11276-015-0960-x. 34505498.
  29. Chen. Chen. Jafari. Roozbeh. Kehtarnavaz. Nasser. A survey of depth and inertial sensor fusion for human action recognition. Multimedia Tools and Applications. 76. 3. 2015. 4405–4425. 1380-7501. 10.1007/s11042-015-3177-1. 18112361.
  30. Book: Banovic. Nikola. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI '16. Buzali. Tofi. Chevalier. Fanny. Mankoff. Jennifer. Dey. Anind K.. Modeling and Understanding Human Routine Behavior. 2016. 248–260. 10.1145/2858036.2858557. 9781450333627. 872756.
  31. Book: Maria. Aileni Raluca. 2015 Conference Grid, Cloud & High Performance Computing in Science (ROLCG). Sever. Pasca. Carlos. Valderrama. Biomedical sensors data fusion algorithm for enhancing the efficiency of fault-tolerant systems in case of wearable electronics device. 2015. 1–4. 10.1109/ROLCG.2015.7367228. 978-6-0673-7040-9. 18782930.
  32. Gross. Jason. Yu Gu . Matthew Rhudy . Srikanth Gururajan . Marcello Napolitano . Flight Test Evaluation of Sensor Fusion Algorithms for Attitude Estimation. IEEE Transactions on Aerospace and Electronic Systems. July 2012. 48. 3. 2128–2139. 10.1109/TAES.2012.6237583. 2012ITAES..48.2128G. 393165.
  33. Joshi, V., Rajamani, N., Takayuki, K., Prathapaneni, N., Subramaniam, L. V. . 2013. Information Fusion Based Learning for Frugal Traffic State Sensing. Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence.
  34. Mircea Paul. Muresan. Ion. Giosan. Sergiu. Nedevschi . Stabilization and Validation of 3D Object Position Using Multimodal Sensor Fusion and Semantic Segmentation . Sensors . 20 . 4 . 1110. 2020-02-18. 10.3390/s20041110 . 32085608. 7070899. 2020Senso..20.1110M. free.
  35. Ran . Lingyan . Zhang . Yanning . Wei . Wei . Zhang . Qilin . A Hyperspectral Image Classification Framework with Spatial Pixel Pair Features . Sensors . 17 . 10 . 2421 . 2017-10-23 . 10.3390/s17102421 . 29065535 . 5677443 . 2017Senso..17.2421R . free .
  36. 10.1109/TIFS.2016.2569061. Discriminant Correlation Analysis: Real-Time Feature Level Fusion for Multimodal Biometric Recognition. IEEE Transactions on Information Forensics and Security. 11. 9. 1984–1996. 2016. Haghighat. Mohammad. Abdel-Mottaleb. Mohamed. Alhalabi. Wadee. 15624506.