
HSI 2008 Krakow, Poland, May 25-27, 2008

Emulating the Perceptual System of the Brain for the Purpose of Sensor Fusion

Rosemarie Velik, Member, IEEE, Roland Lang, Member, IEEE, Dietmar Bruckner, Member, IEEE, and Tobias Deutsch, Member, IEEE

Abstract — This work presents a bionic model derived from research findings about the perceptual system of the human brain to build next generation intelligent sensor fusion systems. For this purpose, a new information processing principle called neuro-symbolic information processing is introduced. According to this method, sensory data are processed by so-called neuro-symbolic networks. The basic processing units of neuro-symbolic networks are neuro-symbols. Correlations between neuro-symbols of a neuro-symbolic network can be learned from examples. Perception is based on sensor data as well as on interaction with cognitive processes like focus of attention, memory, and knowledge. Additionally, a mechanism for evaluating perception by emotions is suggested.

Keywords — Humanlike Perception, Sensor Fusion, Neuro-symbolic Networks, Learning, Knowledge-based Systems, Focus of Attention.

I. INTRODUCTION

THE human brain is a highly complex system, which is capable of performing a huge range of diverse tasks. One capability of the brain is to process information coming from many thousands of sensory receptors and to integrate this information into a unified perception of the environment. Up to now, technical systems used for machine perception fall far short of their biological archetype. A technical system capable of perceiving objects, events, and situations in a similarly efficient manner as the brain would be very valuable for a wide range of applications, for example automatic surveillance systems in buildings and autonomous robots. To perceive objects, events, and situations in an environment, sensors of various types are necessary. The challenge that has to be faced in perceptive tasks is the merging and interpretation of sensory data from various sources. The aim of this paper is to introduce a model for integrating and interpreting such data. As humans can perceive their environment very effectively, the perceptual system of the brain serves as archetype for model development. Particularly, research findings from neuroscience and neuro-psychology guide the development process.

Rosemarie Velik, Roland Lang, Dietmar Bruckner, and Tobias Deutsch are with the Vienna University of Technology, Institute of Computer Technology, 1040 Vienna, Austria. {velik, langr, bruckner, deutsch}@ict.tuwien.ac.at

II. STATE OF THE ART

Various attempts have already been made to merge information coming from different sensory sources. The research area generally first mentioned in connection with such tasks is sensor fusion. In the literature, there is not one but a variety of definitions of the term sensor fusion. Principally, sensor fusion is concerned with the combination of sensor data, or data derived from sensory data, in order to produce enhanced data in form of an internal representation of the process environment. The achievements of sensor fusion are robustness, extended spatial and temporal coverage, increased confidence, reduced ambiguity and uncertainty, and improved resolution [8]. The research field of sensor data fusion is relatively recent and dynamic; therefore, a standard terminology has not yet evolved. Widely used terms are "sensor fusion", "sensor integration", "data fusion", "information fusion", "multi-sensor data fusion", and "multi-sensor integration" [2], [28].

Data for sensor fusion can come from one single sensor taking multiple measurements subsequently at different instants of time, from multiple sensors of identical types, or from sensors of different types. Applications for fusion are manifold and range from measurement engineering and production engineering over robotics and navigation to medical technology and military applications [19], [25]. Various models for sensor fusion have been proposed. However, up to now, the selection of a model strongly depends on the particular application; a generally accepted model for sensor fusion does not yet exist. Many researchers even point out that it is very unlikely that one technique or architecture will provide a uniformly superior solution [9].

In [21], it is pointed out that it is generally accepted that sensor fusion in the perceptual system of the human brain is of far superior quality compared to sensor fusion achieved with existing mathematical methods. Therefore, it seems particularly useful to study biological principles of sensor fusion. Such studies can on the one hand lead to better technical models for sensor fusion and on the other hand to a better understanding of how perception is performed in the brain. Sensor fusion based on models derived from biology is called biological sensor fusion. Models for biological sensor fusion based on neural networks have been proposed in [5] and [15]. There have also been attempts to merge sensory data by transforming sensor data into symbols. Approaches to process sensor information symbolically have been described in [3], [13], [18], [22], and [23].
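To make the basic idea of combining redundant sensor data concrete, consider the following sketch. It shows inverse-variance weighting, a standard textbook fusion rule for redundant measurements of one quantity; it is an illustration of the "increased confidence" achievement mentioned above, not a method proposed in this paper, and the sensor values are invented.

```python
def fuse(measurements):
    """Fuse redundant measurements of one quantity.

    measurements: list of (value, variance) pairs from different sensors.
    Returns the inverse-variance weighted estimate and its variance.
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    # The fused variance is smaller than any single sensor's variance,
    # i.e. the combined estimate is more confident than each input alone.
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Two noisy distance sensors observing the same target:
value, variance = fuse([(10.2, 0.4), (9.8, 0.1)])
```

The more precise sensor dominates the result, and the fused variance (here 0.08) drops below the best single-sensor variance (0.1).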

Layered architectures are often suggested for this purpose. Fusion of sensor data from a set of heterogeneous or homogeneous sensors, soft sensors, and history values of sensor data is generally called direct fusion. However, there also exists indirect fusion, which uses information sources like prior knowledge about the environment and human input. Furthermore, it is possible to fuse the outputs of direct and indirect fusion. In the literature, different models for such hybrid systems have been described: [2], [4], [7], [10], [26].

III. CHARACTERISTICS OF HUMAN PERCEPTION

The aim of this paper is to introduce a new model for sensor fusion for the purpose of machine perception, which is based on research findings about the perceptual system of the human brain. To develop such a model, in a first step, characteristics of human perception have to be identified. Figure 1 gives an overview of important mechanisms and factors that form and influence human perception. The mentioned characteristics are derived from research results of neuroscience and neuro-psychology about the perceptual system of the human brain.

Fig. 1. Characteristics of Human Perception

Diverse Sensory Modalities
To perceive the external environment, our brain uses multiple sources of sensory information derived from several different modalities including vision, touch, and audition. The combination and integration of multiple sources of sensory information is the key to robust perception [10].

Parallel Distributed Information Processing
As just outlined, for perception, information from various sources is processed. However, the perceptual system is no unitary central unit that processes all information in one step. Instead, sensory information is processed in parallel [20].

Information Integration over Time
To perceive objects, events, and situations in an environment, single-moment snapshots of sensory information provided by different modalities are not always sufficient for unambiguous perception. The course and the succession of sensory signals over time are of importance [24].

Symbolic Information Processing
In the human brain, perceptual information from different modalities is processed by interacting neurons. However, humans do not think in terms of action potentials and firing nerve cells but in terms of symbols. Mental processes are often considered as a process of symbol manipulation [11].

Learning and Adaptation
The perceptual system of the human brain is not fully developed at birth. Although certain patterns need to be predefined by the genetic code, many concepts and correlations concerning perception are learned during lifetime [20].

Influence from Focus of Attention
According to the hypothesis of focused attention, what we see is determined by what we attend to. At every moment, the environment presents far more perceptual information than can be effectively processed. Attention can be used to select relevant information and to ignore irrelevant or interfering information. Instead of trying to process all objects simultaneously, processing is limited to one object in a certain area of space at a time [17].

Influence from Knowledge and Memory
Perception is facilitated by knowledge. Prior knowledge is often required for interpreting ambiguous sensory signals. Much of what we take for granted as the way the world is – as we perceive it – is in fact what we have learned about the world – as we remember it. Much of what we take for perception is in fact memory. We frequently see things that are not there, simply because we expect them to be there [12].

Emotional Evaluation
For perception, most often only the detection and processing of stimuli from the external environment are considered. However, the perception of objects, events, and situations makes little sense if we do not know what influence they have on our body. In the human brain, an evaluation of perceptual images is performed by emotions. The basic function of emotions in perception is to classify objects, events, and scenarios as good or bad. Emotions are necessary to react adequately to perceived objects, events, and situations [27].

IV. BIONIC MODEL FOR PERCEPTION

A. Neuro-symbolic Information Processing
To fulfill the first five characteristics mentioned in the last section, a new concept of information processing is introduced, which is called neuro-symbolic information processing. This concept is outlined in the following.

Neuro-symbols as Basic Processing Units
The basic information processing units for neuro-symbolic information processing are so-called neuro-symbols. The inspiration for the usage of such neuro-symbols came from the following observations made in neuroscience and neuro-psychology: It is generally accepted that information in the human brain is processed by neurons. These neurons are interconnected and exchange information with each other via a special neural code. However, humans do not think in terms of action potentials and firing nerve cells, but they think in terms of symbols. According to the theory of symbolic systems, the mind is a symbol system and cognition is symbol manipulation. Examples for symbols are objects, characters, figures, sounds, or colors used to represent

abstract ideas and concepts. Each symbol is associated with other symbols. Symbol manipulation offers the possibility to generate complex behavior [11].

To sum up, neurons can be regarded as basic information processing units on a physiological basis and symbols as information processing units on a more abstract level. An interesting question is whether there exists an interface between these two levels of abstraction. Actually, neurons have been found in the brain which respond exclusively to certain perceptual images. For example, neurons have been found in the secondary visual cortex that respond exclusively to the perception of faces. This fact inspired the usage of neuro-symbols. Neuro-symbols stand for perceptual images. Perceptual images are for example a face, a person, or a voice. A neuro-symbol has an activation state and is activated if the perceptual image that it represents is perceived in the environment. Neuro-symbols have a certain number of inputs and one output (see figure 2). Via the inputs, information about the activation state of other neuro-symbols or triggered sensory receptors is received. All incoming activation states are summed up. If this sum exceeds a certain threshold, the neuro-symbol is activated. The information about its activation state is transmitted via the output to other neuro-symbols it is connected to. To also consider input data arriving asynchronously and the succession of sensor signals over time, neuro-symbols contain mechanisms that allow the processing of input data arriving within a certain time window or the consideration of certain successions of incoming data.

Fig. 2. Function Principle of a Neuro-symbol

Neuro-symbolic Networks
To perform complex perceptive tasks, a certain number of neuro-symbols have to be interconnected. An important question is how these neuro-symbols should best be structured. As mentioned above, the structural organization of the perceptual system of the brain is taken as archetype. According to [20], the perceptual system of the brain has a cerebral organization as depicted in figure 3. Perception starts with information coming from sensory receptors. Afterwards, this information is processed in three levels, which are referred to as primary cortex, secondary cortex, and tertiary cortex. Each sensory modality of human perception has its own primary and secondary cortex. This means that in the first two levels, information of different sensory modalities is processed separately and in parallel. In the tertiary cortex, information coming from all sensory modalities is merged. This results in a unified multimodal perception. Examples for perceptual images in the primary cortex of the visual system would be simple features like edges, lines, or movement. Examples for the result of information processing in the primary cortex of the acoustic system are sounds of a certain frequency. A perceptual image of the secondary cortex of the visual system could be a face, a person, or an object. Examples for perceptual images in the acoustic system on this level would be a melody or a voice. An example for a task performed in the tertiary cortex would be to merge the perceptual visual image of a face and the perceptual auditory image of a voice to the perception that a person is currently talking. The somatosensory system of the brain (commonly known as tactile system) comprises in fact a whole group of sensory systems, including the cutaneous sensations, proprioception, and kinesthesis.

Fig. 3. Cerebral Organization of the Perceptual System of the Brain

In accordance with this organization of the perceptual system of the brain, in the proposed model, neuro-symbols are structured into so-called neuro-symbolic networks (see figure 4). For ease of writing, in the following, neuro-symbols are also simply referred to as symbols. In a first processing step, so-called feature symbols are extracted out of sensory raw data. Information processing on this level correlates with information processing performed in the primary cortex of the brain. In the next two steps, feature symbols are combined to sub-unimodal and unimodal symbols. These two levels correspond to the function of the secondary cortex of the brain. In connection with the somatosensory system of the brain it was mentioned that a sensory modality can consist of further sub-modalities. Therefore, there can exist a sub-unimodal level between the feature level and the unimodal level. By processing information in these two steps, different sensor types, for example different video cameras, cameras mounted at different positions, or different tactile sensors like floor sensors, motion detectors, and light barriers, can be merged to one unified modality. On the multimodal level, information from all unimodal symbols is merged to multimodal symbols. Examples and concept clarifications concerning the usage of neuro-symbols for concrete perceptive tasks can be found in [29] and [30].

Fig. 4. Structure of a Neuro-symbolic Network

Learning in Neuro-symbolic Networks
In neuro-symbolic networks, a great part of the information lies not only in the neuro-symbols themselves but in the connections between neuro-symbols. Therefore, the question of how to determine which neuro-symbols shall be connected and exchange information is very important. To answer this question, again, research findings from neuroscience are exploited. As outlined in [20], higher cortical layers can only evolve if lower levels have already developed. Besides this, certain correlations have to be innate and therefore predefined by genes. Guided by this description, in the proposed model, the lowest-level connections are predefined. This means that it is fixed at initial system start-up what feature symbols shall be extracted out of sensor data and how these feature symbols form sub-unimodal symbols. In contrast to the lower layers, correlations between higher layers can be learned from examples. Therefore, at initial system start-up, there exist no connections between the sub-unimodal level and the unimodal level and no connections between the unimodal level and the multimodal level. Suitable connections between these layers are learned by presenting a number of examples to the system, which cover all objects, events, and situations that the system shall perceive. Based on these examples, connections between sub-unimodal symbols and unimodal symbols are set in a first stage. After these connections have been set, the correlations between the unimodal symbols and the multimodal symbols are calculated in a further step. Objects, events, and situations are considered as perceptual images and each of them is assigned to one particular neuro-symbol. For each object, event, and situation, a number of examples is necessary, because deviations can occur in how they are represented by sensory data. Figure 5 illustrates the learning process between the sub-unimodal and the unimodal level of the tactile perceptual system. Similar to supervised learning in neural networks, input data and target data are presented to the system. Input data are sensor data from sensors that are triggered when a certain situation occurs. As the lower neuro-symbolic levels are already pre-connected, certain lower-level symbols are activated based on these sensor data. The target data are the particular higher-level symbol that is assigned to the currently occurring situation. During the learning phase, the system memorizes the examples. Based on these examples, it extracts correlations that exist between lower-level and higher-level symbols and sets connections between them accordingly. Besides forward connections from lower levels to higher levels, there can also exist feedback connections from higher levels downwards. These feedback connections can be determined and set by presenting the same examples as used for setting the forward connections to the system a second time. For an explanation of the necessity of feedback connections in perception see [30].

B. Interaction between Perception and Other Cognitive Processes
The neuro-symbolic information processing structure described until now can be regarded as the core of perception. The description given covers bottom-up information processing starting from sensor values. However, according to the last three points mentioned in section III when identifying characteristics of human perception, perception is also influenced by processes that are at least partly performed in areas lying outside the perceptual system of the brain. Perception is influenced and modified by memory, knowledge, and focus of attention. Additionally, emotional evaluations are important to classify perceptual images and give subjective meaning to them (see figure 6).
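The basic neuro-symbol mechanism (summed input activations compared against a threshold) can be sketched as follows. This is a minimal illustration in Python, not the authors' AnyLogic/Java implementation; all class, attribute, and symbol names are our own, and the time-window mechanisms mentioned above are omitted for brevity.

```python
class NeuroSymbol:
    """Sketch of a neuro-symbol: it becomes active when the summed
    activation of its connected lower-level symbols reaches a threshold."""

    def __init__(self, name, threshold):
        self.name = name
        self.threshold = threshold
        self.inputs = []      # lower-level NeuroSymbols feeding this one
        self.active = False

    def connect(self, source):
        self.inputs.append(source)

    def update(self):
        # Sum the activation states (here simply 0/1) of all inputs.
        total = sum(1 for s in self.inputs if s.active)
        self.active = total >= self.threshold
        return self.active

# Hypothetical feature symbols, driven directly by sensory receptors:
edge = NeuroSymbol("edge", threshold=1)
motion = NeuroSymbol("motion", threshold=1)

# A higher-level symbol that needs both features to fire:
person = NeuroSymbol("person", threshold=2)
person.connect(edge)
person.connect(motion)

edge.active = True
motion.active = True
person.update()   # "person" is now active
```

Transmitting the resulting activation state onward to further connected symbols, level by level, yields the bottom-up flow through a neuro-symbolic network described above.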

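The example-based learning of connections between symbol levels can likewise be sketched in a simplified form. The sketch below records which lower-level symbols co-occur with a presented higher-level target symbol and sets connections accordingly; function and symbol names are illustrative assumptions, and the paper's feedback-connection pass is not modeled.

```python
from collections import defaultdict

def learn_connections(examples, min_support=1):
    """Sketch of example-based connection learning between two symbol levels.

    examples: list of (active_lower_symbols, target_higher_symbol) pairs,
              as presented during the learning phase.
    Returns a dict mapping each higher-level symbol to the set of
    lower-level symbols it should be connected to.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for lower_active, target in examples:        # memorize the examples
        for symbol in lower_active:
            counts[target][symbol] += 1
    # Set a connection wherever a lower-level symbol co-occurred with the
    # target at least min_support times across the presented examples.
    return {target: {s for s, c in sym.items() if c >= min_support}
            for target, sym in counts.items()}

# Hypothetical tactile sub-unimodal symbols and a unimodal target:
connections = learn_connections([
    ({"footstep", "door_contact"}, "person_enters"),
    ({"footstep", "motion"}, "person_enters"),
])
```

Raising `min_support` keeps only correlations seen in several examples, which is one simple way to tolerate the deviations in sensory representation mentioned above.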
Fig. 5. Function Principle for Learning by Examples in Neuro-symbolic Networks

Influence from Memory and Knowledge
According to [12], perception is influenced by knowledge and memory in a top-down manner. Memory and knowledge cover factual knowledge, knowledge about the context where a situation occurs, past experience of what happened before, and expectation. By integrating knowledge into the perception process, perception can be facilitated and ambiguous sensor data can be resolved. Unfortunately, neuroscientists and neuro-psychologists do not yet agree on the question how and on what level the interaction between knowledge, memory, and perception takes place.

In the model, knowledge can influence the activation state of neuro-symbols. The interaction can principally take place on the sub-unimodal, the unimodal, or the multimodal level. This means that the activation state of neuro-symbols in these layers can be increased or decreased based on this cognitive information. Imagine for example that a room is monitored and the room is empty at the beginning. Now, the system can "know" that certain situations cannot occur as long as no person enters. Therefore, neuro-symbols that are correlated with activities performed by persons cannot be activated. The event that a person enters a room only takes a very short time. Accordingly, the corresponding neuro-symbol is also activated only for a brief moment. Therefore, there needs to exist a mechanism to memorize that a person entered the room, because this activity has influence on future perception. In the proposed model, this task is performed by memory symbols. Memory symbols interact with knowledge and can principally receive information from the sub-unimodal, the unimodal, or the multimodal level. They are set when a certain event happens in the environment and are reset when another event happens. Coming back to the example of an entering person, a memory symbol "person is present in the room" would be set if a person enters the room and would be reset if the person has left the room again.

Influence from Focus of Attention
If a number of situations happen in parallel in an environment, it can occur that lower-level symbols cannot be correctly assigned to higher-level symbols. This effect can also be observed in human perception if perception is overloaded. An overloading of perception means that there are too many perceptual stimuli present at a moment to integrate them all at once into a unified perception. In such a case, illusionary conjunctions occur. In the brain, a mechanism called focus of attention helps to correctly merge information coming from various sources. Similar as in human perception, such a mechanism of focus of attention also exists in the introduced model for humanlike perception. This mechanism constrains the spatial area in which perceptual images of different modalities are merged. In the model, focus of attention interacts with perception on the feature symbol level, which is the symbol level where neuro-symbols are topographic and therefore location dependent.

Fig. 6. Interaction between Perception and other Cognitive Processes

Emotional Evaluation of Perceptual Images
According to [27], besides the externally directed perception of the environment, there also exists an internal perception of the body by emotions. Emotions are akin to a sensory modality – an internally directed sensory modality that provides information about the current state of the body. Emotions are of importance for external perception, because they can be triggered by perceptual images of the outside world and can classify them as good or bad. Via a triggering of emotions, certain perceptual images can directly result in action tendencies. Therefore, a realization of emotions in technical systems can become important if human behavior shall be emulated. According to [5], primary emotions and secondary emotions can be distinguished. Primary emotions appear to consist in "hardwired" connections between certain external situations of biological significance and the subjective responses they evoke. Secondary emotions are "learned" connections between perceptual images and a response. In the presented model, it is possible to evaluate perceptual images represented as neuro-symbols by emotions represented as emotion symbols. In later development stages, these emotion symbols will provide an interface between the introduced perceptual system and an action system. In the model, there principally exist primary emotion symbols and secondary emotion symbols. Primary emotion symbols are already assigned to certain perceptual images at system start-up to support decision units using this model as a basis. Correlations between perceptual images and secondary emotion symbols have been designated to be learned from examples.

V. SIMULATION AND TESTING

To test and evaluate the presented model, the modeling language AnyLogic was used together with Java. The reason for choosing AnyLogic lies in the fact that this simulation environment offers certain design elements (active objects, ports, connectors, messages, interface and state variables, timers, state charts) that allow modeling and implementing neuro-symbols. Besides this, a modular hierarchical design is possible, and by using its graphic development environment, model development gets more comprehensible. These facts allow efficient testing. To test and evaluate the model, sensor data from various sources are necessary. For the provision of these sensor data, a simulator was used, which allows generating sensor values based on a virtual environment [14], [15]. This simulator was developed to simulate sensor values in order to perceive scenarios in a virtual office environment. The reason for simulating the sensor values is on the one hand the cost reduction for testing in comparison to real physical installations. On the other hand, the simulator allows evaluating which sensors are necessary to detect scenarios most effectively and efficiently and where they should be mounted. Simulation results show that processing information in a modular hierarchical fashion, considering the described bottom-up and top-down processes as well as feedbacks, is an adequate way for handling and interpreting sensory data.
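The set/reset behavior of the memory symbols described in section IV.B ("person is present in the room") can be made concrete with a short sketch. This is an illustration under our own naming assumptions, not the authors' implementation; the event names are invented.

```python
class MemorySymbol:
    """Sketch of a memory symbol: set by one event, reset by another,
    so that a short event can influence perception long after it ended."""

    def __init__(self, set_event, reset_event):
        self.set_event = set_event
        self.reset_event = reset_event
        self.active = False

    def observe(self, event):
        if event == self.set_event:
            self.active = True
        elif event == self.reset_event:
            self.active = False

# Memory symbol for the room-monitoring example:
person_present = MemorySymbol("person_enters", "person_leaves")
person_present.observe("person_enters")

# While the memory symbol is active, knowledge permits neuro-symbols for
# person-related activities; otherwise their activation is suppressed.
allow_person_activity = person_present.active
```

The briefly active "person enters" neuro-symbol thus leaves a persistent trace that knowledge can consult when raising or lowering the activation of other symbols.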

VI. CONCLUSION AND OUTLOOK

This paper suggested a model for emulating the perceptual system of the human brain to build next generation intelligent sensor fusion systems, e.g. for surveillance systems in buildings and to allow robots to perceive their environment autonomously. For this purpose, a new information processing principle was introduced, which is called neuro-symbolic information processing. According to this method, sensory data are processed by so-called neuro-symbolic networks to result in "awareness" of what is going on in an environment. The basic processing units of neuro-symbolic networks are neuro-symbols. Besides bottom-up information processing of sensory data, focus of attention, memory, and knowledge can influence the perceptive process. Additionally, a mechanism was suggested to evaluate perceptual images by emotions, which provide an interface between perception and action. Emotions can be regarded as internally directed perception. Perceptual images coming from the external world, represented by neuro-symbols, are grounded in sensory receptors. To provide these sensor data, a simulator was used. To ground emotion symbols, they would have to be connected to a visceral body. This body could principally also be simulated. To build such a simulator, a quite detailed model of visceral processes in the body would be necessary. However, such a model is not yet available.

Nevertheless, besides the problem of symbol grounding for emotions, the proposed model provides a powerful and flexible tool for information processing of sensor data. By making certain adaptations, neuro-symbolic information processing could also be used for other application domains. Subject to further work is the development of a toolbox for neuro-symbolic networks comparable to the existing toolboxes for neural networks, which will allow fast and comfortable development and testing. With this toolbox, the method of neuro-symbolic information processing shall become attractive for a broader group of users.

REFERENCES

[1] X. Bian, G. D. Abowd, J. M. Rehg, "Using Sound Source Localization to Monitor and Infer Activities in the Home," GVU Technical Reports, 2004.
[2] J. Beyerer, F. P. León, K. Sommer, "Informationsfusion in der Mess- und Sensortechnik," Universitätsverlag Karlsruhe, 2006.
[3] W. Burgstaller, "Interpretation of Situations in Buildings," Ph.D. dissertation, Vienna University of Technology, 2007.
[4] E. Datteri, G. Teti, C. Laschi, G. Tamburrini, P. Dario, C. Guglielmelli, "Expected Perception: An Anticipation-Based Perception-Action Scheme in Robots," in Proc. International Conference on Intelligent Robots and Systems, 2003, pp. 934–939.
[5] A. Damasio, "Descartes' Error - Emotion, Reason, and the Human Brain," Penguin Books, 1994.
[6] J. Davis, "Biological Sensor Fusion Inspires Novel System Design," in Proc. Joint Service Combat Identification Systems Conference, 1997.
[7] D. P. W. Ellis, "Prediction-driven Computational Auditory Scene Analysis," Ph.D. dissertation, Massachusetts Institute of Technology, 1996.
[8] W. Elmenreich, "Sensor Fusion in Time-Triggered Systems," Ph.D. dissertation, Vienna University of Technology, 2002.
[9] W. Elmenreich, "A Review on System Architectures for Sensor Fusion Applications," International Federation for Information Processing, 2007, pp. 547–559.
[10] M. O. Ernst, H. H. Bülthoff, "Merging the Senses into a Robust Percept," Trends in Cognitive Sciences, 2004, Vol. 8, pp. 162–169.
[11] R. M. French, "Review of The Engine of Reason, the Seat of the Soul," Minds & Machines, 1996, Vol. 6, pp. 416–421.
[12] E. B. Goldstein, "Wahrnehmungspsychologie," Spektrum Akademischer Verlag, 2002.
[13] S. O. Götzinger, "Scenario Recognition based on a Bionic Model for Multi-Level Symbolization," Master Thesis, Vienna University of Technology, 2006.
[14] H. Hareter, G. Pratl, D. Bruckner, "A Simulation and Visualization System for Sensor and Actuator Data Generation," in Proc. IFAC, 2005.
[15] H. Hareter, "Worst Case Szenarien Simulator für die Gebäudeautomation," Ph.D. dissertation, Vienna University of Technology, 2008.
[16] R. Harvey, K. Heinemann, "Biological Vision Models for Sensor Fusion," in Proc. 1st IEEE Conference on Control Applications, 1992, pp. 392–397.
[17] B. Hommel, B. Milliken, "Taking the Brain Serious: Introduction to, in, and across Perception and Action," Psychological Research, 2005.
[18] D. Joyce, L. Richards, A. Cangelosi, K. R. Coventry, "On the Foundations of Perceptual Symbol Systems: Specifying Embodied Representations via Connectionism," in Proc. 5th International Conference on Cognitive Modeling, 2003, pp. 147–152.
[19] R. C. Luo, M. G. Kay, "Multisensor Integration and Fusion in Intelligent Systems," IEEE Transactions on Systems, Man, and Cybernetics, 1989, Vol. 19, pp. 901–931.
[20] A. R. Luria, "The Working Brain - An Introduction to Neuropsychology," Basic Books, 1973.
[21] L. I. Perlovsky, B. Weijers, C. W. Mutz, "Cognitive Foundations for Model-based Sensor Fusion," in Proc. of SPIE, 2003, Vol. 5096, pp. 494–501.
[22] G. Pratl, "Processing and Symbolization of Ambient Sensor Data," Ph.D. dissertation, Vienna University of Technology, 2006.
[23] A. Richtsfeld, "Szenarienerkennung durch symbolische Datenverarbeitung mit Fuzzy-Logic," Master Thesis, Vienna University of Technology, 2007.
[24] A. L. Roskies, "The Binding Problem," Neuron, 1999, Vol. 24, pp. 7–9.
[25] H. Ruser, F. P. León, "Informationsfusion - Eine Übersicht," Technisches Messen, Oldenbourg Verlag, 2007, Vol. 74, pp. 93–102.
[26] J. Sillanpää, A. Klapuri, J. Seppänen, T. Virtanen, "Recognition of Acoustic Noise Mixtures by Combined Bottom-up and Top-down Processing," in Proc. European Signal Processing Conference EUSIPCO, 2000, Vol. 1, pp. 335–338.
[27] M. Solms, O. Turnbull, "The Brain and the Inner World - An Introduction to the Neuroscience of Subjective Experience," Other Press, New York, 2002.
[28] J. van Dam, "Environment Modelling for Mobile Robots: Neural Learning for Sensor Fusion," Ph.D. dissertation, University of Amsterdam, 1998.
[29] R. Velik, "A Multi-sensory, Symbolic, Knowledge-based Model for Humanlike Perception," in Proc. 7th International Conference on Fieldbus & Networks in Industrial & Embedded Systems, 2007.
[30] R. Velik, "A Model for Multimodal Humanlike Perception based on Modular Hierarchical Symbolic Information Processing, Knowledge Integration, and Learning," in Proc. Bionetics, 2007.