A Distributed Reasoning Engine Ecosystem for Semantic Context-Management in Smart Environments

Context inference becomes time-consuming when large amounts of data are involved, a common situation in modern ubiquitous computing environments with many sensors and devices available. To address this, Almeida and López-de-Ipiña have proposed an agent-based system architecture that splits the context-reasoning problem into smaller parts in order to reduce inference time [1]. The mechanism works by systematically dividing the inference problem and distributing the parts among different reasoning engines so that conclusions are reached sooner. The distributed system is implemented on the JADE framework, and the results show that parallelizing the context-reasoning engines achieves inference times up to 700% better than the centralized approach.

Three factors are considered when dividing an inference problem. First, inference problems are divided according to whether they concern a specific concept or a broad concept. The second factor is the location the measurements originate from. The last is the certainty factor associated with each ontological concept.

The system architecture consists of context providers and context consumers. A context provider is an agent that can supply context data; each provider has one context type, one location, and a certainty factor. A context consumer is an agent that consumes context data and has a set of interests defined by context types, locations, and minimum certainty factors.
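The provider/consumer matching described above can be sketched in plain Java. This is a hypothetical illustration, not the authors' JADE implementation: the `Provider`, `Interest`, and `match` names are assumptions introduced here to show how a consumer's interests (context type, location, minimum certainty) could be checked against the providers available in the environment.

```java
import java.util.ArrayList;
import java.util.List;

public class ContextMatch {
    // A context provider advertises one context type, one location,
    // and a certainty factor (hypothetical model of the paper's agents).
    record Provider(String contextType, String location, double certainty) {}

    // A consumer interest: desired context type, location,
    // and the minimum certainty it will accept.
    record Interest(String contextType, String location, double minCertainty) {}

    // Return the providers that satisfy a given interest: same type,
    // same location, and certainty at or above the consumer's minimum.
    static List<Provider> match(Interest interest, List<Provider> providers) {
        List<Provider> result = new ArrayList<>();
        for (Provider p : providers) {
            if (p.contextType().equals(interest.contextType())
                    && p.location().equals(interest.location())
                    && p.certainty() >= interest.minCertainty()) {
                result.add(p);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Provider> providers = List.of(
                new Provider("temperature", "kitchen", 0.9),
                new Provider("temperature", "kitchen", 0.5),
                new Provider("presence", "kitchen", 0.8));
        Interest interest = new Interest("temperature", "kitchen", 0.7);
        // Only the first provider matches: right type, right location,
        // certainty 0.9 >= 0.7.
        System.out.println(match(interest, providers).size()); // prints 1
    }
}
```

In the actual system each such match would be resolved between JADE agents exchanging messages rather than by a local method call; the sketch only captures the filtering criteria the summary describes.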

[1] Almeida, Aitor, and Diego López-de-Ipiña. “A Distributed Reasoning Engine Ecosystem for Semantic Context-Management in Smart Environments.” Sensors 12, no. 8 (July 30, 2012): 10208–27. doi:10.3390/s120810208.
