Mining Minds Master Presentation Contents MM V3.0 Contents / 2

Data Curation Layer | Information Curation Layer | Knowledge Curation Layer | Service Curation Layer | Supporting Layer

Data Curation Layer (DCL)

Curating Multimodal Sensory Data for Health and Wellness Platforms

Background / 4

• The focus of healthcare and wellness technologies has shifted significantly toward personal vital-sign devices

• Healthcare providers are focusing on when, where, and how care and support are delivered to the particular patient and service consumer

• Most prevalent diseases are partly caused or aggravated by poor lifestyle choices that people make in their daily routine

Image source: http://www.southcoasthealthandwellness.com/integrative-medicine/

Motivation for Data Curation / 5

• Advent of smart and personal devices

• Healthcare providers want to empower people to take care of their health and wellness by providing them with timely, ubiquitous, and personalized support

• Most current solutions are single-device focused and have a limited scope

• Unable to generate a context-rich user lifelog which can provide a holistic view of user activity and behavior

• A context-rich lifelog is also a low-cost way to acquire valuable user information on which effective interventions from healthcare professionals can be based

Image source: http://mobihealthnews.com/17827/docomo-omron-healthcare-launch-connected-health-venture-in-japan

Data Curation as a Framework (DCF) / 6

Responsibilities
1. Device-independent sensory data acquisition
2. Curation of a context-rich user lifelog
3. Monitoring of the user lifelog for push-based interventions
4. Support for the evolution and re-usability of sensory data
5. Integration as a core foundation of health and wellness platforms

Component functions
• Data Acquisition and Synchronization: acquisition of multimodal raw sensory data in real time; synchronization of multimodal raw sensory data in a distributed environment; preparation of data instances for context determination
• Data Curation: curation and persistence of user context in the form of a user lifelog; CRUD operations for the user lifelog; data evolution
• Data Persistence: non-volatile persistence of raw sensory data and the user lifelog in a Big Data environment; active interface to Big Data for visualization and analytics; passive interface to Big Data for data-driven knowledge generation
• Lifelog Monitor: monitoring of the user lifelog for situations to respond to; hosting and execution of static situations; incorporation and execution of dynamic situations

[Figure: Data Curation Framework architecture, comprising Data Acquisition and Synchronization, Lifelog Representation and Mapping, Lifelog Monitor, and Non-Volatile Sensory Data Persistence (HDFS/Hive)]

DCF as a Cloud Implementation / 7

• The ubiquitous nature of the cloud gives DCF the ability to acquire sensory data in different contexts and environments

• The cloud provides a central yet scalable computational resource that can accumulate sensory data from clients without being concerned with their computational abilities

• The cloud provides a hub for context curation and monitoring for anomaly detection

• To support the volume of data accumulated by DCF, the cloud provides a big data platform

Data Curation from Framework to Platform / 8

• The Data Curation Framework has been adopted as the foundation of the Mining Minds platform, as an independent layer called the Data Curation Layer (DCL)

• The responsibilities of DCL are directly aligned with the requirements of the Mining Minds platform

[Figure: DCL positioned within the Mining Minds platform, alongside the Service Curation Layer (Service Orchestrator, Recommendation Manager, Recommendation Interpreter), the Knowledge Curation Layer (data-driven and expert-driven knowledge creation and management), the Information Curation Layer (low- and high-level context awareness), the Supporting Layer (UI/UX, security and privacy, descriptive analytics), and big data storage and processing]

DCL High Level Architecture / 9

• Data Curation Layer (DCL) is currently in its 5th iteration for Mining Minds Version 3.x

DCL Execution Flow in Mining Minds Platform / 10

[Figure: DCL execution flow in the Mining Minds platform: data acquisition and synchronization, life-log representation and mapping, life-log monitoring, and distributed data persistence (intermediate database, big data storage, Hive) interact with the Information, Knowledge, and Service Curation Layers and the Supporting Layer through endpoints EP.1 to EP.7; online and offline processes are distinguished]

Related Work / 11

Insight1
• Contributions: an energy-efficient continuous sensing framework; uses wearable devices with a small data footprint; opportunistic (event- and interval-driven) sensing
• Limitations: limited devices; smaller data footprint; does not take the computational complexity of the process into account

NetBio2
• Contributions: assembles vast amounts of curated and annotated clinical and molecular data; big data technology for permanent persistence and core logic layers to make correlations between billions of data points; a rich set of APIs that enable clients to integrate their workflows and scenarios
• Limitations: only supports data repositories as data sources; support for the clinical domain only; offline and batch processing; no real-time physical-sensor-based continuous sensing

SAMI3
• Contributions: a Data-driven Development (D3) platform for receiving, storing, and sending data to/from IoT devices; any device can send data in various formats, which is then normalized into a JSON format and stored in the cloud
• Limitations: data exchange-centric implementation; no effective processing on accumulated data; no curation mechanism for data representation

DCL vs. State-of-the-art / 12

• DCL is a novel attempt to implement raw sensory data acquisition, curation, and monitoring over a cloud platform

• The sensory data acquisition services of DCL are independent of data sources

• With scalability in mind, numerous multimodal data sources can communicate in parallel, making it a more IoT-oriented implementation

• DCL considers all of the communicating devices as sources of raw sensory data, thus generating a context-rich user lifelog

• DCL implements situation detection on lifelog instances based upon expert-driven rules in correlation with user profiles, keeping the monitoring vigilant as well as personalized

• The computation over the accumulated data and lifelog is performed on a cloud platform, keeping the framework compatible with data sources of low computational ability

• From an evolutionary perspective, complex computational algorithms for context identification, data fusion, and mining can be implemented without disturbing client implementations

Distributed Data Storage

DCL Distributed Data Storage Terminologies / 14

• Terminologies

• Life-log Data

• Intermediate Data

• Raw Sensory Data

• Big Data

• Personalized Big Data

DCL Personalized Big Data Deployment / 15

Personalized Big Data

• Intermediate Data: life-log, user profile, and life-log monitoring data; technology: relational database (SQL Server); physical location: service provider

• Big Data: raw sensory data, 3D video, and historic life-log data; technology: big data store (Hadoop); physical location: service provider

Data Acquisition & Synchronization (DAS) Overview / 17

• Acquisition of heterogeneous data from multimodal data sources in real time is a must for the data curation layer

• This acquisition must be dynamic, parallel, and high-performance to support the influx of multimodal data in real time

• For reliable acquisition in real time, sensory data must be synchronized, resolving distributed clock issues (a sketch of the acquisition endpoint follows)
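The acquisition service itself is implemented in Node.js (see the implementation slide later in this section); purely as an illustration of the buffer-based, non-blocking acquisition idea, here is a minimal Java sketch using the JDK's built-in HTTP server. The /sensory endpoint, port, and packet format are hypothetical, not taken from the DCF code base.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class AcquisitionServiceSketch {
    // Raw packets are buffered here; a synchronizer thread drains this buffer.
    private static final BlockingQueue<String> rawDataBuffer = new LinkedBlockingQueue<>();

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        // A thread pool keeps one slow client from blocking the others.
        server.setExecutor(Executors.newFixedThreadPool(8));
        server.createContext("/sensory", exchange -> {
            try (InputStream in = exchange.getRequestBody()) {
                String packet = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                rawDataBuffer.offer(packet); // enqueue; processing happens elsewhere
            }
            exchange.sendResponseHeaders(202, -1); // 202: accepted for async processing
            exchange.close();
        });
        server.start();
    }
}
```

Motivation / 18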

• Sensory data is generated by the data sources every 3 seconds
  • Real-time data acquisition

• The Mining Minds platform uses data from multiple data sources for context determination
  • Data needs to be synchronized prior to context determination

• Context determination is a non-real-time process
  • Regulation and pipelining of the Sensory Data Queue for context determination

• Mining Minds is a distributed platform with parallel execution
  • Optimal resource utilization with no communication bottlenecks

Component Architecture / 19

[Figure: DAS component architecture: the smartphone gateway and 3D video camera gateway feed the sensory and video data buffers; the Sensory Data Synchronizer moves time-aligned data (raw sensory data, environment variables, 3D video) into the Sensory Data Queue; write threads forward messages to the ICL client and the big data client, with purge routines cleaning the buffers; all communication is asynchronous and non-blocking]

Execution Workflow / 20


Step 1:
• Sensory data from the smartphone is received asynchronously every 3 seconds
• Semantics of the video data from the Kinect are received via a laptop every 3 seconds

Execution Workflow / 21


Step 2:
• Sensory data buffer for smartphone-based sensory data
• Video data buffer for Kinect-based video semantics data

Execution Workflow / 22


Step 3:
• Data synchronized by timestamp is moved to the Sensory Data Queue
• Data items are purged from the sensory and video data buffers

Execution Workflow / 23


Step 4:
• A Sensory Data Queue instance is converted into an ICL message
• A write thread sends the message for context determination

Execution Workflow / 24


Step 5:
• In parallel, the Sensory Data Queue instance is converted into a big data storage message
• A write thread sends the message to the big data storage server

Sensory Data Acquisition Service Implementation / 25

Node.js

1. Built for HTTP support (device independence)
2. Support for real-time communication
3. Non-blocking I/O (asynchronous)
4. Highly scalable through event-based callbacks

[Figure: node.js Data Buffer Server: a single-threaded event loop delegates requests to POSIX async threads for non-blocking handling of web requests]

Sensory Data Synchronization Strategies / 26

1. Complete Sync
• Executes when all the required sensory data is received within the time window (t = 3 s)
• Supports highly accurate context determination
• Only possible when all the data sources are almost time- and communication-synced

2. Eager Sync
• Executes at regular intervals without depending on the data sources
• Supports real-time execution with no delays
• Ignores out-of-window packets, resulting in lower accuracy for context determination

3. Rendezvous Sync
• Executes when all the required sensory data is received, regardless of the time window
• Supports highly accurate context determination
• No guarantee of real-time execution due to synchronization delay

[Figure: each strategy assembles per-source segments over t = 3 s time windows in the sensory buffer]

(A sketch of the complete and eager strategies follows.)
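To make the trade-off concrete, the sketch below contrasts the core decision of the complete and eager strategies in Java: complete sync releases a window only when every required source has contributed, while eager sync releases whatever has arrived once the window expires. All class and field names are illustrative assumptions, not DCF code.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Collects one packet per source for a fixed time window (e.g., 3 s).
public class WindowSynchronizer {
    private final Set<String> requiredSources; // e.g., "smartphone", "kinect"
    private final long windowMillis;
    private final Map<String, String> window = new HashMap<>();
    private long windowStart = System.currentTimeMillis();

    public WindowSynchronizer(Set<String> requiredSources, long windowMillis) {
        this.requiredSources = requiredSources;
        this.windowMillis = windowMillis;
    }

    /** Complete sync: emit only when all required sources have arrived. */
    public synchronized Map<String, String> offerComplete(String source, String packet) {
        window.put(source, packet);
        return window.keySet().containsAll(requiredSources) ? drain() : null;
    }

    /** Eager sync: emit whatever has arrived once the window expires. */
    public synchronized Map<String, String> offerEager(String source, String packet) {
        long now = System.currentTimeMillis();
        if (now - windowStart >= windowMillis) {
            Map<String, String> out = drain();
            windowStart = now;
            window.put(source, packet); // the late packet opens the next window
            return out;
        }
        window.put(source, packet);
        return null;
    }

    private Map<String, String> drain() {
        Map<String, String> out = new HashMap<>(window);
        window.clear();
        return out;
    }
}
```

Contributions / 27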

• Acquisition of heterogeneous data in real time
• Synchronization of heterogeneous data per user and timestamp
• Buffered pipelining of data for context determination to avoid communication data stress
• Non-blocking I/O to avoid communication bottlenecks

Life-log Representation and Mapping (LLRM) Overview / 29

• Representation of the lifelog: a black box of the user's daily activities, goals, recommendations, and feedback with continuous semantics

[Figure: heterogeneous data feeds context determination, which populates the lifelog with activities and goals]

Motivation / 30

• The lifelog is based on context determined over heterogeneous data
  • A re-usable and extensible lifelog model supports the variety of data

• Fine-grained lifelog data access for the participating layers of the Mining Minds platform
  • Data Curation Services for online and offline CRUD operations

• Performance-oriented persistence for lifelog data
  • An object-to-relational model for lifelog persistence and access

Component Architecture / 31

[Figure: LLRM component architecture: Data Curation Services (Service, Information, Data, and Support Representation, each with a data I/O handler, retrieval handler, and persistence handler) sit in front of the Persistence Mapper (schema mapper, class instance mapper, curation mapper), the Storage Verifier (information verifier, constraints verifier), and Information Retrieval (query manager, data retrieval), all backed by the lifelog database]

Execution Workflow / 32

Step 1:
• Accepts persistence and retrieval requests for users' profile-related data
• Transforms and provides the requested data in a communicable format

Execution Workflow / 33

Step 2:
• Accepts persistence-only requests for recognized activities, locations, low-level context, and high-level context
• Persistence of this information is requested upon a change of context

Execution Workflow / 34

Step 3:
• Provides persistence and retrieval of recommendations, user profiles, and high-level context to the SCL

Execution Workflow / 35

Step 4:
• Provides recommendations and users' trends of context, activities, and nutrition for analytics
• Allows persistence of experts' recommendations and users' feedback

Execution Workflow / 36

Step 5:
• The persistence request is mapped to the schema classes of the corresponding layer
• The actual instances are mapped to the schema classes

Execution Workflow / 37

Step 6:
• The selected schema and its instances are verified for correctness
• Constraints on the instances and schema are checked, such as unique, primary, and foreign key constraints

Execution Workflow / 38

Step 7:
• The requested retrieval queries are created according to the corresponding layer's schema
• The data is retrieved and transformed into the requested format

ORM (Object-Relational Mapping) Implementation / 39

Streaming data path
• High performance and reusability for streaming data
• Bridge design pattern
• Multi-threaded programming
• Java language

Request-based path
• High performance for request-based communication
• Façade and bridge design patterns
• Java language

[Figure: heterogeneous data reaches the ORM library over socket communication and is persisted to the RDBMS through the object model; the Information Curation, Service Curation, and Supporting Layers access it via RESTful communication]

(An illustrative mapping sketch follows.)
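The slides name an ORM library and the Java language but no specific product; assuming a JPA-style mapping (standard javax.persistence annotations), a lifelog entry might be mapped roughly as follows. The entity, table, and column names are illustrative.

```java
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;

// Maps one lifelog row of the intermediate database to a Java object,
// so the layers can issue CRUD operations without hand-written SQL.
@Entity
@Table(name = "lifelog")
public class LifelogEntry {
    @Id
    @GeneratedValue
    private Long id;

    @Column(name = "user_id", nullable = false)
    private Long userId;

    @Column(name = "activity")
    private String activity;   // e.g., "sitting"

    @Column(name = "location")
    private String location;   // e.g., "office"

    @Column(name = "start_time")
    private java.sql.Timestamp startTime;

    // Getters and setters omitted for brevity.
}
```

Object Oriented Design Patterns Implementation / 40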

Façade Design Pattern
• Unified interface
• Easy to use
• Reduced coupling
• Encapsulation

Bridge Design Pattern
• Scalability for data storage
• Decouples the abstraction
• Division of responsibilities

(A sketch of the bridge decoupling follows.)
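A minimal Java sketch of how the bridge pattern can decouple the persistence abstraction from concrete storage back ends; the interface and class names are hypothetical:

```java
// Implementor: concrete storage back ends vary independently of the abstraction.
interface StorageBackend {
    void write(String record);
}

class RelationalBackend implements StorageBackend {   // e.g., SQL Server
    public void write(String record) { /* INSERT via JDBC */ }
}

class BigDataBackend implements StorageBackend {      // e.g., HDFS/Hive
    public void write(String record) { /* append to HDFS */ }
}

// Abstraction: persistence logic is written once against the bridge.
class LifelogPersistence {
    private final StorageBackend backend;
    LifelogPersistence(StorageBackend backend) { this.backend = backend; }
    void persist(String record) { backend.write(record); }
}

// Swapping storage technologies does not disturb client code:
// new LifelogPersistence(new BigDataBackend()).persist(record);
```

Life-log Monitoring (LLM) Overview / 42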

A lifelog persistently records and archives informational dimensions of a user's life experience in particular data categories. Life-log monitoring traces the 'threads' of an individual's life in terms of events to observe a situation over time.

[Figure: user's life experience, captured as a life-log, feeds life-log monitoring]

https://en.wikipedia.org/wiki/DARPA_LifeLog

Motivation / 43

• Recognize events from the stream of life-log
  • Identification of target events for efficient monitoring

• Dynamically accommodate critical conditions for monitoring
  • Accommodate critical situations on the basis of experts' opinions

• Detect alarming conditions from the stream of life-log and notify
  • Evaluate the alarming situation with respect to the user context and generate an intimation

Conceptual View / 44

[Figure: conceptual view. Offline process: monitoring events and constraints are configured into the intermediate database as configuration data. Online process: data acquisition and synchronization and life-log representation and mapping feed the Life-log Monitor, which filters satisfied events against high- and low-level context and notifies the user of alarming situations]

Component Architecture / 45

[Figure: Life-log Monitor architecture: the Knowledge Curation Layer shares monitoring events and constraint conditions (JSON) through the Knowledge Sharing Interface; the Monitor Event Configurator (event parser, event classifier, event manipulator, multi-event resolver, user situation verifier) and the Constraints Configurator (constraints manager) populate the configuration data in the intermediate database; the Situation Event Detector, Active Log Manager, Alarming Situation Evaluator, and Alarming Situation Notifier operate over life-log and profile data (UserID, age, gender; activity, duration) and notify the Service Orchestrator in the Service Curation Layer]

Offline Process: Register Monitoring Condition / 46

Step 1:
• Monitoring conditions are provided through the I-KAT tool in KCL and shared with DCL in JSON format
• The JSON is parsed into constituent units
• The parsed units are classified into monitoring events and stored in the configuration data

Offline Process: Register Monitoring Condition / 47

Step 2:
• Classified constraint conditions are provided for storage in the configuration data
• Constraint conditions are converted into key-value pair format and stored in the configuration data

(A parsing sketch follows.)
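To make the registration step concrete, here is a minimal Java sketch that parses one monitoring condition from JSON into key-value constraint pairs using the org.json library; the field names are assumptions modeled on the alert payload shown in the online process.

```java
import java.util.HashMap;
import java.util.Map;
import org.json.JSONObject;

public class MonitoringConditionParser {

    /** Splits one condition into key-value constraints for the configuration data. */
    public static Map<String, String> parse(String json) {
        JSONObject obj = new JSONObject(json);
        Map<String, String> constraints = new HashMap<>();
        constraints.put("activity", obj.getString("Activity"));
        constraints.put("location", obj.getString("Location"));
        constraints.put("duration", String.valueOf(obj.getInt("Duration")));
        return constraints;
    }

    public static void main(String[] args) {
        String condition = "{\"Activity\":\"Sitting\",\"Location\":\"Office\",\"Duration\":7200}";
        System.out.println(parse(condition)); // {activity=Sitting, location=Office, duration=7200}
    }
}
```

Online Process: Situation Monitoring / 48 Step-1 | Step-2 | Step-3 | Step-4 | Step-5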

Step 3:
• Data Representation and Mapping updates the recognized activity in the lifelog
• A database update trigger activates the Active Log Manager to verify the activity and condition for monitoring

Example alert payload:
{ "EventsAlert": [ { "User_ID": 100, "Activity": "Sitting", "Duration": 60 } ] }

Online Process: Situation Monitoring / 49 Step-1 | Step-2 | Step-3 | Step-4 | Step-5

Step 4:
• The Active Log Manager updates the monitoring situation after the constraints and monitoring events are verified
• Monitoring events and constraint conditions are verified against a user ID to keep the context

Online Process: Situation Monitoring / 50 Step-1 | Step-2 | Step-3 | Step-4 | Step-5

Step 5:
• The Alarming Situation Evaluator continuously monitors the ActiveLog over users' activities to filter out abnormal ones
• The Alarming Situation Notifier builds the JSON (UserID, Activity ID, Duration) and sends it to the Service Curation Layer to notify it of the user and the abnormal situation

Situation Detection Algorithm / 51

Situation Detection, Step 1: Population of the current life log
• Activation: an on-insert trigger on the life-log
• Retrieve the user, event, and start time
• Check whether the event is subject to monitoring
• Verify the user against the event's constraints
• Retrieve the target value of the event
• Populate the current life log with the user, event, start time, and target value

[Flowchart: event trigger, check for event, verify user/event constraints, retrieve target value, populate current life log]

Situation Detection Algorithm / 52

Situation Detection, Step 2: Monitoring of the current life log
• Monitoring: time-based
• Filter the users whose alarming condition is met
• Notify the user (display only)
• Search for the next available target condition of the event
• Update the current life log with the updated target value of the event
• If no further target value exists, remove the user from monitoring

[Flowchart: monitor current life log, check alarming condition, notify, check for next target, update target or remove user from monitoring]

(A sketch of both steps follows.)
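A compact Java sketch of this two-step logic under illustrative types (none of the names come from the DCF code base): step 1 admits a lifelog entry into the current life log when the event is monitored, and step 2 periodically sweeps it against the target values.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class SituationDetector {
    // One monitored entry: user, event, start time, and target duration.
    record Monitored(long userId, String event, long startMillis, long targetMillis) {}

    private final List<Monitored> currentLifeLog = new CopyOnWriteArrayList<>();

    /** Step 1: on-insert trigger admits the entry if the event has a registered constraint. */
    public void onLifelogInsert(long userId, String event, long startMillis, Long targetMillis) {
        if (targetMillis != null) {
            currentLifeLog.add(new Monitored(userId, event, startMillis, targetMillis));
        }
    }

    /** Step 2: time-based sweep notifies and drops users whose alarming condition is met. */
    public void monitoringCycle(long nowMillis) {
        for (Monitored m : currentLifeLog) {
            if (nowMillis - m.startMillis() >= m.targetMillis()) {
                System.out.printf("ALERT user=%d event=%s%n", m.userId(), m.event());
                // Simplification: the full algorithm looks up the next target value
                // and only removes the user when no further target exists.
                currentLifeLog.remove(m);
            }
        }
    }
}
```

Contributions / 53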

• Monitoring of the life-log for the occurrence of alarming situations in personal life in a real-time manner
• Dynamic accommodation of event constraints on the recommendation of experts
• Configuration of the life-log monitor with contributing factors / target variables

Big Data Storage Overview / 56

• Storing unstructured data from multimodal data sources in real time is a must for big data storage

• This storage must be non-volatile and allow CRUDS operations on data

• The data should be available for online processes such as Analytics and Visualization

• The data should be available for offline training of Data models for Knowledge Generation

• Big data storage should handle very large amounts of data and keep scaling to keep up with growth

Image source: http://bigdatablog.emc.com/2016/02/18/thought-leadership-big-data-conversation-with-steve-jones-from-capgemini/

Motivation / 57

• Store raw sensory data and environmental variables in large-scale non-volatile persistence (Big Data) with CRUD operations
  • Real-time data storage

• For model training and rule generation, KCL requires an interface to selected historic sensory data
  • Passive data read operations

• For analytics and visualization, SL requires real-time data reads based on provided parameters
  • Active data read operations

Component Architecture / 58

[Figure: Big Data Storage architecture: the Data Persistence component (life-log synchronizer, raw data writer) writes sensory data, environment variables, and life-log backups from the multimodal data sources and intermediate database into HDFS; the Active Data Reader (query loader, schema cache, data exporter) serves the Supporting Layer; the Passive Data Reader (query deployer, scan service, data exporter) serves the Knowledge Curation Layer; both read through Hive and the MetaStore over the physical data store]

Execution Workflow / 59

Step 1:
• Raw sensory data is received by the Data Writer of the Data Persistence component
• The data is de-serialized according to the message model

Execution Workflow / 60

Step 2:
• The de-serialized message is sent to HDFS for persistence

Execution Workflow / 61

Step 3:
• The data is written into HDFS

Execution Workflow / 62

Active read, step 1:
• The Active Data Reader is responsible for handling online data requests for data visualization and analytics
• The data read request is generated by the Visualization and Analytics components of the SL
• The Active Data Reader selects the particular query depending upon the query parameters

Execution Workflow / 63

Active read, step 2:
• The selected query is sent to the Physical Data Store for execution

Execution Workflow / 64

Active read, step 3:
• The requested data is returned as a result set to the Data Exporter
• The result set is converted into a data message per the defined data format and sent to the SL

Execution Workflow / 65

Passive read, step 1:
• A request for the schema of the persisted data is received from the KCL by the Passive Data Reader

Execution Workflow / 66

Passive read, step 2:
• The scanned, most up-to-date schema from non-volatile storage is returned to the KCL
• The KCL selects parameters from the schema to generate a query and submits it to the Passive Data Reader
• The Passive Data Reader selects the query and sends it to the Physical Data Store for execution

Execution Workflow / 67

Passive read, step 3:
• The required data is returned as a result set to the Data Exporter
• The result set is returned to the KCL

SQL on Big Data Storage (Hadoop Platform) / 68

• Built for REST services (request handling)
• Support for visualization and analytics, and for model training and rule generation
• Support for the most up-to-date data schema for model training and rule generation
• Highly scalable and interactive

[Figure: a REST service over Hive and HDFS-based physical data storage and data management serves both visualization & analytics and model training & rule generation]

Big Data for Analytics and Visualization / 69

1. A data read request is generated by the Visualization and Analytics components of the SL
2. The Active Data Reader selects the particular query depending upon the query parameters
3. The selected query is sent to the Physical Data Store for execution
4. The required data is returned as a result set to the Data Exporter
5. The result set is converted into a data message per the defined data format and sent to the SL

[Flowchart: request for data, query parameter match, query loader, Hive query, physical data store, data exporter]

(An illustrative sketch follows.)
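As an illustration of steps 2-5, the following sketch selects a pre-authored parameterized HiveQL query and executes it through the Hive JDBC driver (which must be on the classpath); the connection URL, table, and column names are assumptions.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Map;

public class ActiveDataReaderSketch {
    // Pre-authored queries keyed by the request's query parameter.
    private static final Map<String, String> QUERIES = Map.of(
        "activity_by_user",
        "SELECT activity, start_time FROM raw_sensory_data WHERE user_id = ?");

    public static void main(String[] args) throws Exception {
        // Hive JDBC endpoint of the physical data store (hypothetical host).
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://dcl-bigdata:10000/default");
             PreparedStatement ps = conn.prepareStatement(QUERIES.get("activity_by_user"))) {
            ps.setLong(1, 100);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // The data exporter would convert rows into the defined message format.
                    System.out.println(rs.getString("activity") + " @ " + rs.getString("start_time"));
                }
            }
        }
    }
}
```

Contribution / 70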

• Storage of heterogeneous data in real time
• Stream-based soft real-time data reads for analytics and visualization
• Schema-based query selection and execution over big data storage
• Availability of the most up-to-date schema of the persisted data
• Temporal backups of life-log data in non-volatile storage
• A big data ecosystem that facilitates requests from the other layers

Uniqueness & Contributions

Uniqueness and Contributions (1/4) / 72

• The ability to continuously sense raw sensory data from multimodal data sources in real time
  • Asynchronous, non-blocking Sensory Data Acquisition Service
  • Buffer-based pipelining for continuous data accumulation
  • Time-based buffer synchronization for instance creation

• Device-independent acquisition of raw sensory data; compatibility of DCF with IoT-based environments
  • HTTP- as well as socket-based service contracts for data acquisition

Uniqueness and Contributions (2/4) / 73

• DCF provides comprehensive curation of this context over a user lifelog

• The induction of a larger set of sensory devices results in a richer context

• This context-rich lifelog can be used in multidimensional ways; e.g., data accumulated from a smartphone, a smartwatch, and a depth camera can accurately identify the context of a user's posture in an environment

Uniqueness and Contributions (3/4) / 74

• DCF is equipped with a lifelog monitoring tool called LLM

• In comparison with device-based activity recognition, lifelog monitoring looks for situations over a richer context occurring over time

• For reliability, expert-driven rules provide intelligence to this monitoring

Uniqueness and Contributions (4/4) / 75

• DCF provides persistence to support the large volume of heterogeneous and multimodal raw sensory data associated with the lifelog

• This enables DCF to support the forthcoming concepts of data-driven knowledge generation, descriptive and predictive analytics, and visualization

Evaluation

Evaluation Environment / 77

• Evaluation matrix, with multimodal raw sensory data sources:

• Accuracy: accuracy of the synchronization process during raw sensory data acquisition
• Performance: performance testing during raw sensory data acquisition and synchronization
• Scalability: scalability testing during raw sensory data acquisition and synchronization
• Monitoring: lifelog monitoring with situation detection in real time
• Big Data Persistence: performance of sensory data persistence and retrieval over big data storage

1. Accuracy Evaluation / 78

Testbed • Smartphone (Samsung Galaxy s5) • Kinect (2nd Gen.) • DCL Server (64-bit Windows 8.1 operating system, 16 GB of RAM, and a 3.10 GHz AMD with 10 computing cores of 4C + 6G)

Test case • 100 data packets containing activity, location, voice from the smartphone • 100 data packets containing video data are sent from the PC connected with Kinect device

Results Summary:
• All the packets, sent at different times, were synchronized to the correct time window

2. Performance Evaluation / 79

Testbed • Data Sources • LG Watch R(TM) • Smartphone (Samsung Galaxy s5) x 2 • Kinect (2nd Gen.) • PC emulating an environmental sensor

• DCL Server (64-bit Windows 8.1 operating system, 16 GB of RAM, and a 3.10 GHz AMD with 10 computing cores of 4C + 6G)

Test case
• Rate of data packets: 10 to 10,000, incrementally
• Timeout deadline: 5 s

Results Summary:
• Able to accumulate and synchronize 10,000 packets from the five sensors within the provided timeout deadline
• The fastest time was recorded for 800 packets, with a total time of 3.15 s
• Time increased after 1,800 packets, reaching a total time of 3.31 s at 10,000 packets (51% faster than the provided deadline of 5 s)

3. Scalability Evaluation / 80

Testbed • Data Sources • Simulated from 5 - 320 • DCL Server (64-bit Windows 8.1 operating system, 16 GB of RAM, and a 3.10 GHz AMD with 10 computing cores of 4C + 6G)

Test case • Fixed packet size: 10,000 packets / data source • Packet size: 30 kb of sensory data

Results Summary:
• DCL was able to scale successfully from 5 multimodal data sources to 160
• For 160 data sources with the maximum of 10,000 packets per data source, DCF was able to accumulate, synchronize, and enqueue within 1.2 min (72,258.2 ms)
• In a stress test of DCL with 320 data sources, the total time increased exponentially to 41 min (2,470,064 ms)

4. Lifelog Monitoring Evaluation (1/2) / 81

Test scenarios and situation-detection rules (7-hour lifelog per user):

• User 1: IF act == "sitting" && loc == "office" && dur >= 2 hr THEN notify; IF act == "stretching" && loc == "gym" && dur >= 1 hr THEN notify
• User 2: IF act == "sitting" && loc == "office" && dur >= 2 hr THEN notify; IF act == "standing" && loc == "office" && dur >= 3 hr THEN notify
• User 3: IF act == "sitting" && loc == "home" && dur >= 2 hr THEN notify; IF act == "stretching" && loc == "gym" && dur >= 1 hr THEN notify

[Figure: per-user activity timelines (laying, stretching, walking, standing, sitting) across the 7 hours, with the generated notification points]

4. Lifelog Monitoring Evaluation (2/2) / 82

Test Scenarios, Situation-detection Rules, and Lifelog Notifications

User 4
• Rule 1: IF act == "stretching" && loc == "gym" && dur >= 1 hr THEN generate notification
• Rule 2: IF act == "walking" && loc == "home" && dur >= 1 hr THEN generate notification
• [Timeline chart: user 4 activity sequence over the 1st to 7th hour, with durations of 2 hr, 2 hr, 2 hr, and 1 hr and the resulting notification points]

User 5
• Rule 1: IF act == "laying" && loc == "home" && dur >= 4 hr THEN generate notification
• Rule 2: IF act == "sitting" && loc == "home" && dur >= 1 hr THEN generate notification
• [Timeline chart: user 5 activity sequence over the 1st to 7th hour, with durations of 4 hr, 1 hr, 1 hr, and 1 hr and the resulting notification points]

A minimal sketch of how such a situation-detection rule can be checked against the current context is shown below.
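The monitoring code itself is not shown in the deck; the following is a minimal sketch, under the assumption of a hypothetical SituationRule class, of how a rule such as "sitting at the office for at least 2 hours" could be evaluated on each monitoring cycle.

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical situation-detection check, mirroring the rule
// IF act == "sitting" && loc == "office" && dur >= 2 hr THEN notify.
public class SituationRule {
    final String activity;      // e.g., "sitting"
    final String location;      // e.g., "office"
    final Duration minDuration; // e.g., 2 hours

    SituationRule(String activity, String location, Duration minDuration) {
        this.activity = activity;
        this.location = location;
        this.minDuration = minDuration;
    }

    // Called once per monitoring cycle (e.g., every 3 s) with the user's
    // current context and the time at which that context started.
    boolean matches(String curAct, String curLoc, Instant contextStart, Instant now) {
        return activity.equals(curAct)
                && location.equals(curLoc)
                && Duration.between(contextStart, now).compareTo(minDuration) >= 0;
    }

    public static void main(String[] args) {
        SituationRule rule = new SituationRule("sitting", "office", Duration.ofHours(2));
        Instant start = Instant.parse("2015-05-14T12:00:00Z");
        Instant now = Instant.parse("2015-05-14T14:05:00Z");
        if (rule.matches("sitting", "office", start, now)) {
            System.out.println("Generate notification"); // push to the user
        }
    }
}
```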

Results Summary
• An interval duration greater than 1 s is appropriate for the evaluation environment
• Keeping the monitoring cycle interval at 3 s, LLM publishes notifications with 96.97% efficiency
• With a 5 s interval, notification publication efficiency is 95.84%

5. Big Data Persistence Evaluation (1/2) / 83

Testbed

• Read and write operations have been executed for three different raw sensory data sizes, i.e., 1 GB, 5 GB, and 9 GB

• The data is distributed over a private cloud infrastructure having four nodes:
• (i) Name node, equipped with an Intel Core i5 3.3 GHz and 4 GB of RAM
• (ii) First data node, equipped with an AMD 2.7 GHz and 2 GB of RAM
• (iii)-(iv) Third and fourth data nodes, equipped with an Intel Core i5 3.3 GHz and 2 GB of RAM

Results Summary
• The write operation is substantially faster than the read operation
• The time difference for both read and write is proportional to the volume of raw sensory data in big data storage
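The persistence code itself is not reproduced in the deck; the following is a minimal sketch, assuming a reachable HDFS cluster and the standard Hadoop FileSystem API (the cluster URI and file path are placeholders), of the kind of write and read operations being timed here.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch: timed write and read of raw sensory data over HDFS.
public class SensoryDataPersistence {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://name-node:9000"); // placeholder URI
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/mining-minds/raw/sensory-2015-05-14.dat"); // placeholder

        byte[] packet = new byte[30 * 1024]; // one 30 KB sensory packet

        long t0 = System.nanoTime();
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.write(packet); // write operation being benchmarked
        }
        long writeMs = (System.nanoTime() - t0) / 1_000_000;

        t0 = System.nanoTime();
        try (FSDataInputStream in = fs.open(path)) {
            in.readFully(0, packet); // read operation being benchmarked
        }
        long readMs = (System.nanoTime() - t0) / 1_000_000;

        System.out.printf("write: %d ms, read: %d ms%n", writeMs, readMs);
    }
}
```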

5. Big Data Persistence Evaluation (2/2) / 84

Test Queries
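The queries themselves are not listed in the deck; a hypothetical example of a lifelog query of moderate complexity, issued over Hive via its JDBC driver (the table and column names are illustrative only, not the actual Mining Minds schema):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical lifelog query over Hive; schema names are placeholders.
public class LifelogQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:hive2://name-node:10000/default", "hive", "");
             Statement stmt = con.createStatement()) {
            // Example: number of "sitting" lifelog entries per user and location.
            ResultSet rs = stmt.executeQuery(
                "SELECT user_id, location, COUNT(*) AS entries " +
                "FROM lifelog WHERE activity = 'sitting' " +
                "GROUP BY user_id, location");
            while (rs.next()) {
                System.out.printf("%s %s %d%n",
                        rs.getString(1), rs.getString(2), rs.getLong(3));
            }
        }
    }
}
```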

Results Summary
• Queries with varying complexities are performed over 1.7 GB of lifelog data maintained in big data storage
• Execution time depends substantially on both the complexity of the executed query and the amount of data

Summary / 85

• DCL focuses on curation of accumulated data from multimodal data sources in real time such that a context-rich user lifelog can be generated

• This lifelog offers a holistic view of user activity and behavior, which can be further utilized in multidimensional ways, including effective interventions from healthcare professionals.

• The data source-independent implementation of DCL makes it more scalable and IoT compatible.

• DCL monitors this lifelog of registered users for the detection of situations. This monitoring is able to integrate static, dynamic, and complex rules created by the expert.

• DCL incorporates multi-level abstraction of the data, depending upon its usage and persistence. • Frequently required user lifelog and profile data is maintained in an intermediate database, whereas the historical and raw sensory data is maintained in non-volatile storage provided by big data technologies.

• DCF has been evaluated for performance, scalability, accuracy of the synchronization process for raw sensory data from multimodal data sources, monitoring of the user lifelog, and data persistence. • It is evident that DCF's implementation performs efficiently and effectively in realistic situations and scenarios while integrating with a health and wellness platform as the client.

Information Curation Layer (ICL)

Awareness of Low-level and High-level Context for Health and Wellness Platforms

Background / 87

• Healthcare and wellness platforms require human behavior information to provide services based on human health status and personalized life routines.

• Human behavior information includes physical activities, emotions, location, nutrition consumption, and more complex situations.

• Most relevant research focuses on awareness of human-centric context with high accuracy.

Image source : http://danieleizans.com/2011/01/context-in-content-strategy-personal-behavioral-context/ Motivation for Information Curation / 88

• Exploding big data from smart and personal devices • Healthcare platforms can utilize this big data to gain a better understanding of user health status

• Most current solutions for recognizing user context are based on a single data source, with shallow methodologies • They are not able to generate comprehensive user information from raw multimodal big data

• A hierarchical, structured framework for recognizing human behavior from the exploding big data is required

Image source: http://mobihealthnews.com/17827/docomo-omron-healthcare-launch-connected-health-venture-in-japan Information Curation as a Framework (ICL) / 89

[Component diagram: Information Curation Layer]

High Level Context-Awareness
• Verify the ontological consistency of the unclassified high-level context and perform ontological reasoning to classify the high-level context
• Provide persistent storage for the Context Ontology and context instances, as well as management of the interactions with the persisted context information
• Transform low-level contexts into ontological format, align them temporally, and create unclassified high-level context instances from concurrent low-level contexts

Low Level Context-Awareness
• Activity decision fusion for the combination of the activities identified by each recognizer into a single one
• Emotion decision fusion for the combination of the emotions identified by each recognizer into a single one
• Activity recognition based on smartphone inertial data (3D ACC, 3D GYR), smart watch inertial data (3D ACC, 3D GYR), and depth video data (3D POS)
• Personal location and nutrition identification based on GPS and tagging data
• Emotion recognition based on the audio data and image data (WAV, JPG) generated during regular video phone calls
• Distribution of synchronized compatible sensory data to each low-level context recognizer

ICL High Level Architecture / 90

[Architecture diagram: ICL]

High Level Context-Awareness: Context Ontology Manager, High-Level Context Reasoner (Context Classifier, Context Verifier), High-Level Context Builder (Context Instantiator, Context Synchronizer, Context Mapper), Context Query Manager, Context Ontology Storage, and High-Level Context Notifier

Low Level Context-Awareness: Activity, Emotion, Location, and Food Unifiers and Notifiers over the individual recognizers (position-dependent and position-independent inertial activity, video activity, video, audio, and physiological emotion, geopositioning, image-based and tag-based location, and food recognizers), each built from Input Adapter, Preprocessing, Segmentation, Feature Extraction, Classification, and Output Adapter stages, all fed by the Sensory Data Router

Input sources: smartphone and smart watch inertial sensors, smartphone camera, Kinect camera, physiological and wearable sensors, smart cup, and Google Glass

ICL Execution Flow in Mining Minds Platform / 91

Related Work / 92

iHealth
• Contributions: tracks user behavior and calorie consumption with iWatch, iPhone, and bio-sensors; the system provides visualization services for easily understanding health status
• Limitations: only considers simple activities and information; limited service contents based on low-level information

Noom
• Contributions: suggests diet plans with goals and missions; nutrition-centric services; tracks user behavior and provides expert recommendation services
• Limitations: only considers step information for physical activities; no comprehensive context-awareness mechanism

SAMI
• Contributions: Data-driven Development (D3) platform for receiving, storing, and sending data to/from IoT devices; any device can send data in various formats, which is then normalized into a JSON format and stored in the cloud
• Limitations: only considers health-related context; no curation mechanism for data representation

ICL vs. State-of-the-art / 93

• ICL is unique in that it tries to recognize comprehensive user context from personal big data through hierarchically structured models • Each recognizer of ICL works independently; recognizers can be added and removed with ease, similar to the concept of plug and play • Considering scalability with respect to the number of users and contexts to be recognized, ICL is implemented in a completely distributed way for each user and for each recognizer • ICL possesses low-level recognizers for the activity, emotion, and location of users, built with state-of-the-art methodologies • ICL infers the high-level context of a user from low-level user contexts, using the Mining Minds Context Ontology and an inference mechanism that allows ICL to perform inference on the low-level contexts of a large number of users

Inertial Sensor based Activity Recognition

Introduction / 95

• A life management system provides health-related information and services to the user • The user is the key factor of the system • The system collects data related to the user

• User context awareness is fundamental in this regard • Various valuable information can be acquired • Location, situation, activity, etc. • Activity recognition is the cornerstone of context awareness • User context can be inferred based on the user's activity information

Motivation / 96

• Motivation
• Inertial sensor based activity recognition has long been used, but works well only in limited conditions with a few designated activities
• Robustness and reliability must be achieved across a range of diverse activities

• Goal
• Create a model for recognizing a user's activities in a highly accurate and robust manner

• Objectives
• Accurately recognize several diverse and commonplace activities
• Build two separate AR models for position dependence and position independence
• Achieve robustness and reliability through a fusion technique

Related Works / 97

Chun Zhu et al. [1], 2009. Sensor placement: waist, foot. Sensor type: ACC, Gyro. Techniques: fusion of two feed-forward neural networks; heuristic segmentation. Limitations: few basic activities; offline evaluation.

Jun-Ki Min et al. [3], 2011. Sensor placement: head, two arms, two wrists, fingers. Sensor type: ACC, Gyro, skin temperature, heat flux, galvanic skin response. Techniques: dynamic feature selection; outputs of classifiers are combined and compared. Limitation: the device is too bulky to use in real life.

Lei Gao et al. [4], 2011. Sensor placement: waist, chest, thigh, side. Sensor type: ACC. Techniques: considered sensor orientation change using an estimate of the constant gravity vector. Limitation: only used ACC.

Ming Zeng et al. [2], 2014. Sensor placement: free. Sensor type: ACC, GPS, speed, ambient light. Techniques: builds separate models for each activity; feature transformation; sensor faults are considered. Limitation: heavyweight system.

Muhammad Shoaib et al. [5], 2014. Sensor placement: upper arm, wrist, waist, two pockets on pants. Sensor type: ACC, Gyro. Techniques: considered orientation independence; compared sensor types and feature sets. Limitation: few basic activities.

Challenges / 98

• Position dependency
• Some sensor devices are attached at a fixed position, while others are not, such as a smartphone inside a pocket
• Satisfying these two different characteristics in one AR model is hard to accomplish

• Achieving reliability
• An AR model is usually built under limited conditions and does not show the expected performance in a real environment

• Achieving robustness
• In multiple-sensor-based AR, a single AR model that uses all sensor values at the same time does not work well if one of the sensors fails

Sequence Diagram for inertial sensor based AR / 99

sd Interaction

Inertial Activity Recognizer Signal Preprocessor Signal Segmenter Feature Extractor Classifier

recognizeLowLevelContext(rawSensoryData)

preprocess(rawSensoryData)

:preprocessedSensoryData

segment(preprocessedSensoryData)

:segmentedSensoryData

extractFeatures(segmentedSensoryData)

:extractedFeatures

classify(extractedFeatures)

:activityLabel

Architecture in Mining Minds / 100

Proposed method / 101

• Use multiple activity recognizers • Solves the position problem • Robust to sensor failure • The final decision is made by decision fusion • Achieves the highest possible accuracy (a minimal fusion sketch is shown below)
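The fusion rule is not spelled out on the slide beyond weighting and majority voting; the following is a minimal sketch, assuming equal weights per recognizer, of combining the per-recognizer labels into a final activity label.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal decision-fusion sketch: weighted majority voting over the
// labels produced by the smartphone and smartwatch recognizers.
public class DecisionFusion {
    static String fuse(List<String> labels, List<Double> weights) {
        Map<String, Double> votes = new HashMap<>();
        for (int i = 0; i < labels.size(); i++) {
            votes.merge(labels.get(i), weights.get(i), Double::sum);
        }
        // Pick the label with the largest accumulated weight.
        return votes.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse("unknown");
    }

    public static void main(String[] args) {
        // Six recognizer outputs (three per device), equally weighted.
        List<String> labels = List.of("walking", "walking", "running",
                                      "walking", "sitting", "running");
        List<Double> weights = List.of(1.0, 1.0, 1.0, 1.0, 1.0, 1.0);
        System.out.println(fuse(labels, weights)); // prints "walking"
    }
}
```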

[Diagram: an input adapter distributes sensor values to three smartphone recognizers (ACC; ACC + GYR; ACC + GYR + MAG) and three smartwatch recognizers (ACC; ACC + GYR; ACC + GYR + MAG); output adapters emit the activity labels from the smartphone and smartwatch, which decision fusion combines into the final activity label]

Flow chart (Abstract) / 102

[Flow chart: for each device, Input Adapter, then per-recognizer pipelines of Preprocessing/Segmentation, Feature Extraction, Feature Selection, and Classification, then Output Adapter; Decision Fusion combines the outputs into the activity label, e.g., walking]

Input Adapter / 103

• Distribute sensor values to each recognizer

[Example: the Input Adapter receives a packet (ACC: 0.4, 0.7, 1.6; GYR: 1.4, 2.1, 5.6; MAG: 1.9, 4.3, 4.5; Time: 2015-05-14 14:00:00; Device ID: 123456789) and distributes ACC to Recognizer 1, ACC + GYR to Recognizer 2, and ACC + GYR + MAG to Recognizer 3, each running Preprocessing, Feature Extraction, Feature Selection, and Classification]

Flowchart / 104

[Flowchart: multiple recognizers feeding decision fusion]

MM v3.0 Plan - Inertial Based AR (Tae Ho) / 105

• New sensor inclusion
 Recognition for the leg, to achieve (1) multiple activity recognition and (2) position independent recognition
 A Shimmer or a different kind of sensor device will be included
• Multiple activity recognition
 Recognition based on the upper side (arm) and lower side (leg), e.g., eating while sitting, sweeping while walking
• New activity inclusion (TBD)
 Types of office work, e.g., using a laptop, writing

Video based Activity Recognition

Overview / 107

• Recognizing human activity is one of the important areas of computer vision and artificial intelligence today.

• Human activity recognition has found many applications in video surveillance, human-computer interaction, robotic control, and health care.

• Due to the natural limitations of traditional color cameras in object detection, pose estimation, and action recognition, the depth camera, with its evident advantages, is considered a potential solution.

Motivation / 108

• Natural limitations of color cameras in object detection and pose estimation • Depth camera supports 3D skeleton estimation

• Complex indoor activities with different posture representations • Combining skeleton-based spatial features with descriptive statistics features in the temporal dimension

• Performance balance between accuracy and processing time • Feature selection for feature dimension reduction Related works / 109

Gu, 2010, 3D, 8 single actions, 94%. Key points: Hidden Markov Model. Limitation: high complexity.

Ofli, 2013, 3D, 12 single actions, 80%. Key points: annotated joint features. Limitation: low recognition rate.

Vantigodi, 2013, 3D, 12 single actions, 96%. Key points: temporal joint distance features.

Kruthiventi, 2014, 3D, 12 single actions, 97%. Key points: dynamic time warping.

Wang, 2014, 3D, 12 single actions, 95%. Key points: actionlet ensemble model. Limitation: high complexity and computational cost.

• Single action: hand catching, forward punching, two hand waving, forward kicking, high throwing, etc.

Component Architecture / 110

Video Activity Recognizer

Output Adapter

Classification

Feature Extraction

Segmentation

Preprocessing

Input Adapter

Execution Workflow / 111

Video-based Activity Recognizer

[Workflow diagram: (1) Input Adapter, (2) Segmentation, (3) Feature extraction and Feature selection, (4) Classification, (5) Output adapter]

Step 1 (Input Adapter):
• Skeleton data maintained in DCL is transferred to ICL
• The data contains the 3D coordinates of a complete 25-joint skeleton

Execution Workflow / 112

Video-based Activity Recognizer

Step 2 (Segmentation):
• Segment the skeleton data stream, arriving at a frame rate of 30 fps, into 3-second windows for recognition
• The number of frames per window is reduced from 90 to 30 to speed up processing

Execution Workflow / 113

Video-based Activity Recognizer

Step 3 (Feature extraction):
• Calculate joint distance and joint angle features from the 3D skeleton data
• Calculate the mean and standard deviation of the joint distances and angles (a minimal sketch follows)
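The exact feature definitions are not given on the slide; the following is a minimal sketch, assuming joints are 3D points, of a pairwise joint distance, the angle at a middle joint (e.g., the elbow), and the per-window mean and standard deviation statistics.

```java
// Minimal sketch of skeleton features: pairwise joint distance, the angle
// at a middle joint, and per-window mean/standard deviation statistics.
public class SkeletonFeatures {
    static double distance(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Angle (radians) at joint b formed by the segments b->a and b->c.
    static double angle(double[] a, double[] b, double[] c) {
        double[] u = {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
        double[] v = {c[0] - b[0], c[1] - b[1], c[2] - b[2]};
        double dot = u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
        double nu = Math.sqrt(u[0] * u[0] + u[1] * u[1] + u[2] * u[2]);
        double nv = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return Math.acos(dot / (nu * nv));
    }

    // Mean and standard deviation of one feature over the frames of a window.
    static double[] meanStd(double[] values) {
        double mean = 0;
        for (double v : values) mean += v;
        mean /= values.length;
        double var = 0;
        for (double v : values) var += (v - mean) * (v - mean);
        return new double[]{mean, Math.sqrt(var / values.length)};
    }

    public static void main(String[] args) {
        double[] shoulder = {0, 1.4, 0}, elbow = {0, 1.1, 0.1}, wrist = {0, 0.9, 0.4};
        System.out.println(distance(shoulder, wrist));
        System.out.println(Math.toDegrees(angle(shoulder, elbow, wrist)));
    }
}
```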

Execution Workflow / 114

Video-based Activity Recognizer

Step 3 (Feature selection):
• Ranks the features using an independent evaluation for binary classification
• Selects the most important features to reduce the processing time of online recognition (a hedged Weka sketch follows)
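The slide does not name the evaluation measure; the following is a minimal sketch, assuming an information-gain ranker over a Weka dataset (the ARFF path and the number of selected features are placeholders).

```java
import weka.attributeSelection.AttributeSelection;
import weka.attributeSelection.InfoGainAttributeEval;
import weka.attributeSelection.Ranker;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

// Minimal feature-ranking sketch with Weka: score every attribute and
// keep only the top-k to cut online recognition time.
public class FeatureRanking {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("skeleton-features.arff"); // placeholder
        data.setClassIndex(data.numAttributes() - 1);

        AttributeSelection selector = new AttributeSelection();
        selector.setEvaluator(new InfoGainAttributeEval()); // assumed measure
        Ranker ranker = new Ranker();
        ranker.setNumToSelect(30); // e.g., keep 30 features
        selector.setSearch(ranker);
        selector.SelectAttributes(data);

        Instances reduced = selector.reduceDimensionality(data);
        System.out.println("Kept " + (reduced.numAttributes() - 1) + " features");
    }
}
```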

Execution Workflow / 115

Video-based Activity Recognizer

Step 4 (Classification):
• Train the SVM classification model using the Weka tool
• The learned model is used for online recognition

Execution Workflow / 116

Video-based Activity Recognizer

Step 5 (Output adapter):
• The result of the classifier is the label of the recognized action
• Provides the video-based activity label to the Activity Unifier

Contribution / 117

• Collect a new dataset for evaluating video-based activity recognition using the MS Kinect sensor • Develop an efficient algorithm for video-based activity recognition using 3D skeleton data • Achieve a good trade-off between accuracy and computational cost for real-time recognition

Plan for MM v3.0 - Video Based AR (Thien) / 118

• Multiple activity recognition
• Extend to new indoor activities
• Using a computer (office work)
• Stretching
• Improve recognition accuracy as much as possible

Audio Based Emotion Recognition

Introduction / 120

• Emotion recognition has become increasingly important for the provision of smart healthcare services
• Depression monitoring
• Recommending services based on the emotional state
• Analyzing satisfaction with services

• Emotion recognition is one of the important factors for inferring a personalized user context

[Diagram: the smartphone records the voice, the audio data stream is processed on the Mining Minds cloud platform, and analysis of the audio signal yields the emotional state]

Motivation / 121

• Why audio for emotion recognition?
• Speech is the most commonly used and most natural method of human communication
• Recently, people frequently communicate with each other by speech on their own machines (phone, PC, etc.)
• A phone call, especially, is a powerful data source for recognizing emotion

• Issues of real-time speech emotion recognition on phone calls
• How to gather emotional audio data sources
• How to recognize user emotion in real time from the conversation in a phone call

[Image: various emotion expressions on a phone call]

Goal and Objectives / 122

• Goal: design and implement a methodology that is able to recognize emotional states from user speech in phone calls

• Objectives:
o Achieve acceptable accuracy
o Develop smartphone call-speech-based emotion recognition in real time
o Collect a real-world and diverse call speech dataset

Related Works – Datasets / 123

AIBO database (Batliner et al., 2004) [1]: Audio. Elicitation: natural (children interacting with a robot). Emotional content: bored, emphatic, helpless, ironic, joyful, motherese, reprimanding, rest, touchy, among others. Size: 110 dialogues, 29,200 words.

Berlin Database (Burkhardt et al., 2005) [2]: Audio. Elicitation: acted. Emotional content: anger, boredom, disgust, fear, happiness, sadness, neutral. Size: 493 sentences; 5 actors and 5 actresses.

ISL meeting corpus (Burger et al., 2002): Audio. Elicitation: natural (meeting corpus). Emotional content: negative, positive, neutral. Size: 18 meetings; on average 5 persons per meeting.

Adult Attachment Interview database (Roisman, 2004) [3]: Audio-visual. Elicitation: natural (subjects were interviewed to describe their childhood experiences). Emotional content: 6 basic emotions, contempt, embarrassment, shame, general positive and negative emotions. Size: 60 adults; each interview 30-60 minutes long.

Belfast database (Douglas-Cowie et al., 2003) [4]: Audio-visual. Elicitation: natural (clips taken from television and realistic interviews with the research team). Emotional content: dimensional labeling/categorical labeling. Size: 125 subjects; 209 clips from TV and 30 from interviews.

Busso-Narayanan database (Busso et al., 2007) [5]: Audio-visual. Elicitation: acted. Emotional content: anger, happiness, sadness, neutral. Size: 612 sentences; one actress.

IEMOCAP: interactive emotional dyadic motion capture database (Busso et al., 2008) [6]: Audio-visual. Elicitation: acted. Emotional content: happiness, sadness, anger, and frustration. Size: recordings of 10 actors with motion capture.

Haq-Jackson database (Haq & Jackson, 2009) [7]: Audio-visual. Elicitation: acted (emotion stimuli shown on screen). Emotional content: 6 basic emotions, neutral. Size: 480 sentences; 4 male subjects.

Related Works – Methodologies / 124

Lee et al., 2011 [8]: AIBO dataset; prosody, MFCC + statistical functions; hierarchical Bayesian logistic; 5 classes; 48.2%.

Purnima Chandrasekar et al., 2014 [9]: EmoDB; MFCC, pitch, energy; SVM; 7 classes; 86.6%.

Kun Han et al., 2014 [10]: IEMOCAP (interactive emotional dyadic motion capture database); segmented MFCC feature vectors; deep neural network; 4 classes; 54.3%.

Arianna Mencattini et al., 2014 [11]: EMOVO; 520 features (divided into 12 groups): TEO, energy sequence, wavelet approximation coefficients, etc.; Support Vector Machine (SVM); 7 classes; 72%.

Wang et al., 2015 [12]: EmoDB, CASIA; MFCC, Fourier parameters; SVM; 6 classes; 88.9% (EmoDB), 79% (CASIA).

Poria et al., 2015 [13]: audio-visual (eNTERFACE); V: characteristic points, distances; A: MFCC, spectral features; SVM; 6 classes; 81.2% (V), 78.6% (A), 87.95% (AV).

Amiya Kumar et al., 2015 [14]: 600 speech samples by 5 speakers; MFCC, LPCC, MEDC; SVM; 7 classes; 82.26%.

Limitations of existing works / 125

• Focus on evaluation based on formatted speech databases
• EmoDB, eNTERFACE, Berlin Emotion DB, etc.

• Lack of preprocessing to support real-time processing
• Silence removal, user speech signal extraction

ICL System Specification / 126

Sequence Diagram

sd Interaction

Audio Emotion Recognizer Signal Preprocessor Signal Segmenter Feature Extractor Classifier

Recognize emotion based on audio data (Usecase ICL2-SUC-08)

recognizeLowLevelContext(rawSensoryData)

preprocess(rawSensoryData)

:preprocessedSensoryData

segment(preprocessedSensoryData)

:segmentedSensoryData

extractFeatures(segmentedSensoryData)

:extractedFeatures

classify(extractedFeatures)

:emotionLabel

Architecture / 127

Audio Emotion Recognizer (Usecase ICL2-SUC-08: recognize emotion based on audio data)

Output Adapter
Classification
Feature Extraction
Segmentation
Preprocessing
Input Adapter

Workflow – Emotion Recognizer / 128

[Workflow diagram: real-time audio stream, Input Adapter (audio stream gathering), Preprocessing (silent remover, audio data buffer management), window-based Segmentation into a single speech segment, Feature Extraction (MFCC filter bank, time-domain features) producing the feature vector [f1, f2, …, fN], and Classification (training and classifying against classification models) yielding the emotion label, e.g., HAPPY]

Preprocessing - Silent Remover / 129

• The preprocessing module removes uninformative data using the Silent Remover • Non-speech sections are uninformative data from which no emotion can be recognized • Silent Remover: removes the segments of the signal that do not contain speech content, using only a dB threshold (the value is 15 dB: regular whispering speech) • Input: recorded raw audio signal • Output: refined audio signal without silent sections (a minimal sketch follows)
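The slide gives only the 15 dB threshold; the following is a minimal sketch, assuming 16-bit PCM frames and a conventional RMS-based dB estimate (the exact reference level used in Mining Minds is not specified).

```java
import java.util.ArrayList;
import java.util.List;

// Minimal silent-remover sketch: drop frames whose RMS level, expressed
// in dB, falls below a threshold (15 dB on the slide).
public class SilentRemover {
    static double frameDb(short[] frame) {
        double sumSq = 0;
        for (short s : frame) sumSq += (double) s * s;
        double rms = Math.sqrt(sumSq / frame.length);
        return 20.0 * Math.log10(rms + 1e-9); // dB relative to 1 LSB (assumed reference)
    }

    static List<short[]> removeSilence(List<short[]> frames, double thresholdDb) {
        List<short[]> voiced = new ArrayList<>();
        for (short[] f : frames) {
            if (frameDb(f) >= thresholdDb) voiced.add(f); // keep speech frames only
        }
        return voiced;
    }

    public static void main(String[] args) {
        short[] loud = new short[160], quiet = new short[160];
        java.util.Arrays.fill(loud, (short) 2000);
        List<short[]> kept = removeSilence(List.of(loud, quiet), 15.0);
        System.out.println(kept.size() + " voiced frame(s)"); // 1
    }
}
```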

[Figure: audio signal before and after silence removal, with the formula used for dB value extraction]

Real-time Window-based Segmentation / 130

• Real-time window-based segmentation
• Input: real-time audio stream data
• Output: segmented audio stream data
• Workflow (a minimal sketch follows):
1) Store the refined data in the audio buffer
2) Accumulate more than 3 s of data in the audio buffer
3) Segment the 3 s of data and use it to recognize emotion
4) Restore the remaining data in the audio buffer
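A minimal sketch of this buffering logic, assuming 16 kHz mono samples so that 3 s corresponds to 48,000 samples (the actual sampling rate is not stated on the slide):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal 3-second window segmenter over a growing audio buffer.
public class WindowSegmenter {
    static final int SAMPLE_RATE = 16_000;     // assumed sampling rate
    static final int WINDOW = 3 * SAMPLE_RATE; // 3 s of samples

    private final List<Short> buffer = new ArrayList<>();

    // Steps 1-2: store refined samples until at least 3 s are available.
    void append(short[] refined) {
        for (short s : refined) buffer.add(s);
    }

    // Steps 3-4: emit one 3 s segment, keeping the remainder in the buffer.
    short[] nextSegment() {
        if (buffer.size() < WINDOW) return null; // not enough data yet
        short[] segment = new short[WINDOW];
        for (int i = 0; i < WINDOW; i++) segment[i] = buffer.get(i);
        buffer.subList(0, WINDOW).clear();       // restore only the remainder
        return segment;
    }

    public static void main(String[] args) {
        WindowSegmenter seg = new WindowSegmenter();
        seg.append(new short[50_000]); // a little over 3 s of refined audio
        System.out.println(seg.nextSegment().length + " samples segmented"); // 48000
    }
}
```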

[Diagram: accumulate the stream in the audio buffer, segment 3 s of data for emotion recognition, and restore the remaining audio data in the buffer]

Feature Extraction / 131

• Feature extraction algorithm
• Input: segmented audio data
• Output: feature vector (72 features in total)
• Workflow (the aggregation step is sketched below):
1) Construct the frequency-domain signal using the Fast Fourier Transform (FFT)
2) Transform to the Mel-frequency scale
3) Compute 13 MFCCs (Mel-frequency cepstral coefficients)
4) Split into 1 s units and calculate the mean and standard deviation of each coefficient, excluding the first coefficient (24 features per second)
5) Combine the features of the 3 s window (72 features)
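A minimal sketch of steps 4-5, assuming the per-frame MFCCs have already been computed by some MFCC front end (not shown here); only coefficients 2-13 are aggregated, giving 24 features per second and 72 for the 3 s window.

```java
// Aggregates per-frame MFCCs into the 72-feature vector described above.
// mfccSeconds[s][frame][c]: 13 MFCCs per frame, grouped into 3 one-second units.
public class MfccAggregator {
    static double[] extract72(double[][][] mfccSeconds) {
        double[] features = new double[72];
        int k = 0;
        for (double[][] second : mfccSeconds) {        // 3 one-second units
            for (int c = 1; c < 13; c++) {             // skip the first coefficient
                double mean = 0;
                for (double[] frame : second) mean += frame[c];
                mean /= second.length;
                double var = 0;
                for (double[] frame : second) var += Math.pow(frame[c] - mean, 2);
                features[k++] = mean;                           // 12 means per second
                features[k++] = Math.sqrt(var / second.length); // 12 std devs per second
            }
        }
        return features; // 3 s x 24 = 72 features
    }

    public static void main(String[] args) {
        double[][][] demo = new double[3][10][13]; // 3 s of 10 zero frames each
        System.out.println(extract72(demo).length); // 72
    }
}
```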

[Diagram: record the voice (the unit of recognition is 3 s), create the spectrum using the FFT, apply the Mel-scale filter bank, and extract the 72 features (means and standard deviations of MFCC2-MFCC12 and their deltas) from each time frame and each coefficient over the 3 s unit]

Classification / 132

• SVM (Support Vector Machine) • Input: feature vector • Output: emotion • Kernel: RBF kernel • Filter: standardize training data (a hedged Weka sketch follows)
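A minimal sketch of this configuration, assuming the Weka SMO implementation (which internally builds the pairwise one-vs-one models shown on the slide) with an RBF kernel and standardization; the ARFF path is a placeholder.

```java
import weka.classifiers.functions.SMO;
import weka.classifiers.functions.supportVector.RBFKernel;
import weka.core.Instances;
import weka.core.SelectedTag;
import weka.core.converters.ConverterUtils.DataSource;

// Minimal Weka SVM setup: RBF kernel plus training-data standardization.
public class EmotionSvm {
    public static void main(String[] args) throws Exception {
        Instances train = DataSource.read("emotion-features.arff"); // placeholder
        train.setClassIndex(train.numAttributes() - 1);

        SMO svm = new SMO();
        svm.setKernel(new RBFKernel());
        svm.setFilterType(new SelectedTag(SMO.FILTER_STANDARDIZE, SMO.TAGS_FILTER));
        svm.buildClassifier(train); // builds the one-vs-one pairwise models

        double label = svm.classifyInstance(train.instance(0));
        System.out.println(train.classAttribute().value((int) label));
    }
}
```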

[Diagram: the 72-feature vector is fed to six pairwise one-vs-one SVM models (1 vs 2, 1 vs 3, 1 vs 4, 2 vs 3, 2 vs 4, 3 vs 4) whose outputs are combined by majority voting. Training phase: find the optimal separating hyperplane that has the largest margin. Recognition phase: recognize the emotion based on majority voting]

Flow Chart / 133

1. Gather audio raw sensory data from the Sensory Data Router module
2. Store it in the audio buffer to collect sufficient data
3. Remove silence from the audio
4. Check the audio buffer size; if the buffer holds more than 3 s, begin the recognition process, otherwise return to gathering audio data
5. Segment the data into 3 s units
6. If more than 3 s of segmented data remains, store the remainder in the audio buffer; otherwise begin feature extraction
7. Extract features based on time-domain, statistical, and frequency features
8. Classify with the SVM
9. Transfer the emotion label to the DCL and HCL servers

End

Contributions / 134

• Proposing a methodology for speech-based emotion recognition
• Creation of an emotion set based on the requirements of the services to be delivered
• Collection of a real-world emotional speech dataset
• Offline evaluation and validation of various emotion recognition models
• Online validation

Plan for MM v3.0 – Emotion Recognition (Jaehun) / 135

Emotion Unifier

[Diagram: audio and video emotion recognizers (each with Input Adapter, Preprocessing, Segmentation, Feature Extraction, Classification, and Output Adapter stages) fed by the Sensory Data Router; their outputs, e.g., (Boredom, user_9876, 11:05:12), (Anger, user_9876, 11:05:13), (Boredom, user_9876, 11:05:14), are fused into a single emotion]

• Input
 Video phone call: video and audio
• Output
 4-emotion accuracy (TBD)
• Implementation modules
 Audio emotion recognition
 Video emotion recognition
 Emotion fusion

Location Detection

Introduction / 137

• The identification of people's location is of much interest to support diverse types of services • Location-based services have been widely explored in recent years, mainly exploited in social networks such as Foursquare, TripAdvisor, or Twitter, among many others • Location detection techniques can essentially be categorized into outdoor or geopositioning methods and indoor positioning approaches

Problem Statement / 138

• Motivation
o The user's location is a primary source of information for inferring context
o Tracking people's locations can also serve to identify behaviors and routines
• Goal
o Define mechanisms to seamlessly identify and register the user location using technologies readily available to every user
• Objectives
o Automatic identification of the user's location
o Creation of personalized maps with general and user-centric points of interest

Challenges / 139

• Geolocation mechanisms are subject to GPS signal availability; thus, no detection is possible in places without coverage (e.g., indoor spaces) • Diverse map APIs exist; however, they present limitations for use in some particular countries (e.g., Google Maps in Korea) • The very same location may have different meanings for different users (e.g., "Peter's fitness center" is in the same building where "David's favourite restaurant" is located)

Architecture / 140

Geopositioning Location Detector

Output Adapter

GPS Tracking

Feature Extraction

Segmentation

Preprocessing

Input Adapter

Workflow / 141

[Workflow diagram: the End User Application sends sensory data (audio, GPS, ACC, GYRO, MAG; Time: 2015-05-14 14:00:00; Device ID: 123456789) to the Data Curation Layer; the Sensory Data Router forwards the GPS track, e.g., (37.246968, 127.078464, 40), (37.243551, 127.080653, 41), (37.240989, 127.084215, 40), together with the user's profile to the Input Adapter of the Geopositioning Location Detector; the recognized label (HOME) flows through the Output Adapter to the Location Unifier and Location Notifier and back to the Data Curation Layer as (Location label: HOME; Time: 2015-05-14 14:00:00; Device ID: 123456789)]

Location Detection – Progress / 142

• Implementation of the user-centered location registration module • Implementation of classification of general POIs (e.g., "City Hall"), user-centered POIs (e.g., "home"), and frequency POIs (e.g., "restaurants") • Visualization of registered locations

Designed Location Detector Flowchart / 143

1. Initialize the current location (CurLoc) and registered location (RegLoc) variables
2. Gather user GPS data from the Sensory Data Router in LLCA
3. Load the registered locations from the DCL server and save them in the RegLoc variable
4. Calculate the distance between CurLoc and RegLoc (a minimal sketch follows)
5. If the distance is < 10 m, CurLoc is set to RegLoc
6. If the distance is > 10 m, CurLoc is set to "OutSide"
7. Transfer the extracted location to the DCL server and HLCA

End
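The distance computation is not detailed on the slide; the following is a minimal sketch using the haversine formula, one common choice for GPS coordinates (the 10 m threshold comes from the flowchart).

```java
// Minimal haversine-distance check against the 10 m threshold from the flowchart.
public class LocationMatcher {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    static double haversineMeters(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    public static void main(String[] args) {
        // A registered location vs. a nearby GPS fix from the workflow example.
        double d = haversineMeters(37.246968, 127.078464, 37.246910, 127.078500);
        System.out.println(d < 10 ? "CurLoc = RegLoc" : "CurLoc = OutSide");
    }
}
```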

Conclusion / 144

• The identification of the user's location is of primary necessity in order to properly identify the user context • Location tracking can also be used to better determine user behavior and routines • Both user-centric and generic maps are considered to personalize the information acquired from the user location

High-level Context Awareness

High Level Context Awareness – Overview / 146

Low Level Context Awareness, High Level Context Awareness, DCL

[Diagram: low-level activity, location, and emotion context labels plus metadata (e.g., Happy, Sitting, Office) are combined into high-level physical activity contexts (e.g., Office Work, Dinner, Exercising, Outdoor Running, Evening Shopping, Meeting, Mid-Day Happiness) and stored in the lifelog repository]

High Level Context Awareness – Overview / 147

Low Level Context Awareness, High Level Context Awareness, DCL
• High Level Context Reasoner
• New rules added to the ontology

[Diagram: low-level activity, location, emotion, and food context labels plus metadata (e.g., Happy, Eating, Restaurant, Salmon) are combined into high-level nutrition contexts (e.g., Nutrition, Protein, Dinner) and stored in the lifelog repository]

Motivation / 148

• Abstract description of the user's context • Extraction of high-level context from low-level context for better understandability of the user's context • Identification of high-level context for decision making by upper layers: • Personalized recommendations • Behavior modeling • Personalized predictions

HLCA Architecture / 149

Execution Workflow – Conceptual View / 150

Low Level Context, High Level Context Builder

[Diagram: low-level contexts with metadata (Activity: Sitting, Emotion: Boredom, Location: Office) pass through (1) the Context Synchronizer, (2) the Context Instantiator, and (3) the Context Mapper]

Execution Workflow – Conceptual View / 151

High Level Context Reasoner

[Diagram: (4) Context Verifier, (5) Context Classifier, (6) High-Level Context Notifier, which notifies the Data Curation Layer]

Execution Workflow – Component Level: Context Ontology Engineering & Persistence / 152

[Component diagram: High Level Context-Awareness]

1. Ontology engineers model the context ontology (LoadOntology)
2. The ontology is loaded and an Ont Model is created for storage as triple storage (CreateOntModel)
3. The Context Ontology is stored in Jena TDB (Context Ontology Storage)

A minimal sketch of steps 2-3 follows.
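A minimal sketch of loading the context ontology into a TDB-backed model, assuming Apache Jena 3.x (the directory and ontology file paths are placeholders):

```java
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.query.Dataset;
import org.apache.jena.query.ReadWrite;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.tdb.TDBFactory;

// Minimal sketch: load the context ontology and persist it in Jena TDB.
public class OntologyPersistence {
    public static void main(String[] args) {
        Dataset dataset = TDBFactory.createDataset("tdb/context-ontology"); // placeholder dir
        dataset.begin(ReadWrite.WRITE);
        try {
            Model base = dataset.getDefaultModel();
            base.read("mining-minds-context.owl"); // placeholder ontology file

            // Wrap the persisted triples as an OWL ontology model for later reasoning.
            OntModel ontModel = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM, base);
            System.out.println("Classes loaded: " + ontModel.listClasses().toList().size());
            dataset.commit();
        } finally {
            dataset.end();
        }
    }
}
```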

LLCA: Activity Recognizer, Location Detector, Emotion Recognizer

Execution Workflow – Component Level: LLC Instance Mapping & Persistence / 153

[Component diagram: High Level Context-Awareness]

4. Low-level contexts are notified to HLCA and received by the ReceiveLLC function
5. ReceiveLLC forwards the low-level contexts to MapLLC
6.-7. The ontology model is retrieved for mapping and transformation
8. MapLLC transforms the LLCA data into ontological format and sends it to ReceiveMappedLLC
9.-10. The transformed LLCA data is stored into the triple storage

LLCA: Activity Recognizer, Location Detector, Emotion Recognizer

Execution Workflow – Component Level: Context Synchronization & Unclassified HLC / 154

[Component diagram: High Level Context-Awareness]

11. The Context Synchronizer receives the trigger LLC instance and its concurrent ones, retrieving the LLC instances that start and end within a time window
12.-13. The concurrent LLC instances are retrieved from the triple storage
14.-16. An unclassified HLC instance containing the LLC instances is generated, and the new unclassified HLC is notified to the High Level Context Reasoner

Execution Workflow – Component Level: HL Context Reasoning / 155

[Component diagram: High Level Context-Awareness]

17. The unclassified high-level context instance is received
18.-19. The ontology model is retrieved and served to the InferHLC function for reasoning
20.-21. The consistency of the unclassified high-level context instance is verified against the ontology, and the valid instance is served to the InferHLC function
22.-23. The HLC instance is classified using the Pellet reasoner

Execution Workflow – Component Level: HL Context Notification / 156

[Component diagram: High Level Context-Awareness]

24.-26. The previous classified high-level context is retrieved and an end time is added to the previous high-level context instance
27.-28. Both instances are provided to the NotifyNewHLC function, and the new high-level context is provided to the NotifyDCL function

Execution Workflow – Component Level: HL Context Notification / 157

[Component diagram: High Level Context-Awareness]

29. The newly recognized high-level context is communicated to the Data Curation Layer for storage into the LifeLog
30.-31. The high-level context is stored into the Context Ontology Storage

Contributions / 158

• Engineering of the Mining Minds Context Ontology for context definition • Design and implementation of a methodology for high-level context recognition • Identification of the issues associated with low-level and high-level context synchronization • Reasoning to derive high-level context from low-level context

Mining Minds V3.0 – Future Work / 159

Future Work (MM 3.0)
• Enrichment of the user behavior description by inclusion of SWRL rules
• Handling of multiple activity recognitions based on inputs from LLCA
• Inclusion of Diabetic Ontology concepts (based on requirements)
• Water intake monitoring services

Uniqueness & Contributions

Uniqueness and Contributions (1/4) / 161

• Comprehensive low-level context awareness in real time
• Real-time low-level context recognition service
• Activity recognition: 11 activities
• Emotion recognition: 4 emotions
• Location detection: 5 locations
• Simultaneous recognition of 4 kinds of low-level context

• Context fusion for high accuracy
• An appropriate mixed methodology with weighting and majority voting

Uniqueness and Contributions (2/4) / 162

• Comprehensive high-level context awareness with nutrition information
• The high-level context awareness engine recognizes more specific context by considering 4 kinds of information:
• Physical activities
• Emotions
• Nutrition
• Location

Evaluation

1. Inertial based AR Evaluation / 164

• Devices
— Smartphone: Samsung Galaxy S5
— Wearable: LG G Watch R

• Testing Model
— Utilized Weka classification
— SVM, KNN (k = 3), KNN (k = 5), Decision Tree
— 10-fold cross-validation
— Window size: 3 s

1. Inertial based AR Evaluation - Experiment Result / 165

• Smartphone & Smartwatch
• Position of devices: smartphone in the trouser front pocket, smartwatch on the right wrist

• Evaluation result: average accuracy of 79%

Confusion matrix (%, rows = actual class, columns = predicted class; values rounded):

            Stairs Cycling Eating Jumping LyingDown Running Sitting Standing Stretching Sweeping Walking
Stairs       62.8    0      0      0       0         0       0       0        0          0       37.2
Cycling       0     88.6    0      0       0         0       0      11.4      0          0        0
Eating        0      0     81.1    0       0         0      17.3     1.2      0          0.4      0
Jumping      69.8    0      0     29.2     0.9       0       0       0        0          0        0
LyingDown     0      0      0.9    0      80.9       0      17.4     0.9      0          0        0
Running       0      0      0      0       0       100       0       0        0          0        0
Sitting       1.7    0      5.9    0.1     0.4       0      84.3     2.3      0.1        0.9      4.1
Standing      0.4    0      0.2    0       0         0      10.9    77.5      4.7        4.7      1.6
Stretching    0.5    0      0      0       0         0       0.5     0.5     93.4        4.9      0
Sweeping      2.8    0      0      0       0         0       0       0.9     11.6       81.5      3.2
Walking       0.7    0      0      0       0         0.1     0.6     0        0.1        0       98.5

2. Video based AR - Dataset collection / 166

Scenario:
• Collect daily indoor activities: stretching, sweeping, sitting (reading a book), sitting (calling), lying, standing (watching a movie), standing (calling), eating
• Number of recruited subjects: 10 (age: 21-30, height: 1.65-1.75 m)
Device:
• Laptop (Windows 8.1, USB 3.0)
• MS Kinect device
• Television
Output data:
• Body frames: 30 frames/second

2. Video based AR - Evaluation results / 167

• Benchmark the algorithm with different types and numbers of features • Feature types: joint distance, angle, and their combination • Number of used features • With fewer than 6 features, angle performs better than distance • Achieves 99% accuracy with 10 features using the distance metric • Higher accuracy comes at higher computational cost

2. Video based AR - Evaluation results / 168

• To preserve smooth transmission from the Kinect to the cloud, the sampling rate should be reduced • Benchmark the algorithm with various sampling rates, for example 30 fps (default), 15 fps, 10 fps, 6 fps, and 5 fps • Overall accuracy generally decreases with the sampling rate • To achieve a good trade-off between accuracy, practicality of data transmission, and computational cost, the algorithm is deployed with a 5 fps sampling rate and 30 selected distance features

3. Audio based ER - Data Collection / 169

• Data Collection
• Scenario: phone call
• Acted emotions: users try to simulate emotions by reading scripts
• 7 emotions: surprise, anger, sadness, neutral, boredom, fear, happiness
• Voice recorded in various situations: calling while sitting, standing, or walking, and calling in a cafeteria or an office

Number of emotions: 7
Number of users: 10
Language: Korean
Duration: 1 min/emotion/person
Device: Galaxy S5

3. Audio based ER - Evaluation Result / 170

• Evaluation method: 10-fold cross-validation
• Classifier: SVM
• Evaluation tool: Weka

7 emotions, 44.1176% accuracy (rows = actual, columns = predicted):
          Angry Bored Fear Happy Normal Sad Surprise
Angry       46    1    7    2     2     3    6
Bored        1   30    4    1     1     9    1
Fear        10   12   27    5     2     5    1
Happy       12    3    5   31     1     2    7
Normal       3    4    8    3    15     9    1
Sad          0   22    6    0     6     8    1
Surprise    31    1    5    3     3     0    8

4 emotions, 70.0935% accuracy (rows = actual, columns = predicted):
          Angry Happy Normal Sad
Angry       52    6     6    3
Happy       13   42     2    4
Normal       4    3    27    9
Sad          2    0    12   29

Summary / 171

• ICL focuses on recognizing the comprehensive context of users based on the multimodal data sources, in a structured and hierarchical way.

• At the lower part of the hierarchy, ICL tries to recognize low level context of user from raw multimodal big data source with diverse methodologies.

• The introduction of plug-and-play concepts for the low-level recognizers makes ICL scalable in the number of contexts to be recognized from the multimodal raw data.

• At the higher part of the hierarchy, ICL tries to infer the high-level user context based on the low-level context it generates, with the help of Mining Minds Ontology and inference mechanism.

• ICL is implemented in a completely distributed way at the single-user and single-recognizer level, based on a modern Java distributed computing framework. This makes ICL scalable in IoT and big data environments.

• Each recognizer of ICL has been evaluated thoroughly for performance and accuracy, via offline cross-validation and online real-time tests.

Knowledge Curation Layer (KCL)

Data Driven and Expert Driven Knowledge Acquisition

Data-Driven Knowledge Acquisition

Overview / 174

• Data-driven is a powerful approach, which
• processes existing data and then creates new data from it
• requires less up-front knowledge, but a lot more back-end computation and experimentation
• improves business and society through data
• Machine learning follows the data-driven approach
• It extracts hidden knowledge from data using different computerized algorithms
• So the data should
• be right, at a useful scale and in a useful format, and include meaningful features
• No quality data, no quality mining results!

http://www.cad.zju.edu.cn/home/zhx/csmath/lib/exe/fetch.php?media=2013:csmath-01-data-driven.pdf Motivations / 175

• Real-world data is often incomplete, inconsistent, and/or lacking in certain behaviors or trends
• Data selection and preprocessing are needed to yield quality mining results
• There exist many machine learning (ML) methods that can be utilized to build a classification model for a large amount of data
• An appropriate ML method for learning the classification model should be selected automatically
• A classification model can auto-generate recommendations for different services
• Building a classification model to generate production rules

Scope of Data Driven Knowledge Acquisition / 176

[Diagram: scope of knowledge acquisition. Expert-driven branch: guidelines and structured/unstructured knowledge resources, a CNL-based knowledge acquisition editor, and knowledge maintenance (MCRDR) with a cornerstone case base and rule base. Data-driven branch: algorithm selection, case base, probabilistic model, and descriptive/classification models. The scope addressed here is rule creation from domain data]

Data-Driven Knowledge Acquisition / 177

[Diagram: Data-Driven Knowledge Acquisition. The user profile and lifelog (structured data) feed the Feature Modeling Toolkit (Feature Model Manager) to produce a feature model; appropriate algorithm selection (e.g., J48 for wellness) drives model creation, yielding a classification model. Expert-Driven Knowledge Acquisition sits alongside, with a Guidelines Manager]

Conceptual View / 178

Component Architecture / 179

[Component architecture diagram: Knowledge Creation & Evolution (Data-Driven). The Data Curation Layer feeds the Feature Model Manager (Schema Loader, Feature Selection, Query Configuration) and the Preprocessor (Data Loader, Missing Value Handler, Outlier Handler, Transformation) over intermediate and preprocessed data. The Algorithm Selector (meta-feature computation, algorithm selection) and Model Learner (rule learning, probabilistic modeling, case authoring) build on the Dynamic Algorithm Selection Model Creator (meta-feature computation and evaluation of ML algorithms over archived datasets, meta-feature/algorithm alignment, training dataset creation, and algorithm selection model creation). The Model Translator writes to the Knowledgebase (rule base, case base, classification and probabilistic models)]

Execution Workflow (1/6) / 180

Feature Model Manager
• Retrieves the schema object (JSON format) from DCL
• Parses the object to display the schema in a tree structure
• The domain expert selects features and writes the conditions of the features

• Creates the query object (JSON format) based on the selected features and their conditions and forwards the query to DCL

Execution Workflow (2/6) / 181

Preprocessor
• Retrieves the data object (JSON format) from DCL
• After parsing, converts the object into a CSV file (original data)
• Identifies the missing values (i.e., 0) in the original data and replaces them with attribute mean values
• Detects outliers and replaces them with attribute mean values
• Discretizes the data for high-level concept abstraction
• Filters the data to improve classification accuracy
• Persists the processed data (a hedged Weka sketch follows)
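A minimal sketch of the imputation and discretization steps, assuming Weka's standard unsupervised filters (the CSV path is a placeholder; note that ReplaceMissingValues expects missing entries marked as "?", so zeros encoding missing values would first need converting):

```java
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.Discretize;
import weka.filters.unsupervised.attribute.ReplaceMissingValues;

// Minimal preprocessing sketch: mean imputation, then discretization.
public class Preprocessing {
    public static void main(String[] args) throws Exception {
        Instances raw = DataSource.read("lifelog-original.csv"); // placeholder

        // Replace missing values with the attribute mean (mode for nominals).
        ReplaceMissingValues impute = new ReplaceMissingValues();
        impute.setInputFormat(raw);
        Instances imputed = Filter.useFilter(raw, impute);

        // Discretize numeric attributes into bins for high-level concepts
        // (e.g., CaloriesBurned/Day into Low / Normal / High).
        Discretize discretize = new Discretize();
        discretize.setBins(3);
        discretize.setInputFormat(imputed);
        Instances processed = Filter.useFilter(imputed, discretize);

        System.out.println(processed.numInstances() + " instances preprocessed");
    }
}
```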

Execution Workflow (3/6) / 182

Dynamic Algorithm Selection Model Creator
• Extracts basic and advanced statistical meta-features and information-theoretic features of the classification dataset
• Selects the appropriate algorithm after computing balanced accuracy and performing a non-parametric test over the Decision Tree (DT) algorithms
• Creates a DT-based model for algorithm selection

Execution Workflow (4/6) / 183

Algorithm Selector
• Extracts the meta-features of the new problem dataset
• Loads and applies the DT-based model to the new problem dataset
• The DT-based model recommends the appropriate DT algorithm for the new problem dataset

Model Execution Workflow (5/6) / 184

Model Learner (step 5):
• Loads the processed data and the selected DT algorithm, then learns the model
• Generates the classification model (a decision tree)
• Persists the classification model

Model Execution Workflow (6/6) / 185

Model Translator (step 6):
• Loads the classification model (the decision tree)
• Transforms the decision tree into rules so that their conformance can be confirmed by the domain expert

A small sketch of learning and printing such a tree is shown below.
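A minimal sketch under the assumption that the learned model is a Weka J48 tree (the file name is illustrative): the tree is built and printed, and each root-to-leaf path of the printout corresponds to one IF-THEN production rule presented to the expert; the path-walking itself is omitted.

import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ModelToRules {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("processed-data.arff"); // illustrative path
        data.setClassIndex(data.numAttributes() - 1);

        J48 tree = new J48();
        tree.buildClassifier(data);

        // Every root-to-leaf path in this printout is one candidate
        // "IF <conditions> THEN <class>" rule for expert conformance.
        System.out.println(tree);
    }
}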

Example Scenario - Workflow / 186

Discretization of CaloriesBurned/Day via the Feature Modeling Toolkit:
  > 1650      → High
  1600 – 1650 → Normal
  < 1600      → Low

Original data (excerpt):

User-ID | Age | Gender | BMI  | WeightStatus | CaloriesBurned/Day | Recommendation
1       | 34  | M      | 26.5 | Overweight   | 1250               | ModerateActivity
2       | 22  | M      | 22.8 | Normal       | 1620               | LightActivity
3       | 18  | M      | 24.3 | Normal       | 1600               | ModerateActivity
4       | 34  | F      |      | Normal       | 1630               | ModerateActivity
5       | 65  | F      | 33.9 | Obese        | 500                | HeavyActivity
…       | …   | …      | …    | …            | …                  | …
19      | 19  | M      | 29.0 | Overweight   | 1400               | HeavyActivity
20      | 65  | M      | 24.5 | Normal       | 1650               | ModerateActivity

Processed data after feature selection (excerpt):

Age | Gender | WeightStatus | Recommendation
34  | M      | Overweight   | ModerateActivity
22  | M      | Normal       | LightActivity
18  | M      | Normal       | ModerateActivity
…   | …      | …            | …

The meta-features of the processed dataset (Relative_Mean, Entropy, Root_Mean_Square, …) drive the algorithm selection model, which recommends J48; the learned classification model is then translated into production rules such as:

  IF (WeightStatus = Overweight) THEN Recommendation = HeavyActivity

The calories discretization is sketched in code below.
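The CaloriesBurned/Day thresholds above map directly onto a small helper (a sketch; the class and method names are ours):

class FeatureDiscretizer {
    /** Maps daily burned calories onto the scenario's high-level concepts. */
    static String caloriesLevel(double caloriesPerDay) {
        if (caloriesPerDay > 1650) return "High";
        if (caloriesPerDay >= 1600) return "Normal"; // 1600 - 1650
        return "Low";                                // < 1600
    }
}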

Algorithm Selection / 187

• Algorithm selection is the process of choosing an appropriate algorithm for learning a given dataset
• Two ways of selecting an appropriate algorithm:
   Empirical analysis (cross-validation over all candidate algorithms)
   Automatic recommendation (meta-learning)

A cross-validation sketch of the empirical route follows.
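The empirical route can be sketched with Weka directly (assuming Weka on the classpath; the candidate list and file name are illustrative): each candidate is cross-validated and the best weighted-average F-score wins.

import java.util.Random;
import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.classifiers.trees.REPTree;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EmpiricalSelection {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("dataset.arff");
        data.setClassIndex(data.numAttributes() - 1);

        Classifier[] candidates = { new J48(), new REPTree(), new RandomForest() };
        String best = null;
        double bestF = -1;
        for (Classifier c : candidates) {
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(c, data, 10, new Random(1)); // 10-fold CV
            double f = eval.weightedFMeasure();                  // Wgt.Avg.F-score
            if (f > bestF) { bestF = f; best = c.getClass().getSimpleName(); }
        }
        System.out.println("Empirical winner: " + best + " (F = " + bestF + ")");
    }
}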

Automatic Algorithm Selection - workflow / 188

Offline Phase 1 – creation of the Algorithm Selection Model:
  1a. Compute the meta-features of the archived datasets
  1b. Evaluate the ML algorithms (A1, A2, …, An) on those datasets
  1c. Align the meta-features with the winning algorithm to form the Algorithm Selection Training Dataset
  1d. Create the Algorithm Selection Model

Online Phase 2 – selection of the appropriate algorithm for a new dataset:
  2a. Compute the meta-features of the new (preprocessed) dataset
  2b. Select the appropriate algorithm using the Algorithm Selection Model, and hand it to the Model Learner

Use-case diagram of automatic algorithm selection / 189

• New dataset
   Meta-feature extraction
   Reasoning for recommendation of an algorithm
• Meta-feature extraction
   Statistical
   Information theory
   Landmarking
• Model creation
   Case-base model for CBR

Use cases: 1. Meta-feature Extraction · 2. Performance Evaluation of DT Algorithm · 3. Model Creation (Case-base Creation) · 4. Recommend Classification Algorithm · 5. Integration with Data-Driven

1. Meta-Features Extraction / 190 Use Case

Use-case (KCL2-SUC-07): Extract meta-features of a classification dataset
• Objective
   Extraction of meta-characteristics
• Methodology
   Library used: the OpenML dataset characterization library
   Meta-features computed: basic and advanced statistical, information theory
• Dataset meta-features computation (operation)
  1. Input: UCI and OpenML classification datasets
  2. Processing:
     a. Takes each dataset from the local copy of the datasets
     b. Extracts basic and advanced statistical meta-features and information theory features
     c. Stores the extracted features in the meta-feature base
  3. Output: a set of meta-features

1. Meta-Features Extraction / 191 Use Case – Feature Functions

• Loading and validating datasets into the Meta-Features Extractor interface
• Implementing the algorithms for statistical and information theory meta-features
• Persistence of the extracted features in the meta-features base
• Integrating the meta-feature extractor into the data-driven knowledge acquisition interface

1. Meta-features Extraction (One Dataset) / 192

Offline Phase 1a – dataset characterization. Example statistical meta-features:

Mean Absolute Deviation
• The sum of the absolute differences between data values and the mean, divided by the count:
  [ |x1 - mean| + |x2 - mean| + ... + |xn - mean| ] / n

Skewness
• The sum of the cubed differences between data values and the mean, divided by the count minus 1 times the cubed standard deviation:
  [ (x1 - mean)^3 + (x2 - mean)^3 + ... + (xn - mean)^3 ] / [ (n - 1) * s^3 ]

Kurtosis
• The sum of the fourth powers of the differences between data values and the mean, divided by the count minus 1 times the fourth power of the standard deviation:
  [ (x1 - mean)^4 + (x2 - mean)^4 + ... + (xn - mean)^4 ] / [ (n - 1) * s^4 ]

Standard Deviation (s)
• The square root of the variance: s = √variance, i.e. variance = s^2

Root Mean Square (RMS)
• The square root of the mean of the squared data values: √[ (x1^2 + x2^2 + ... + xn^2) / n ]

A direct Java rendering of these formulas follows.
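The formulas above translate line-for-line into Java (a sketch; assumes each attribute's values are in a double[] with n > 1):

class BasicStats {
    static double mean(double[] x) { double s = 0; for (double v : x) s += v; return s / x.length; }

    static double stdDev(double[] x) {            // s = sqrt( sum (xi - mean)^2 / (n - 1) )
        double m = mean(x), ss = 0;
        for (double v : x) ss += (v - m) * (v - m);
        return Math.sqrt(ss / (x.length - 1));
    }

    static double meanAbsoluteDeviation(double[] x) {
        double m = mean(x), s = 0;
        for (double v : x) s += Math.abs(v - m);
        return s / x.length;
    }

    static double skewness(double[] x) {          // sum (xi - mean)^3 / ((n - 1) * s^3)
        double m = mean(x), s = stdDev(x), sum = 0;
        for (double v : x) sum += Math.pow(v - m, 3);
        return sum / ((x.length - 1) * Math.pow(s, 3));
    }

    static double kurtosis(double[] x) {          // sum (xi - mean)^4 / ((n - 1) * s^4)
        double m = mean(x), s = stdDev(x), sum = 0;
        for (double v : x) sum += Math.pow(v - m, 4);
        return sum / ((x.length - 1) * Math.pow(s, 4));
    }

    static double rootMeanSquare(double[] x) {    // sqrt( sum xi^2 / n )
        double s = 0;
        for (double v : x) s += v * v;
        return Math.sqrt(s / x.length);
    }
}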

1. Meta-features Extraction (For All Datasets) / 193

• Types of meta-features (computed with the OpenML dataset characterization library; mf1 … mf29):
   Basic statistical (13)
   Advanced statistical (11)
   Information theory (5)
   Complexity characteristics
   Landmarking features
   Model-based meta-features
• Types of datasets
   Classification datasets (100, from UCI and OpenML)
   Clustering datasets

Resulting meta-feature matrix (excerpt; -1 marks a non-applicable feature):

Dataset | MeanAbsoluteDeviation | MeanStdDev (s) | Skewness | … | ClassEntropy | MeanMutualInformation
d1      | -1                    | -1             | -1       | … | 0.991231     | -1
d2      | -1                    | -1             | -1       | … | 0.791645     | -1
…       | …                     | …              | …        | … | …            | …
d100    | 2.070336              | 2.3125         | -23.5571 | … | -1           | -1

# Experiments for meta-features computation = n (no. of datasets) × m (no. of meta-features)

2. Performance Evaluation of Classification Algorithm / 194 Use Case

Use-case (KCL2-SUC-08): Evaluate decision tree classification algorithms
• Objective
   To find the most appropriate DT algorithm for the datasets
• Methodology
   Libraries used: Weka, Excel Statistician Tool
   Evaluation of the DT classification algorithms, with Wgt.Avg.F-score and Stdev as evaluation metrics and a significance test for the winner
• ML algorithm evaluation (operation)
  1. Input: UCI and OpenML classification datasets, Weka DT algorithms
  2. Processing:
     a. Takes each dataset from the local copy of the archived datasets
     b. Uses the Weka Experimenter to test the DT algorithms on the loaded dataset
     c. Obtains all the evaluation metrics
     d. Computes Wgt.Avg.F-score and Stdev
     e. Performs a statistical significance test for the significantly better DT algorithm
     f. Selects the appropriate algorithm
     g. Aligns the selected algorithm with the meta-features of that dataset
     h. Stores the appropriate algorithm as the class label
  3. Output: the appropriate algorithm

2. Performance Evaluation of Classification Algorithm / 195 Use Case – Feature Functions

1. Performing 10-fold CV on each dataset using all DT algorithms in the Weka Experimenter
2. Persistence of the evaluation results in CSV file format
3. Definition and computation of the multi-objective evaluation criteria
4. Performing the statistical significance test
5. Comparing the results based on the significance-test results
6. Selecting the appropriate algorithm as the class label

2. Performance Evaluation / 196

Offline Phase 1b – ML algorithm evaluation:
• Performance measurement P: each dataset d's performance is measured on 9 Weka decision tree algorithms using 10-fold CV
• Performance evaluation criteria E: Wgt.Avg.F-score and consistency (Stdev)
• Evaluation method (multiple comparison):
   Sort P in descending order
   TopAlg := the algorithm with the maximum Wgt.Avg.F-score, if a unique such algorithm exists
   Else TopAlg := the algorithm with the maximum Wgt.Avg.F-score and the minimum Stdev

A small sketch of this selection rule follows.
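The TopAlg rule reduces to one comparator over the per-dataset evaluation records (a sketch using a Java 16+ record; the names are ours):

import java.util.Comparator;
import java.util.List;

/** One (algorithm, Wgt.Avg.F-score, Stdev) evaluation record for a dataset. */
record AlgResult(String algorithm, double wgtAvgFScore, double stdev) {}

class TopAlg {
    /** Maximum Wgt.Avg.F-score; ties broken by minimum Stdev (the consistency criterion). */
    static String select(List<AlgResult> results) {
        return results.stream()
            .max(Comparator.comparingDouble(AlgResult::wgtAvgFScore)
                           .thenComparing(Comparator.comparingDouble(AlgResult::stdev).reversed()))
            .map(AlgResult::algorithm)
            .orElseThrow();
    }
}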

2. Evaluation of Decision Tree Algorithms (All Datasets) / 197

• Types of algorithms:
   Decision tree-based (9, used here)
   Rule-based
   Probabilistic
   Distance-based
   Meta-learning
   Function-based
• Evaluation criteria: accuracy, Wgt.Avg.F-score, Stdev, balanced accuracy, precision, error rate, time complexity
• Statistical significance tests: parametric, non-parametric

The candidate algorithms are run on each dataset in the Weka environment, the results are compared and evaluated, and the winning algorithm is aligned with the dataset's meta-features (excerpt; cells are Wgt.Avg.F-score (Stdev)):

Dataset  | J48        | REPTree    | RandForest | … | CART       | Id3        | Winner
d1       | 0.86(0.02) | 0.71(0.01) | 0.82(0.04) | … | 0.61(0.01) | 0.82(0.00) | J48
d2       | 0.56       | 0.73       | 0.52       | … | 0.61       | 0.62       | REPTree
…        | …          | …          | …          | … | …          | …          | …
d(n-1)   | 0.85       | 0.71       | 0.89       | … | 0.88       | 0.82       | RandForest
d100     | 0.96       | 0.91       | 0.92       | … | 0.88       | 0.82       | J48

# Experiments for evaluating the algorithms = n (no. of datasets) × m (no. of algorithms)

3. Algorithm Selection Model Creation: Case-base Creation / 198 Use Case

Use-case (KCL2-SUC-09): Create Algorithm Selection Model (Algorithm Selection Case Base)
• Objective
   To create the case base for recommending the best algorithm
• Methodology
   Libraries used: jColibri, Excel Statistician Tool
• Algorithm Selection Case Base creation (operation)
  1. Input: Algorithm Selection Training Dataset, similarity functions
  2. Processing:
     a. Defines the case base structure in jColibri
     b. Takes the Algorithm Selection Training Dataset and loads it into the case base structure in jColibri
     c. Assigns local similarity functions to each meta-feature
     d. Assigns the global similarity function to the case
     e. Saves the Algorithm Selection Case Base as a CBR model
  3. Output: Algorithm Selection Model (CBR case base)

3. Algorithm Selection Model Creation: Case-base Creation / 199 Use Case – Feature Functions

1. Design the case base structure
2. Load the training dataset into the case base structure
3. Assign local and global similarity functions
4. Persist the designed case base as the algorithm recommendation model

3. Algorithm Selection Model Creation: Case-base Creation / 200

Offline Phase 1d – model (case-base) creation:
1. Define a propositional (feature-vector) structure for case representation
2. Assign numeric data types to each feature f ∈ F
3. Compute the weight of each f and assign it (equal weights are used here)

4. CBR-based Algorithm Recommendation / 201 Use Case

Use-case (KCL2-SUC-11): Recommendation of the best algorithm
• Objective
   To select the best algorithm
• Methodology
   Library used: jColibri
• ML algorithm recommendation (operation)
  1. Input: new preprocessed dataset, CBR model
  2. Processing:
     a. Takes the preprocessed dataset from the data-driven environment
     b. Extracts meta-features through the meta-feature extractor
     c. Applies the Retrieve step of the CBR methodology
     d. Selects the top-k similar cases
     e. Selects the top case as the recommended algorithm
     f. Provides the recommended algorithm to the data-driven environment
     g. Retains the new case in the case base
  3. Output: the best algorithm

4. CBR-based Algorithm Recommendation / 202 Use Case – Feature Functions

1. Receiving the preprocessed dataset from the data-driven environment
2. Extracting meta-features
3. Preprocessing of the extracted features
4. Performing case-based reasoning and retrieving the top-k cases
5. Recommending the top algorithm to the data-driven environment
6. Retaining the new case in the case base (model integration)

4. CBR-based Algorithm Recommendation / 203

Online Phase 2 – algorithm recommendation using case-based reasoning:
  2a. New dataset characteristics: meta-features extraction and preprocessing (new case creation)
  2b. Retrieve similar cases from the Algorithm Selection Model (CBR case base) via local and global similarity matching
  2c. Reuse similar cases: recommend the algorithms of the top-k retrieved cases, with the top-ranked case as the recommendation
  2d. Revise and Retain: the revised (repaired) case is retained as a learned case in the case base

A sketch of the Retrieve step with weighted similarities is shown below.
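A minimal sketch of the Retrieve step (not the actual jColibri API; class and method names are ours): the global similarity of a query to each case is the mean of per-feature local similarities, and the case base is ranked to return the top-k.

import java.util.ArrayList;
import java.util.List;

/** A case: the meta-feature vector of a dataset plus the algorithm that won on it. */
class AlgCase {
    double[] metaFeatures;
    String bestAlgorithm;
    AlgCase(double[] mf, String alg) { metaFeatures = mf; bestAlgorithm = alg; }
}

class Retrieve {
    /** Local similarity per feature; assumes features normalized to [0, 1]. */
    static double local(double a, double b) { return 1.0 - Math.abs(a - b); }

    /** Global similarity: mean of local similarities (equal weights, as on slide 200). */
    static double global(double[] query, double[] c) {
        double s = 0;
        for (int i = 0; i < query.length; i++) s += local(query[i], c[i]);
        return s / query.length;
    }

    /** Retrieve step: rank the case base by global similarity and return the top-k. */
    static List<AlgCase> topK(double[] query, List<AlgCase> caseBase, int k) {
        List<AlgCase> ranked = new ArrayList<>(caseBase);
        ranked.sort((x, y) -> Double.compare(global(query, y.metaFeatures),
                                             global(query, x.metaFeatures)));
        return ranked.subList(0, Math.min(k, ranked.size()));
    }
}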

5. Algorithm Selection Model Integration / 204 Use Case – Feature Functions

Use-case (KCL2-SUC-10): Integrate Algorithm Selection Model
• Objective
   To select the best algorithm from the data-driven environment
• Methodology
   IDE used: NetBeans or Eclipse
• Algorithm Selection Model integration (operation)
  1. Input: Algorithm Selection Model
  2. Processing:
     a. Integrate the meta-feature extractor
     b. Integrate the CBR algorithm selection model
  3. Output: integrated Algorithm Selection Model

5. Algorithm Selection Model Integration / 205 Tasks Flow

(Java, NetBeans, etc.) The Algorithm Selector (reasoning process) combines the meta-features extractor, the algorithm selection model (CBR system), and the data-driven interface behind a single integration point (use-case KCL2-SUC-10, Model Integration).

Experiments and Results / 206

• Experimental setup
  • Datasets: 100 classification datasets from OpenML [18] and the UCI library [19] for case base creation
  • Tools and libraries: Weka [20], jColibri [21], OpenML
  • Environment: Windows PC, 3.3 GHz CPU, 8 GB RAM
  • Algorithms: 9 Weka DT classifiers
  • Performance metrics: Wgt.Avg.F-score, Stdev

Dataset characteristics (excerpt; columns: ID, DatasetName, Attributes, NominalAtts, NumericAtts, BinaryAtts, Classes, IncompInstances, Instances, MissingValues):

1   abalone               9   1  7  0  3  0    4177  0
2   abe_148               6   0  5  0  2  0    66    0
3   acute-inflammations   7   5  1  5  2  0    120   0
…
51  analcatdata_reviewer  9   8  0  0  2  367  379   1368
52  analcatdata_runshoes  11  6  4  5  2  14   60    14
…
100 car                   7   6  0  0  4  0    1728  0

The 9 Weka decision tree algorithms: trees.BFTree, trees.FT, trees.J48, trees.J48graft, trees.LADTree, trees.RandomForest, trees.RandomTree, trees.REPTree, trees.SimpleCart

The 29 meta-features (simple statistics, advanced statistics, information-theoretic): MeanSkewnessOfNumericAtts, MeanKurtosisOfNumericAtts, MeanStdDevOfNumericAtts, MeanMeansOfNumericAtts, NumAttributes, Dimensionality, PercentageOfBinaryAtts, PercentageOfNominalAtts, NumNominalAtts, PercentageOfNumericAtts, NumNumericAtts, NumBinaryAtts, ClassCount, NegativePercentage, PositivePercentage, DefaultAccuracy, IncompleteInstanceCount, InstanceCount, NumMissingValues, PercentageOfMissingValues, MeanNominalAttDistinctValues, StdvNominalAttDistinctValues, MinNominalAttDistinctValues, MaxNominalAttDistinctValues, ClassEntropy, MeanMutualInformation, NoiseToSignalRatio, MeanAttributeEntropy, EquivalentNumberOfAtts

Experiments and Results / 207

• Testing setup: 52 OpenML [18] / UCI [19] datasets held out for testing the model; same tools, environment, algorithms, and performance metrics as above
• For each test dataset, the position of the recommended classifier among the actual top classifiers was recorded (e.g. colleges-usnews: 2, collins: 1, contact-lenses: 9, contraceptive: 3, …)

Results analysis:
• For 3 datasets out of 52, the recommended algorithm was not in the list of the top-3 algorithms
• Overall accuracy (k=3): 49*100/52 ≈ 94%
• Overall accuracy (k=2): 38*100/52 ≈ 73%
• Similarly, for top (k=1): 30*100/52 ≈ 57.6%

Request-response Implementation / 208

Spring MVC request lifecycle representing the backend services (diagram after http://terasolunaorg.github.io/)

A minimal controller sketch is shown below.
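A minimal Spring MVC controller sketch for serving a recommendation request (the endpoint path, parameters, and payload are illustrative, not the actual Mining Minds service contract):

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical endpoint; the real MM service paths and payloads may differ.
@RestController
@RequestMapping("/scl")
public class RecommendationController {

    // Pull model: the user application explicitly asks for a recommendation.
    @GetMapping("/recommendation")
    public String recommend(@RequestParam("userId") long userId,
                            @RequestParam("serviceId") int serviceId) {
        // A real controller would delegate to the Service Orchestrator, which
        // gathers lifelog data, invokes the Recommendation Builder, and returns
        // the interpreted recommendation; a stub response stands in here.
        return "{ \"userId\": " + userId + ", \"recommendation\": \"...\" }";
    }
}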

Uniqueness and Contributions / 209

• Support for efficient real-world data preprocessing for optimal classification model creation
• Support for production rules generation from the classification model
• Use of multiple meta-features to exploit the intrinsic behavior of datasets, improving the performance of the algorithm selection model
• Use of a multi-metric objective function to evaluate algorithm performance and recommend the optimum classification algorithm
• Use of an accurate case-matching and retrieval function for correct recommendation of a classifier for a new dataset at run time

Evaluation - Classification Accuracy / 210

1. Selected dataset characteristics:

Title:                Pima Indians Diabetes Database
Number of instances:  768
Number of attributes: 9 (all numeric-valued)
Missing values:       Yes
Class distribution:   class value 1 is interpreted as "tested positive for diabetes"
                      class 0: 500 instances; class 1: 268 instances
URL: https://archive.ics.uci.edu/ml/machine-learning-databases/pima-indians-diabetes/

2. Impact of the preprocessing steps on classification accuracy (chart)

Evaluation of Decision Tree Algorithms (One Dataset) / 211

Comparison and evaluation of the results on the dataset: Random Forest (winner)

Dataset Comparison and Evaluation of Results (One Dataset) / 212

• Perform a statistical significance test (paired t-test) on the algorithm performance results; each algorithm either
  1. performs significantly better,
  2. performs equally likely (cannot decide), or
  3. performs significantly poorly
• Drop the algorithms that perform significantly poorly
• For the remaining (equally likely) algorithms, compute balanced accuracy:
  Balanced Accuracy = (Sensitivity + Specificity) / 2 = 0.5*TP/(TP+FN) + 0.5*TN/(TN+FP)
• Select the algorithm with the highest balanced accuracy, f(x) := max(BalancedAccuracy) – here Random Forest

The balanced-accuracy computation is sketched below.
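The balanced-accuracy formula in code (a sketch; the counts come from the confusion matrix):

class BalancedAccuracy {
    /** Balanced Accuracy = (Sensitivity + Specificity) / 2. */
    static double of(int tp, int fn, int tn, int fp) {
        double sensitivity = (double) tp / (tp + fn); // TP / (TP + FN)
        double specificity = (double) tn / (tn + fp); // TN / (TN + FP)
        return 0.5 * sensitivity + 0.5 * specificity;
    }
}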

Related Works / 213

Category            | Study                                            | Features                                                                                                                   | Limitations
Feature Modeling    | Sánchez [1], Vranić [2]                          | Addresses quality attributes at run time by means of feature models; context-based feature selection; used in the field of Software Product Lines (SPL) | Requires extra engineering effort at development time for designing and mapping the model
Data Preprocessing  | Kwiatkowska [3], Dimitriadis [4]                 | Prepared consistent and calibrated baseline datasets; handle noisy, highly variable data                                   | Knowledge-engineer dependency
Algorithm Selection | Smith [5], Ali and Smith [6], Song [7], Wang [8] | Finds the best classification algorithm; considered multiple datasets; considered multiple algorithms                      | Use a subset of meta-features; use single-metric evaluation criteria; use black-box learning techniques for model creation
Model Learning      | Dimitriadis [4], Bachman [9]                     | Extract new knowledge; create effective sets of decision rules; worked on the time, space, and medical domains             | Used fixed machine learning methods (ZeroR, NaiveBayes, J48, SVM); knowledge-engineer dependency

Selected Tools and Techniques / 214

• Language: Java
• Frameworks: Spring MVC, AJAX – jQuery
• APIs: Weka, OpenML
• IDEs: Eclipse
• Tools: Weka, RapidMiner

Summary / 215

• Data-Driven Knowledge Acquisition (DDKA) is an approach that curates knowledge using a data-driven process.
• DDKA acquires the data, preprocesses it, selects the appropriate algorithm, produces an accurate classification model, and generates production rules.

Expert-Driven Knowledge Acquisition

Overview / 217

Knowledge Authoring Tool
• An expert system needs an up-to-date knowledge base to provide the right recommendations at the right time.
• A domain knowledge base needs a user-friendly authoring environment to create and update the knowledge.
• The authoring environment facilitates the experts in evolving and maintaining the knowledge in the knowledge base.

Overview / 218

• Expert-driven knowledge acquisition helps in acquiring validated knowledge from experts' heuristics and experiences
• It reduces the dependency of the knowledge base on knowledge engineers
• The Intelligent Knowledge Authoring Tool (I-KAT) supports the domain expert in knowledge acquisition and knowledge maintenance

Advantages:
• Reduced dependency on knowledge engineers
• No need for intermediary communication
• Reduced cost
• Up-to-date knowledge

Disadvantages:
• Lack of a user-friendly environment that hides complexity from the domain expert
• Integration with legacy systems is difficult

Motivations / 219

• Lack of a user-friendly environment to transform experts' heuristics, knowledge, and experiences into computer-integrable guidelines and rules
• Guidelines and rules need to be represented in a reusable, integrable format

1. Incorporating guidelines for the wellness domain: providing a guideline editor to transform the guideline's knowledge into a computer-interpretable format
2. Incorporating expert experiences into the knowledge base: providing a user-friendly environment for experts to transform their experiences into the knowledge base
3. Transforming rules into rule-based representation: providing a transformation mechanism to represent the knowledge rules in multiple formats
4. Situation-based indexing of the knowledge base: providing situation-based indexing in order to classify diverse rules and enhance the performance of reasoning

Goal and Objectives / 220

• Goal
  • Provide a user-friendly environment to create recommendation and alert guidelines, transforming experts' knowledge into the knowledge base (creating and editing guidelines, creating and editing rules)
• Objectives
   Create an easy-to-use Rule Editor that lets domain experts create knowledge rules through contextual selection of concepts from an Intelli-sense window and/or the domain model tree
   Provide a user-friendly Guideline Editor to generate guidelines with the help of Intelli-sense and domain-model-tree selection, as in the Rule Editor
   Transform guidelines into knowledge rules and then into an executable format to generate recommendations and alerts

Scope of Knowledge Curation Layer: Rule Creation Paths / 221

Knowledge acquisition draws on structured knowledge resources (guidelines) and unstructured descriptive knowledge, alongside the data-driven path:
• Data-driven: algorithm selection, case base, probabilistic model
• Path 2 – Direct rules creation (MCRDR) via the knowledge acquisition editor (CNL)
• Path 3 – Rules creation from guidelines
Knowledge maintenance operates over the rule base, case base, and cornerstone cases.

Conceptual View (Path 2) / 222

Knowledge Acquisition Tool: Domain Model Manager → Rule Editor → Knowledge Transformation Bridge

1. The Rule Editor provides a convenient and user-friendly environment to create rules for enhancing and maintaining knowledge.
2. The Knowledge Transformation Bridge transforms knowledge rules into MLMs (MLM file repository: MLM1 … MLMn) and production rules/MCRDR to maintain the knowledge bases.

Conceptual View (Path 3) / 223

Knowledge Acquisition Tool: Guideline Manager (GTM Manager) → Rule Editor → Knowledge Transformation Bridge

[Example guideline decision tree for Diabetes Type-2 recommendations, adulthood (19–65 years): Male → Fruits, Berries; Female → Green Leafy Vegetables, Fruits. Source: Vegetables and Fruits | The Nutrition Source | Harvard T.H. Chan School of Public Health, http://www.hsph.harvard.edu/nutritionsource/whatshouldyoueat/vegetablesandfruits/]

1. The Guideline Template Model (GTM) transforms the expert's knowledge into a guideline tree
2. The Rule Editor provides a convenient and user-friendly environment to create rules for enhancing and maintaining knowledge
3. The Knowledge Transformation Bridge transforms knowledge rules into MLMs (MLM file repository: MLM1 … MLMn) and production rules/MCRDR to maintain the knowledge bases

Component Architecture / 224

Knowledge Creation & Evolution – Expert-Driven:
• Knowledge Acquisition Tool
   Domain Model Manager: Model Creator, Model Loader, Model Updation, Artifacts Loader, Meta Model Loader, Meta Model Validator
   Rule Editor: Node Handler, Relationship Handler, Rule Creator, Intelli-sense Manager, Rule Validator, Situation Event Manager
   Guideline Manager: Guideline Model Transformation, Guideline Validator, Guideline Model
• Knowledge Transformation Bridge: Rule Transformation Bridge, RBR Generator
• Knowledge Base: Index (situations), Rule Base (KB-based rules)
• Knowledge Sharing Interface: Situation Event Sharing, Rules Index Sharing

Component Architecture / 225

[Annotated architecture diagram: numbered data flows (1–10b) from the experts' heuristics and experience through the Domain Model Manager (wellness model tree, concepts and relationships), Rule Editor (operators, rule nodes, rule relationships, validated rules, salient features), Guideline Manager (generated and validated guidelines), Knowledge Transformation Bridge (transformed rules), Knowledge Base (rules and situations), and Knowledge Sharing Interface (situation events to DCL, rules to SCL)]

Execution workflow / 226

Domain Model Manager:
• Enables domain experts to create, update, and remove wellness concepts and relationships
• Provides concepts to the Rule Editor and Guideline Manager for rule creation

Execution workflow / 227

Rule Editor:
• Enables domain experts to create, update, and validate knowledge base rules
• Provides the Intelli-sense feature for wellness concept selection, and validation functionality to remove duplication and conflicts

Execution workflow / 228

Guideline Manager:
• Enables domain experts to transform their experiences and heuristics into guidelines, which are then transformed into multiple rules
• Guideline generation is facilitated by easy selection of wellness concepts and by validation through the guideline validator

Execution workflow / 229

Situation Event Manager:
• Enables domain experts to select the salient features of a rule for detecting abnormal situations of the users

Execution workflow / 230

Knowledge Transformation Bridge:
• Transforms the created rules and guidelines into computer-interpretable formats
• A single rule can be transformed into multiple representations

Execution workflow / 231

Knowledge Base:
• Provides storage for the selected situations (salient features) and the complete rules in separate knowledge bases

Execution workflow / 232

Knowledge Sharing Interface:
• The selected situations (salient features) are shared with the Lifelog Monitor (in DCL) to observe abnormal situations
• The created rules are shared with SCL to generate recommendations based on the user's data and context

Duplication and conflict checking Algorithm / 233

Algorithmic steps to check conflict and duplication of conditions and situations; duplicate rules are discarded from the knowledge base:
1. Determine whether the incoming rule is new or an update of an existing rule
2. For each condition of the rule, check whether it exists in the KB: reuse the existing condition ID, or add the condition to the KB and fetch the created ID; then add the condition to the rule
3. If a condition is marked as a situation, check whether the situation exists in the KB: reuse the existing situation ID, or add the situation to the KB; then add the situation to the rule
4. After the final condition, add the conclusion to the rule and persist the rule to the KB

A minimal duplication-check sketch is shown below.
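A minimal sketch of the duplication check (the class and method names are ours): a rule is persisted only if its (conditions, conclusion) signature is not already in the knowledge base.

import java.util.HashSet;
import java.util.Set;
import java.util.TreeSet;

class Rule {
    Set<String> conditionIds = new TreeSet<>(); // sorted, so comparison is order-independent
    String conclusion;
}

class KnowledgeBase {
    private final Set<String> signatures = new HashSet<>();

    /** Returns false (and discards the rule) when an identical rule already exists. */
    boolean persist(Rule r) {
        String signature = r.conditionIds + "=>" + r.conclusion;
        return signatures.add(signature); // Set.add is false on duplicates
    }
}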

Evaluation Process / 234

UCLab: developed I-KAT, designed the wellness model guideline, prepared the tool manual, and prepared the questionnaire.
GC-Healthcare: deployed the tool, provided helping materials, and trained the experts; the experts represented their knowledge and filled in the questionnaire.
UCLab: defined the weightage criteria and the evaluation matrix, evaluated the results, and created the results report.

Evaluation setup / 235

[Deployment diagram: domain experts (physical instructor, nutritionist, nurse) author Situations (shared with the LLM in the Data Curation Layer over RESTful web services) and Rules (Action/Decision, shared with the Service Curation Layer over RESTful web services); user satisfaction is measured on top]

Questionnaire preparation / 236

• Based on a working model of product quality: subjective perception and evaluation of quality, pragmatic and hedonic qualities, behavior and emotion [1, 2]
• 28 different comparison items, resulting in a questionnaire of 28 questions

[1] Hassenzahl, M. (2006). Hedonic, emotional, and experiential perspectives on product quality. Encyclopedia of Human Computer Interaction, 266-272
[2] Hassenzahl, M., & Tractinsky, N. (2006). User experience - a research agenda. Behaviour & Information Technology, 25(2), 91-97

Evaluation Setup / 237

• Participants: fitness instructors (2), dietitians (2), nurses (2)
• Rules authoring per participant, against the Wellness Model and with system usage help:
  • 5 simple rules
  • 5 complex rules

Questionnaire Results / 238

The questionnaire covers 4 dimensions [1, 2]:

Pragmatic Quality (PQ): describes the usability of a product and indicates how successful users are in achieving their goals using the product

Hedonic Quality – Stimulation (HQ-S): indicates to what extent the product supports the user's needs in terms of novel, interesting, and stimulating functions, contents, and interaction/presentation styles

Hedonic Quality – Identity (HQ-I): indicates to what extent the product allows the user to identify with it

Attractiveness (ATT): describes a global value of the product based on the quality perception

[1] Hassenzahl, M. (2006). Hedonic, emotional, and experiential perspectives on product quality. Encyclopedia of Human Computer Interaction, 266-272
[2] Hassenzahl, M., & Tractinsky, N. (2006). User experience - a research agenda. Behaviour & Information Technology, 25(2), 91-97

Results / 239

[Results chart]

Summary / 240

• Expert-Driven Knowledge Acquisition (EDKA) deals with the creation, update, and maintenance of knowledge acquired from the wellness domain.
• EDKA provides domain experts with an easy-to-use and flexible knowledge authoring environment.
• EDKA enables and supports the wellness model, which eases the adoption of diverse wellness domains and assists knowledge acquisition.
• EDKA supports situation-based knowledge management and produces situation-based recommendations.

Service Curation Layer (SCL)

Context-aware Recommendation

Background / 242

• The key responsibility is to provide knowledgeable and personalized wellbeing recommendations using domain knowledge and the users' conditions, context, and preferences (physical activities, preferences, health condition, time, schedule, profile)
• Objectives
  • Enabling service communication within SCL and with the other layers of the MM platform through RESTful web services
  • Generating knowledgeable and reliable contents for physical activity and nutrition recommendations, using the users' profile information
  • Interpreting the contents of physical activity and nutrition recommendations for personalization and explanation, using the user's context and preferences

Situation-aware personalized recommendations

Motivation for Service Curation / 243

Generate Content:
• Indexes (BMI, METs, activity level, etc.)
• Definitions for the key condition attributes of the rules (BMI, METs, Cal, Steps, etc.)
• Conversion and transformation logics (steps–Cal, Cal–METs, etc.)
Inputs: user preferences, user profile data, activities data

Interpret Context – Filtration & Alterations:
• Alternate recommendations for weather conditions (e.g. an outdoor exercise recommendation should have an alternative if it is raining)
• Emotion-based recommendation sentences
Inputs: high-level context, location context, user possessions, weather

Explanations & Education Support:
• Explanatory sentences added to the rules (e.g. walking: why not to take water because the weather is too hot)
• Links to educational resources for the different recommendations
Inputs: contextual explanation, links to educational resources

A sketch of typical index and conversion helpers follows.
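Two of the index/conversion helpers the content generator needs, using standard formulas (BMI from weight and height; energy expenditure from METs via the common kcal/min = MET × 3.5 × weight / 200 convention); the class and method names are ours:

class WellnessIndexes {
    /** Body Mass Index: weight (kg) / height (m)^2. */
    static double bmi(double weightKg, double heightM) {
        return weightKg / (heightM * heightM);
    }

    /** Energy expenditure: kcal/min = MET * 3.5 * weight (kg) / 200 (ACSM convention). */
    static double kcalPerMinute(double met, double weightKg) {
        return met * 3.5 * weightKg / 200.0;
    }
}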

Service Curation as a Framework (SCF) / 244

• Service Orchestrator: provides a common communication gateway for intra-module and inter-module communication, and a seamless abstraction for communication among distributed components
• Recommendation Builder: performs inferencing on the knowledge base and user profile, performs conflict resolution for accurate recommendations, and generates recommendations based on production rules
• Recommendation Interpreter: processes the user's contextual information, evaluates the user's interruptibility and the contextual suitability of the recommendation, and enriches the recommendation contents

Responsibilities
1. To provide a mechanism for reasoning on domain knowledge to generate accurate recommendations
2. To provide mechanisms for intelligently resolving conflicts in production rules
3. To provide extensibility for incorporating additional user contexts for situation-aware recommendations
4. To provide mechanisms for incorporating explanations and educational nuggets to aid recommendation adaptability

SCL Execution Flow in Mining Minds Platform / 245

[Execution-flow diagram, numbered interactions 1–23: the Service Orchestrator receives situation events and user requests at the SCL service endpoints (EP.1, EP.2); the Recommendation Builder performs rule-based reasoning (pattern matching, conflict resolution, rule execution) over rules fetched from KCL and data loaded from DCL (data fetcher, data transformer, utility library); the Recommendation Interpreter applies context evaluation, preference-based filtering, alternative generation, explanation management, and education support (URL fetching, template processing) before the final recommendation is returned; interactions span DCL, KCL, and the Supporting Layer]

Related Work / 246

Study            | Contributions                                                                                                                        | Limitations
Misfit Shine [1] | An activity tracker application; tracks a range of activities including walking, running, cycling, swimming, and sleeping; can monitor sleep depth | Lacks knowledge-based activity recommendations; no function to handle situation-based requests for generating recommendations; lacks nutrition recommendations
Google Fit [2]   | A health-tracking platform; supports activity tracking (walking, running, cycling, etc.); provides goal-based recommendations       | Only supports goal-based recommendations on the basis of step counts; no support for knowledge-based recommendations; no support for contextual interpretation of recommendations
S Health [3]     | An Android-based platform for health tracking; supports activity tracking, food tracking, and sleep monitoring; provides goal-oriented calorie-based recommendations | Only supports goal-based recommendations; no support for situation-based recommendations; no support for contextual interpretation of recommendations

SCL Logical Structure / 247

SCL Service
• Communication: Service Orchestrator
• Logic Modules: Recommendation Builder, Recommendation Interpreter
• Utilities

SERVICE ORCHESTRATOR
Muhammad Afzal, UCLab, KHU, Korea

Overview / 249

• The Service Orchestrator enables communication of SCL with the other layers and the user application
• It allows implicit communication (time, schedule, situation) as well as explicit communication (user requests)

Motivation / 250

• A common gateway of communication
• Communication abstraction for the intra-layer modules
• Guidance to the intra-layer logical modules in fulfilling service requirements

Service Orchestrator - Architecture / 251

[Architecture diagram: the Service Orchestrator's Input/Output Adapter and Event Handler connect the UI/UX and the Data Curation Layer (LLM) with the Recommendation Builder and the Recommendation Interpreter]

SCL Communication- Service Orchestrator / 252

[Sequence diagram, steps 1–14: the DCL raises a situation event (uid, situation event) or the user app issues a request (uid, request); the Service Orchestrator forwards it to the Recommendation Builder, which fetches the rule set from KCL (index-based reasoning over the rule index and rule base), requests the required data from DCL (uid, data request/response), and runs the rule-based reasoner (pattern matcher, conflict resolver, result generator) to produce recommendations; the Recommendation Interpreter (context interpreter, content interpreter, explanation generator) consumes the user's context and preferences and returns the personalized recommendation to the user application; all cross-layer exchanges run over RESTful web services with the DCL, KCL, and SL (UI/UX)]

Service Orchestrator /

[Flow diagram: the SCL service endpoints (EP.1, EP.2) hand requests to the Communication Manager, which sends/receives the builder data request to the Recommendation Builder and the interpreted recommendation to the Recommendation Interpreter, exchanging data with the Data Curation Layer and the Supporting Layer]

Request for Recommendation / 254

Request modes:
• Push model – triggered by a situation event
• Pull model – triggered by an explicit user request

[Figure: Push model: (1) the DCL raises a situation event; (2) an index-based reasoner resolves the event against scheduled rules; (3) the Service Orchestrator prepares and sends a data request to the DCL, e.g. { "userId": 39, "activityDate": "2015 05 14" }; (4) the DCL returns the data response over RESTful web services, e.g. a user-profile record { "userRiskFactorId": 1, "userId": 39, "riskFactorId": 5, "statusId": 1 } and a lifelog record { "userLifelogID": null, "userID": 39, "activityID": 6, … }, which are handed to the Recommendation Builder]
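A minimal sketch of steps 3 and 4 of this push model, using Java's built-in HTTP client; the endpoint path is hypothetical (not from the source), while the JSON field names are taken from the slide:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PushModelDataRequest {
    public static void main(String[] args) throws Exception {
        // Step 3: data request prepared from the scheduled rule for user 39.
        String dataRequest = "{\"userId\":39,\"activityDate\":\"2015 05 14\"}";
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://dcl.example/lifelog/query")) // hypothetical DCL endpoint
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(dataRequest))
            .build();
        // Step 4: the DCL responds with the user-profile and lifelog JSON records.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}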

Pull Model (User Request) / 256

[Figure: Pull model: (1) the user app explicitly requests a recommendation over a RESTful web service, e.g. { "userId": 39, "serviceId": 2 }; (2) the index-based reasoner selects the applicable scheduled rules; (3) the Service Orchestrator prepares and sends the data request to the DCL, e.g. { "userId": 39, "activityDate": "2015 05 14" }; (4) the DCL returns the user-profile and lifelog records as the data response] Contributions / 257

• Handles situations based on dynamically generated events from the DCL

• Handles user requests explicitly made for recommendations

• Handles the communication of the data required by SCL services Recommendation Builder (RB) Overview / 1

• Knowledge-based recommendation system (definition): an application whose computational function is to generate recommendations, for a user query or a situation event, from the available data and knowledge rules using an intelligent reasoning methodology [1]

[Figure: ingredients of knowledge-based recommendation: reasoning over a service query or situation event, together with data and knowledge rules, yields recommendations]

[1] Fagin, Ronald; Halpern, Joseph Y.; Moses, Yoram; Vardi, Moshe Y. (2003). Reasoning About Knowledge. MIT Press. ISBN 978-0-262-56200-3. Motivation / 2

• Enabling data integration for recommendation generation • Lifelog Data Loading Interface

• Enabling knowledge integration for reasoning purposes • Knowledge Loading Interface

• Generating accurate recommendation contents using the individual's personal profile and lifelog data along with the knowledge • Rule-based Reasoner Objectives – Challenges – Solutions / 261

Objective | Challenge | Solution

S1: Enabling data integration | How to load and prepare data (DCL integration) |  Lifelog Data Loading Interface (Data Fetcher, Data Transformer, Utility Library, RESTful web service)

S2: Enabling knowledge integration | How to load the required knowledge (KCL integration) |  Knowledge Loading Interface (situation-event-based RESTful web service)

S3: Generating accurate recommendation contents | Use of an appropriate reasoning methodology |  Rule-based Reasoner (forward chaining and conflict resolution)

Recommendation Builder - Architecture / 3

[Figure: Recommendation Builder architecture: the Service Orchestrator exchanges situation events, service requests, and data with the builder; the Lifelog Data Loading Interface (Data Fetcher, Data Transformer, Utility Library) loads lifelog data from the DCL; the Knowledge Loading Interface (Rules Loader) fetches situation-based rules from the KCL; the Rule-Based Reasoner (Patterns Matcher, Conflict Resolver, Results Generator) turns prepared data and rules into recommendations for the Recommendation Interpreter and the SL] Component Architecture / 4

[Figure: The same architecture annotated with the solutions: S1, situation-event/service-request-based data integration through the Lifelog Data Loading Interface; S2, knowledge integration with Utility-Library-based data transformation through the Knowledge Loading Interface; S3, rule-based reasoning with the forward chaining and conflict resolution algorithms in the RBR] Communication View / 5

[Figure: Communication view of the Recommendation Builder: the orchestrator forwards the Uid and situation event; the Data Loading Interface requests lifelog data from the DCL and the Knowledge Loading Interface requests the set of rules from the KCL rule base via index-based reasoning, all over RESTful web services; the Rule-based Reasoner (Pattern Matcher, Conflict Resolver, Result Generator) produces recommendations; the Recommendation Interpreter returns the personalized recommendation to the SL (UI/UX)] Execution workflow / 6 Data Loading Interface

Lifelog Data Loading Interface:
1. Receives requests from the SL and DCL for recommendations
2. Parses the request
3. Requests rules for the service using the Knowledge Loading Interface
4. Requests lifelog data from the DCL
5. Forwards recommendations to the Recommendation Interpreter

Data Transformer:
• Parses the condition part of the rules
• Checks conformance between condition attributes and the data
• Transforms the data into the condition formats using Utility Library functions
• Performs the computations for the rule condition attributes

Utility Library:
• Defines the utility functions used by the Data Transformer

Execution workflow / 7 Knowledge Loading Interface

Index Synchronizer:
• Receives the updated index from the KCL
• Updates the rules index base

Rules Loader:
1. Takes the situation event request from the Data Fetcher
2. Sends a rules request to the KCL as a RESTful web service request
3. Receives the rules as a web service response from the KCL
4. Reads and interprets the rules
5. Provides the rules to the Reasoner and the Data Loading Interface

Execution workflow / 8 Rule-based Reasoner

Pattern Matcher:
1. Performs data type conversion on the prepared data from the DCL
2. Matches the rules' conditions against the prepared data
3. Uses the forward chaining algorithm for matching
4. Returns the matched rules

Conflict Resolver:
• Receives the matched set of rules from the Pattern Matcher
• Performs conflict resolution using the maximum-specificity algorithm
• Returns the final set of resolved rules

Results Generator:
• Prepares the final results and fires the conclusion part of the resolved rules
• Binds the user id and service id with the conclusion
• Provides the recommendations to the Interpreter as an object model

Execution Workflow Summary / 9

The SO receives a situation event (SE) and forwards the SE to the RB. Upon receiving the situation event, the RB performs the following operations:
i. Using the Lifelog Data Loading Interface, it loads user profile data from the DCL through the SO
ii. Using the Knowledge Loading Interface, it retrieves rules from the KCL
iii. Using the rules and the user profile data, the RBR reasons and generates recommendations
The Recommendation Builder sends the recommendations back to the Service Orchestrator, and the SO forwards them to the Recommendation Interpreter.

Recommendation Contents Generation / 269 Pattern Matcher Matching Strategies

RBR matching strategies:
• Forward chaining (data-driven): works from facts (conditions) to a conclusion; matches data against the 'conditions' of rules in the rule base (our choice is forward chaining)
• Backward chaining (goal-driven): works from the conclusion to the facts (conditions); matches a goal against the 'conclusions' of rules in the rule base
• Mixed chaining (forward-backward or backward-forward): some rules are used for chaining forward and others for chaining backward; the system first chains in one direction, then switches to the other (using meta-rules)

Recommendation Contents Generation / 270 Forward Chaining Algorithm

[Figure: Pattern matching flow: rules from the KCL and condition data from the DCL enter the Pattern Matcher, which compares all rule conditions against the data; all matched rules go to the Conflict Resolver, the final resolved rules go to the Results Generator, and recommendations are produced]
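A minimal Java sketch of the data-driven match step in the flow above; the Rule shape and fact map are assumptions for illustration, and the sample facts and conclusions mirror the sitting-activity output shown in the example that follows:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical rule shape: an id, attribute=value conditions, and a conclusion.
record Rule(int id, Map<String, String> conditions, String conclusion) {}

class PatternMatcher {
    // Data-driven matching: keep every rule whose conditions all hold in the facts.
    static List<Rule> match(List<Rule> rules, Map<String, String> facts) {
        List<Rule> matched = new ArrayList<>();
        for (Rule r : rules) {
            boolean allHold = r.conditions().entrySet().stream()
                .allMatch(c -> c.getValue().equalsIgnoreCase(facts.getOrDefault(c.getKey(), "")));
            if (allHold) matched.add(r);
        }
        return matched;
    }

    public static void main(String[] args) {
        Map<String, String> facts = Map.of(
            "Activity", "sitting", "AgeGroup", "Adult", "HealthCondition", "Normal");
        List<Rule> rules = List.of(
            new Rule(3, Map.of("Activity", "sitting", "AgeGroup", "Adult",
                               "HealthCondition", "Normal"), "TAKE REST!"),
            new Rule(5, Map.of("Activity", "sitting"), "Please avoid sitting and be active!"));
        // Both rules match here; resolving between them is the Conflict Resolver's job.
        System.out.println(match(rules, facts));
    }
}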

Recommendation Contents Generation / 271 Forward Chaining Algorithm (Output)

[Figure: Forward chaining output for a situation event: (1) prepared data (facts) { "Activity": "sitting", "Duration": 1, "AgeGroup": "Adult", "Disability": "None", "HealthCondition": "Normal" }; (2) rule conditions are matched in the Rule-based Reasoner (forward chaining) against the intermediate database of the DCL; (3) matchedRules = [ { Rule-ID=3, numConditions=3, RuleConclusion="TAKE REST! Move around, simply for a cup of tea or glass of water!!!" }, { Rule-ID=4, numConditions=2, RuleConclusion="Sitting is killing you, take a break for 5 minutes!!!" }, { Rule-ID=5, numConditions=1, RuleConclusion="Please avoid sitting and be active!!!" } ]] Conflict Resolution / 272 Strategies

• Specificity or maximum specificity (our choice): based on the number of condition attributes matched; choose the rule with the most (or least) matches
• Priority-based approach: arrange condition attributes in a priority queue and use the rule dealing with the highest-priority condition attributes
• Explicit meta-rules for conflict resolution: a rule-based system within a rule-based system; use meta-rules to resolve the conflict
• Fire all selected rules: execute all the matched rules
• Context limiting: partition the rule base into disjoint subsets; the subsets may also have preconditions
• Execution time: choose the rule with the faster execution
• Physical ordering of rules: selection depends on their physical order, which makes it hard to add rules
• Random: randomly pick one rule for execution

Rule-based Reasoning / 273 Conflict Resolution (Algorithm)

[Figure: Conflict resolution flow (maximum-specificity algorithm): the Pattern Matcher hands all matched rules to the Conflict Resolver; a matched-condition counter determines the number of conditions of each rule, the resolver loops over the matched rules keeping the most specific one, and the single resolved rule is passed to the Results Generator, which adds its conclusion to the recommendations]
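A minimal sketch of the maximum-specificity step in this flow, reusing the hypothetical Rule type from the forward-chaining sketch: among the matched rules, the one with the most satisfied conditions wins; ties are left unresolved here, which corresponds to the unresolved cases reported later in the evaluation.

import java.util.Comparator;
import java.util.List;
import java.util.Optional;

class ConflictResolver {
    // Maximum specificity: prefer the matched rule with the most condition attributes.
    static Optional<Rule> resolve(List<Rule> matched) {
        return matched.stream()
                      .max(Comparator.comparingInt(r -> r.conditions().size()));
    }
}
// For the matched rules of the previous sketch, Rule-ID 3 (three conditions) is selected,
// so its conclusion becomes the recommendation content handed to the Results Generator.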

Contributions / 10

• The Pattern Matcher performs data type conversion and, using the forward chaining algorithm, matches rule conditions

against data from the DCL and returns the matched rules

• The Conflict Resolver performs conflict resolution using the maximum-specificity algorithm and returns the final resolved rules

• The Results Generator uses the final resolved rules to prepare and generate the final result Recommendation Interpreter (RI) Overview / 276

• Personalization is a key element in recommender systems • Personalization consists of tailoring a service or a product to accommodate the specific needs of individuals • Contextual information combined with user preferences enables personalization • The Recommendation Interpreter performs interpretation according to the contextual information and preferences of the user in order to deliver the appropriate recommendations at the right time

http://www.quora.com/What-is-the-definition-of-personalization Goal and Objectives / 277

• Goal • Providing context-aware and personalized wellbeing recommendations

• Objectives • Interpreting recommendations to address • the receptiveness of the user to the recommendation • the preferences of the user for the recommendation • a user-friendly explanation of the recommendation

[Figure: personalized recommendations draw on physical activities, preferences, health condition, time, profile, and schedule]

Motivation / 278

To deliver recommendations at the appropriate time based on the user's current context

To filter out unnecessary recommendations based on user preferences

To explain recommendations according to the situation for user engagement Challenges and Solutions / 279

S1. Motivation: to deliver the recommendation at the appropriate time based on the user's current context (when to deliver?). Solutions:  user receptivity evaluation;  context-aware recommendations

S2. Motivation: filtering out unnecessary recommendations based on user preferences (what, to whom, and how to deliver?). Solutions:  exploiting user preferences;  user-aware content filtration

S3. Motivation: explaining the recommendation according to the situation (how to relate to the user's surroundings?). Solutions:  multidimensional explanations;  situation-aware explanations

Conceptual View / 280

Deliver the recommendation according to the present context and preferences.

[Figure: Conceptual view of the Recommendation Interpreter: a recommendation from the Builder first passes a user-status evaluation (is the user receptive?) and a suitability check (is the recommendation suitable?); if yes, it is filtered and explained using the user's preferences, location, weather, emotion, and HLC from the IDB, with educational aids attached; e.g. when the weather is rainy, the explanation becomes "take umbrella with you"] Component Architecture / 281

[Figure: Recommendation Interpreter component architecture: S1 Context Interpreter (Context Selection, Context Moderator); S2 Content Interpreter (Content Filterer with Filter Rec and Select Alternative; SNS Trend Identifier with Trend Selector and Trend Processor); S3 Explanation Manager (Blocking Situation Detection, Education Support Service, Resource Selector, Resource Linker, Explanation Generator, Results Preparation); a Data Manager holds blocking rules, templates, and global preferences; connected to the Recommendation Builder, the Supporting Layer UI/UX via the Orchestrator, and DCL lifelog data] Complete Communication Workflow of Interpreter / 282

[Figure: Communication workflow of the Interpreter: the Orchestrator passes the situation event to the Builder and the resulting recommendation, with Uid, context, and preferences, through the Data Preparer to the Context Interpreter; the Content Interpreter filters by preference; the Explanation Manager uses the blocking rules, global preferences, and templates; the personalized recommendation is delivered to the DCL and the SL (UI/UX)] Execution Flow /

Service Curation Layer

Recommendation Interpreter

1. The SO receives a situation event (SE) from the LLM
2. A context request is sent to the SO
3. The context is received by the Moderator
4. The Moderator sends the context to the Context Selector for evaluation
5. The Context Selector requests the blocking rules
6. The blocking rules are fetched from the repository, e.g.:
   If loc = "Home" AND HLC = "Sleeping"
   If HLC = "Having Meal"
   If HLC = "Commuting" …

Execution Flow /

7. Both the context and the rules are forwarded to the Context Interpreter
8. The Context Interpreter evaluates the context against the rules and decides on user availability, e.g.:
   int Match(rules, context) {
     flag = -1;
     rules.add(scanFile.nextLine());
     if (rules.contains(context)) { flag = 1; }
     return flag;
   }
   User_Status_Eval() {
     if (flag == -1) Delay_Rec();
     else Content_Interpreter.Select_Path();
   }

9. If the user is available, the Content Interpreter is invoked; otherwise the recommendation is delayed Execution Flow /

10. The prepared contextual matrix is evaluated and either "Select Alternative" or "Filter Recommendation" is called for further processing:

Context/Rec | Walking | Running | Stretching | Cycling | Sitting
Outdoors | 1 | 1 | 1 | 1 | 1
Amusement | 1 | 0 | 1 | 0 | 1
Sunny | 1 | 1 | 1 | 1 | 1
Happiness | 1 | 1 | 1 | 1 | 1
Aggregate | 1 | 0 | 1 | 1 | 1

   select_path(Rec) {
     if (Rec.len == 1) Select_Alternative();
     else Filter_Recommendation();
   }

11. Global preferences are fetched via get_pref(user_id), e.g. Walking: 10m, Stretching: 15m
12. An alternative recommendation is searched for, since the current recommendation (Running) is unsuitable
13. For multiple alternative recommendations, the user's preferences are weighed in

14. "Walking" is preferred over "Stretching" by the user, therefore "Walking" is treated as the final recommendation Execution Flow /

15. The final recommendation is forwarded to the Explanation Manager
16. No explanatory sentence is received
17. The Explanation Generation component is invoked

18. A template is fetched from the local repository and forwarded for further processing
19. The description is evaluated, e.g.:
   get_Description()  // empty string
   forward_Description();
   Eval_Desc() {
     if (Description.isEmpty()) Explanation_Generation();
     else Education_Support();
   }

Execution Flow /

20. The template is processed according to the context:
    String get_Template(Rec, Duration);  // "You are Recommended Walking For 10 mins"
21. Post-processing is applied to reflect additional information, e.g. weather:
    String post_process(Sentence, Context);  // "You are Recommended Walking For 15 mins and it may rain so take umbrella with you"
22. The complete recommendation, along with education support, is forwarded to the SO for further processing
23. The SO sends the recommendation to the Supporting Layer and the Data Curation Layer for persistence

Solution 1: User availability rule interpretation / 288

The Context Interpreter ensures that the recommendation is delivered at the right time (S1):
1. Selects contexts one by one from the lifelog
2. Interprets the selected context
3. If the user is available, the recommendations are forwarded to the next component
4. Otherwise it informs the Recommendation Builder of the reason for not delivering the recommendation

Context interpretation rules have the form: IF low-level context / high-level context / lifelog THEN availability

Solution 2: Contextual aggregate matrix generation / 289

The Content Interpreter ensures delivery of the right recommendation to the right user (S2):
• Evaluate the cardinality of the received recommendations
• If cardinality == 1, the recommendation is forwarded to "Select Alternative"
• If cardinality > 1, the list is forwarded to "Filter Recommendations"

Solution 2: Contextual aggregate matrix generation / 290

"Select Alternative" evaluates the original recommendation and suggests an alternative based on context, if required:
• Receive the original recommendation
• Check the suitability of the recommendation against the current context
• If unsuitable, find an alternative from the contextual matrix (if none matches, the recommendation is forwarded as-is)
• If more than one alternative matches, check the user preferences; with a single match, forward it directly
• Forward the final recommendation(s) to Explanation Generation
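A minimal Java sketch of the contextual-matrix suitability check used by "Select Alternative" (all names hypothetical): each context row marks which activities are suitable (1), and the aggregate row is the logical AND across the active contexts:

import java.util.List;
import java.util.Map;

class ContextualMatrix {
    // context -> (activity -> suitable?), per the matrix shown in the execution flow.
    static final Map<String, Map<String, Integer>> MATRIX = Map.of(
        "Outdoors",  Map.of("Walking", 1, "Running", 1, "Stretching", 1, "Cycling", 1, "Sitting", 1),
        "Amusement", Map.of("Walking", 1, "Running", 0, "Stretching", 1, "Cycling", 0, "Sitting", 1));

    // Aggregate row: an activity is suitable only if every active context allows it.
    static boolean suitable(String activity, List<String> activeContexts) {
        return activeContexts.stream()
            .allMatch(c -> MATRIX.getOrDefault(c, Map.of()).getOrDefault(activity, 0) == 1);
    }

    public static void main(String[] args) {
        List<String> ctx = List.of("Outdoors", "Amusement");
        System.out.println(suitable("Running", ctx));  // false: unsuitable, find an alternative
        System.out.println(suitable("Walking", ctx));  // true: a candidate alternative
    }
}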

Solution 2: Contextual aggregate matrix generation / 291

"Filter Rec" is invoked when multiple recommendations are received:
• Receive the list of recommendations
• Check the suitability of each recommendation against the current context
• Store the applicable recommendations in the AR list
• Check user preferences to derive the preferred list (ARP)
• If the ARP list has at least one entry (cardinality >= 1), forward it to Explanation Generation; if it is empty (cardinality = 0), forward the AR list instead
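A minimal sketch of the AR/ARP filtering described above, reusing the hypothetical ContextualMatrix from the previous sketch: suitable recommendations form the AR list, the preference check derives the ARP list, and ARP is forwarded when non-empty, otherwise AR:

import java.util.List;

class RecommendationFilter {
    static List<String> filter(List<String> recs, List<String> activeContexts, List<String> prefs) {
        // AR: recommendations applicable in the current context.
        List<String> ar = recs.stream()
            .filter(r -> ContextualMatrix.suitable(r, activeContexts))
            .toList();
        // ARP: the subset of AR that the user prefers.
        List<String> arp = ar.stream().filter(prefs::contains).toList();
        return arp.isEmpty() ? ar : arp;  // forward ARP if it has entries, else AR
    }
}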

Solution 2: SNS Trend Identifier / 292

Food is recommended based on the required nutrition (SNS Trend Identifier, S2):
• Receive the nutrient-category recommendation and the user preferences
• Receive the latest food item trends from SNS
• Combine both pieces of information
• Check the user preferences
• Forward the recommended food items to the Results Preparer and Explanation Generation

Solution 3: Template-based explanation / 293

The Explanation Manager explains the recommendation according to the situation (S3):
• Receive the context (weather, emotion, location)
• Evaluate the explanation requirement
• Select a template by string matching
• Get the recommendation-specific URL from the URL repository
• Attach the link to the resources
• Prepare the results
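A minimal sketch of the template fill and weather post-processing, using the standard template format quoted just below and the example sentences from the execution flow (method names are illustrative):

class ExplanationGenerator {
    // Standard template: "You are Recommended" + [Recommendation] + "for" + [Duration] + "mins"
    static String getTemplate(String rec, int durationMins) {
        return "You are Recommended " + rec + " for " + durationMins + " mins";
    }

    // Post-processing appends situational context, e.g. current weather.
    static String postProcess(String sentence, String weather) {
        return "rainy".equalsIgnoreCase(weather)
            ? sentence + " and it may rain so take umbrella with you"
            : sentence;
    }

    public static void main(String[] args) {
        System.out.println(postProcess(getTemplate("Walking", 15), "rainy"));
    }
}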

Standard template: "You are Recommended" + [Recommendation] + "for" + [Duration] + "mins"

Contributions / 294

• Delivering the recommendation at the right time (user interruptibility) • Context Interpreter

• Delivering preference-in-context personalized recommendations • Content Interpreter

• Delivering contextually explained recommendations • Explanation Manager

Uniqueness & Contributions Uniqueness and Contributions (1/3) / 296

• The ability to orchestrate diverse services by enabling intra- and inter-layer communications • Push-based service request handling (situation-based) • Pull-based service request handling (user-based) • Time-bound service request handling (time-based)

• The ability to make inter-layer data requests for generating recommendations • Run-time data request generation for data acquisition from the outer layer. Uniqueness and Contributions (2/3) / 297

• The ability to reason over knowledge in order to generate knowledge-based recommendations • Data preparation so the reasoner can work despite unmatched data elements existing in the data and knowledge layers • Implementation of the forward-chaining algorithm for reasoning

• Strategy for conflict resolution • Resolution of conflicts that occur during the reasoning process through the maximum-specificity technique. Uniqueness and Contributions (3/3) / 298

• The ability to make context-aware recommendations • Context-aware user interruptibility • Situation-based recommendation adaptability • SNS-based nutritious food recommendation

• Recommendation enrichment by embedding explanatory and educational nuggets • Explanatory note embedding with the recommendation • Audio-visual aids for recommendation adaptability Evaluation Evaluation Environment / 300

Evaluation metrics:
• Execution time: average execution time of the knowledge-base reasoning
• Accuracy: accuracy of the knowledge-base reasoning (evaluated on data input cases) and accuracy of the recommendation interpreter (evaluated by questionnaire)

1. Execution Time Evaluation / Results Summary: 301

Experiment 1: Experimental setup
• Standalone system (Windows 7 operating system)
• Processor: Intel Core i5-4590 CPU @ 3.30 GHz
• Memory: 8 GB RAM
• No. of test cases: 40 input cases
• No. of rules: 19

Results summary
• The KRF's execution time remains uniform, i.e. stable and consistent, while varying the cases

Experiment 2: 2. Accuracy Evaluation for Knowledge-base Reasoner / 302

Experimental setup
• Standalone system (Windows 7 operating system)
• Processor: Intel Core i5-4590 CPU @ 3.30 GHz
• Memory: 8 GB RAM
• No. of test cases: 40 input cases
• No. of rules: 19

Analysis for Experiment 3
• No conflict found in 24 cases (out of 40)
• Conflicting rules found in 16 cases
• Successfully resolved 8 cases (out of 16)
• 8 cases remained unresolved

Experiment 3 results summary
• The KRF framework successfully achieved 80% accuracy

3. Accuracy Evaluation for Recommendation Interpreter / 303

Experimental setup
• Standalone system for the RI
• Questionnaire items: 40
• Number of participants: 40
• Survey conducted with the stated population characteristics

Results summary
• Experiment 1 (participant agreement): based on meta-accuracy scores, the proposed system achieved 87% accuracy
• Experiment 2 (item-participant score): 24 scenarios (out of 40) achieved favorable results

Summary / 304

• SCL dynamically handles both push and pull models for situation-aware recommendation generation

• SCL provides reliable recommendations using rule-based reasoning with forward chaining mechanism

• SCL provides a flexible reasoning framework with loosely coupled data and knowledge that supports diverse services

• SCL performs contextual information processing for situation-aware recommendation by deciding on factors such as user’s interruptibility and contextual suitability of the recommendation

• SCL enriches the contents of the generated recommendation by embedding audio-visual aids and relevant nuggets of contextual information in terms of current weather conditions, food trends and user’s emotional state Supporting Layer (SL) Background / 306

• Complex, large, unstructured data sources are becoming the norm, and extracting the important, hidden information from them is the need of today • Additionally, displaying this information interactively, with engaging graphics and an adaptive UI, is necessary.

• Analytics can identify abnormal behavior and different attributes related to the user through visualization and summarization to bring context. Adaptive UI • Adaptive user interface generation based on user information and the context of the user & device at run time Motivation for Supporting Layer / 307

• Use of smartphones as primary devices • A personalized and adaptive user interface for the mobile phone application is very important in today's world. • The user experience should evolve constantly based on profile and performance metrics.

• Complex, large, unstructured data sources hide information • Complex datasets contain hidden information which must be extracted for additional and complex decision making. • Visualization and trends are necessary for complex data, to find abnormal information. Supporting Layer as a Framework (SLF) / 308

• Creating graphs and facts with respect to clustering; a timeline and an intuitive way to display SNS data and trends
• Trend analysis extracts new facts; clustering for visualization
• An adaptation engine to create new rules and change existing rules with respect to user behavior and profile
• UX measurement in terms of task success, errors, and time to completion; user satisfaction modelling with respect to performance/issue metrics

SL Architecture / 309

[Figure: SL architecture: Descriptive Analytics and UI/UX] Related Work (Descriptive Analytics) / 310

Systems and limitations:
• SynopSys (summarization): discusses the automated extraction of interrelated data objects from ERP systems, but without using a graph model and for the single analytical goal of process mining. Limitations: no parameter type classification; no distribution of data ranges and rendering information
• Gradoop (graph analytics on Hadoop, clustering): analyzes graph data for business intelligence and social network analysis. Limitations: only graph analytics, with a focus on trends based on images; it stores graph formats only
• Radoop: a big data analytics solution for Hadoop which computes jobs on the cluster using ensemble learning. Limitations: it is based on complex machine learning techniques; little information is available as it is being developed into a product

Rudolf, Michael, et al. "SynopSys: large graph analytics in the SAP HANA database through summarization." First International Workshop on Graph Data Management Experiences and Systems. ACM, 2013.
Junghanns, Martin, et al. "GRADOOP: Scalable Graph Data Management and Analytics with Hadoop." arXiv preprint arXiv:1506.00548 (2015).
radoop.eu

Limitations of Existing Work (Descriptive Analytics) / 311

• No query library for mapping the queries, the lack of which results in inefficient data retrieval in the case of big data.

• Lack of Parameter filtration which makes it difficult to group and summarize the data.

• No model transformation for preserving semantics of data and integration of heterogeneous data Overview of existing adaptive systems (UI/UX) / 312

Existing systems (description; pros; cons):
• Doppelgänger [20]: intended to produce a personalized, printed newspaper for the user. Pros: sharing of user information among several applications; diverse types of sensors contribute to an extensive user model; unobtrusive user modelling. Cons: no systematic feedback mechanism
• Flexcel [21]: enhances Microsoft Excel with an adaptive user interface. Pros: users have control over their own user profiles. Cons: some of the user dialogues for adaptability seem very complex
• Lumière Project [22]: led to the later MS Office assistant; combined temporal reasoning and Bayesian user models in order to manage the uncertainty of recognizing user goals from a stream of user actions over time. Cons: focuses only on recognizing user goals in order to provide appropriate assistance
• Lifestyle Finder [25]: gives the user suggestions for interesting websites. Pros: user profiling and clustering based on publicly available demographic mass data; user modelling covers aspects such as purchasing history, lifestyle characteristics, and survey responses. Cons: adaptation covers only the selection of content
• Supple [26]: an application that adapts the display of objects considering window size and user preferences. Pros: run-time rendering of the user interface; information about the user is collected by analyzing user tracking. Cons: adaptation focuses on layout and the selection of appropriate controls and display elements; does not address accessibility issues; does not provide an authoring tool
• MYUI [19]: generates individualized user interfaces and performs adaptations to diverse user needs, devices, and environmental conditions at run time. Pros: a toolkit supports industrial developers and designers in easily creating self-adaptive applications; explicit and implicit data collection about the user for user modeling; run-time rendering of the user interface. Cons: no feedback functionality; manual setting of the platform device category

Limitations of existing work (UI/UX) / 313

• The existing user models are not comprehensive: they lack behavioral & physiological measurements for adaptive UI as well as environmental variables
• Accessibility issues are not addressed
• No feedback functionality
• Lack of user experience measurement for adaptive UI

Descriptive Analytics Background / 315

V 1.0 Traditional Analytics:
• Internally sourced and relatively small structured data
• Teams of analysts
• Internal decision support

V 2.0 Descriptive Analytics:
• Complex, large, unstructured data sources
• New analytical and computational capabilities
• Data-based products and services
• Provides trending information, quantitative summaries, and data visualization

Introduction / 316

[Figure: analytics pipeline: raw data is processed into filtered data, which analytics turns into visualization in the UI/UX for data analytics users and experts] Motivation / 317

Descriptive analytics serves three goals:
• Anomaly detection: through analytics we can observe abnormal behavior and different user-related attributes through visualization
• Trend analysis: visualization and summarization bring context to the data, remove misunderstandings, and improve predictions
• Data comprehension: understanding and integrating the data and highlighting outliers

Workflow of Descriptive Analytics / 318

Graph selection table:
Id | X-axis | Y-axis | Multiple series | Graph
1 | Yes | No | No | Pie
2 | Yes | Yes | Yes | Line
3 | Yes | Yes | Yes | Bar
4 | Yes | Yes | No | Bubble

Query library:
Return Attribute | Conditional Attributes | Temporal Attribute
Food | Age | Days
Activity | Duration | DateTime
*.activity | uid | None
Activity | GPS/Location | Days

Data store:
uid | ActId | DateTime | Activity | Duration | accX | accY | accZ
2 | 2 | 20/08/15 13:00:03 | Walking | 5 min | 1213.231123 | 1213.231123 | 1213.231123
29 | 2 | 20/08/15 10:00:03 | Walking | 45m24s | 14334.123 | 14334.123 | 14334.123
30 | 4 | 20/08/15 11:00:03 | Sitting | 2 hr | 1232 | 1232 | 1232
23 | 2 | 21/08/15 18:00:03 | Walking | 10 min | 1213.231123 | 1213.231123 | 1213.231123
32 | 4 | 21/08/15 12:00:03 | Sitting | 2m24s | 14334.123 | 14334.123 | 14334.123
30 | 3 | 21/08/15 13:00:03 | running | 4 m | 1232.841 | 1232.841 | 1232.841

Workflow of Descriptive Analytics / 319
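A minimal sketch of the chart choice encoded by the graph selection table above (names hypothetical): the visualization layer picks a graph type from the presence of axes and multiple series:

class GraphSelector {
    // Chart choice per the graph selection table: pie, line/bar, or bubble.
    static String select(boolean xAxis, boolean yAxis, boolean multipleSeries) {
        if (xAxis && !yAxis && !multipleSeries) return "Pie";       // id 1
        if (xAxis && yAxis && multipleSeries)   return "Line/Bar";  // ids 2 and 3 both qualify
        if (xAxis && yAxis)                     return "Bubble";    // id 4
        return "Table";  // fallback for combinations outside the table (assumption)
    }
}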

Workflow of Descriptive Analytics / 320

[Figure: Descriptive analytics workflow, step 1: through the visualization interface the user asks, e.g., "Find the weekday walking, running and sitting and their average in the last 30 days"; the request reaches the Query Creation Interface of the Mining Minds platform] Workflow of Descriptive Analytics / 321

Step 2a: the query is broken down into operator attributes: return attribute = Activity, conditional attribute = Duration, temporal attribute = Days. Workflow of Descriptive Analytics / 322

Step 2b: the query library is searched for a match based on the attributes, e.g. (Food, Age, Days), (Activity, Duration, Days), (uid, All, None), (Activity, GPS/Location, Days). A sketch of this lookup follows.
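A minimal Java sketch of the step 2b lookup (types and queries are illustrative, not from the source): the decomposed (return, conditional, temporal) attribute triple keys into the library of prepared queries:

import java.util.Map;
import java.util.Optional;

// Hypothetical decomposition of a user query into the three attribute kinds (step 2a).
record QueryKey(String returnAttr, String conditionalAttr, String temporalAttr) {}

class QueryLibrary {
    static final Map<QueryKey, String> LIBRARY = Map.of(
        new QueryKey("Activity", "Duration", "Days"),
        "SELECT activity, duration FROM lifelog WHERE day >= :since",
        new QueryKey("Food", "Age", "Days"),
        "SELECT food, age FROM lifelog WHERE day >= :since");

    static Optional<String> match(QueryKey key) {
        return Optional.ofNullable(LIBRARY.get(key));
    }

    public static void main(String[] args) {
        // The 30-day activity question decomposes to (Activity, Duration, Days).
        System.out.println(match(new QueryKey("Activity", "Duration", "Days")));
    }
}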

Workflow of Descriptive Analytics / 323

Step 3: based on the temporal attribute of 30 days, the big data connector decides to send the query to HDFS; depending on the query, either the big data connector or the IDB interface is called. Example conditions: Running Duration > 1m per day over 30 days; Sitting Duration > 15m; Walking Duration > 2m. Workflow of Descriptive Analytics / 324

[Figure: the called interface returns lifelog rows (uid, ActId, DateTime, Activity, Duration, accX, accY, accZ) as in the data store table above] Workflow of Descriptive Analytics / 325

Step 4: the data is integrated and transformed into a model for fast processing, e.g.:
{ "Walking": [45,10,5,7,2,0,2,14], "Running": [4,0,0,0,…], "Sitting": [120,60,30,21] }
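A minimal sketch of what the trend analyzer computes from the transformed model in step 6 below: per-activity maxima and averages over the returned series (names hypothetical):

import java.util.IntSummaryStatistics;
import java.util.List;
import java.util.Map;

class TrendAnalyzer {
    public static void main(String[] args) {
        Map<String, List<Integer>> model = Map.of(
            "Walking", List.of(45, 10, 5, 7, 2, 0, 2, 14),
            "Sitting", List.of(120, 60, 30, 21));
        model.forEach((activity, minutes) -> {
            IntSummaryStatistics s =
                minutes.stream().mapToInt(Integer::intValue).summaryStatistics();
            System.out.printf("%s: max=%d min, avg=%.0f min%n",
                              activity, s.getMax(), s.getAverage());
        });
    }
}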

Workflow of Descriptive Analytics / 326

Step 5: the data view model sends the data for trend analysis. Workflow of Descriptive Analytics / 327

Step 6: the trend analyzer classifies the data depending on the metadata, adds facts, and clusters the data for visualization, e.g. maximum time (Walking 32, Running 12, Sitting 240), averages (Walking 16 min, Running 6 min, Sitting 63 min), and calories burned / MET (Walking 523/80, Running 180/60, Sitting 26). Workflow of Descriptive Analytics / 328

Step 7: the data is sent as a descriptive or graphical view, whichever is requested; the graph type is chosen per the graph selection table (pie, line, bar, bubble). Workflow of Descriptive Analytics / 329

[Example output: users below 50: ratio of walking in total active time 70%, ratio of running 30%; users above 50: walking 80%, running 20%] Contributions / 330

• A query library for efficient data retrieval from the big data repository.

• A Model Transformation module to transform the data from HDFS and integrate it semantically.

• A Data Store interface for bringing together unstructured (big data) and structured (intermediate) data.

• A trend analyzer to classify parameters and to associate and cluster them for further insights, with a focus on grouping and association techniques.

Supporting Layer UI/UX Authoring Tool Introduction [1/2] / 332

An adaptive user interface is a user interface (UI) which adapts its layout and elements to individual users based on information acquired about its user(s), the context of use, and its environment.

[Figure: adaptive UI inputs: information about the user, the context of use, and device information] Introduction [2/2] / 333

• In order to achieve satisfactory adaptation, several inputs are necessary: user information, environmental information, and device characteristics

Granularity levels: UI adaptation at different levels / 334

 To guide a user to their specific goal within the system by altering the way the system is navigated, based on certain factors of the user

 To adapt the design of the interface, e.g. font size or color, and the layout of user interface elements

 To make adaptations that deal with the context of the situation that occurred

Motivation / 335

• Adaptive UI: adaptive user interface generation based on user information, and the context of the user & device, at run time
• Context & device model: gathering knowledge of the device and the context of use
• User model: gathering knowledge of user capabilities and limitations based on user experience

UI/UX Authoring Tool – Communication View / 336

[Figure: UI/UX Authoring Tool communication view: the Application Layer collects observational data (Analytics Collector: events, screens, sessions, crashes and exceptions; Feedback Collector; Capabilities Collector) for UX measurement (performance metrics: task success, errors, learnability, time on task, efficiency; issue-based metrics: issues by category, issues by task, frequency of unique issues; self-reported metrics: rating scales, semantic differential scales); the Adaptation Layer's Adaptation Engine (Reasoner, UI & style selector, GUI generator) fires adaptation and navigation rules (e.g. Rule 1: color blind, Rule 2: low vision, Rule 3: low light) to select themes; the Modelling Layer's Capabilities Retriever gathers the user profile (e.g. User ID: 1, Full Name: Jamil Hussain, Age: 31, Vision: Low, Color Blind: False), device characteristics (screen size 360x640 dpi, battery level 50%), and context (light level 1000 lx) from the DCL (lifelog data, user profiles, intermediate database) over RESTful web services; the UX metrics feed a User Satisfaction Model (pragmatic and hedonic quality) whose values drive personalized information evolution] UI/UX Authoring Tool – Overall Concept / 337

UX Measurement  It deals with metrics that reveal something about the user experience- about the personal experience/ interaction of user with product or system like  effectiveness (being able to complete a task),  efficiency (the amount of effort required to complete the task),  satisfaction (the degree to which the user was happy with his or her experience while performing the task)

Analytics Collector  User interaction behavioral data collection  To measure the Pragmatic qualities

Feedback Collector  Responsible for display the prompt feedback form based on implicit collected data then the user feedback data can be send to self-reporting component to in order to find out (1) the hedonic and (2) pragmatic quality

Capabilities collector  Responsible to collect information about user perception, cognitive and motor skill using different game/ techniques

Adaptation Engine  Reasoner  Input data User Satisfaction Model  Rules  It can be measures it by checking to which extent the users achieved the hedonic  Fire rules based on collected data and pragmatic goals. It considered many UX variables such as  GUI Generator render UI based on fired rules  likability (to which extent user achieve the pragmatic goals),  pleasure (to which extent user achieve the hedonic goals), Capabilities Retriever  comfort (to which extent user feel comfort), and  It collect data for adaptation engine such as  trust (to which extent user satisfy with overall system).  User Profile data  Environmental data  Device information UI/UX Authoring Tool – Overall Concept of AUI / 338

[Figure: Overall concept of the AUI: (1) device information {Screen Size, Battery Level}, (2) environmental information {Light Intensity Level}, and (1) profile data {Name, ID, DOB, Vision} reach the Adaptation Engine in the Supporting Layer through RESTful services; an example adaptive rule: If (Age = 32 & Vision = low) Then adaptTextViewSize = 60dp]

UI/UX Authoring Tool – Overall Concept / Modelling Layer – Capabilities Retriever / 339

Capabilities Retriever: enables retrieval of the required data, based on context, for the AUI rules engine component:
• User profile (1a): when the user signs in or signs up, the user profile module sends a request to the DCL, and the DCL sends back the user profile attributes needed for the adaptive UI. Note: the user profile data are fetched on each user login or whenever the user profile changes
• Context characteristics (1b): responsible for collecting environmental information such as light intensity (lx), noise level, timing information, and temperature
• Device characteristics (1c): responsible for identifying several device characteristics, such as screen size and battery level, in order to be aware of the whole domain's limitations


UI/UX Authoring Tool – Overall Concept / Modelling Layer – Capabilities Retriever / 340

Algorithm 1: Capabilities Retriever
1. function retrieve_capabilities(x, y, z)
   Input: three kinds of information, where x is user profile information, y is environmental information, and z is device information
   Output: the required capabilities data are retrieved for the adaptive rule engine
2. if user is not valid then
3.   return(x) ← error
4. else
5.   set(x) ← (u_ID, name, age, impairment & disability, …)
6.   get(light_level) ← lux
7.   get(noise_level) ← dBA
8.   set(y) ← (light_level, noise_level)
9.   get(screen_size) ← dp
10.  return (x, y, z)
11. end

UI/UX Authoring Tool – Overall Concept / Adaptation Layer – Adaptation Engine / 341

Adaptation Engine: performs reasoning based on pre-defined adaptation and navigation rules; it recommends the user interface components and their styles to render the adaptive UI

Adaptation & navigation rules: the adaptation rules are classified according to the target user disabilities, the environmental conditions, and accessibility rules, in order to increase positive user experience (UX); e.g. Rule 1: color blind, Rule 2: low vision, Rule 3: low light, with matching themes (Theme 1: color blind, Theme 2: low vision, Theme 3: low light)

Reasoner (rule base): Easy Rules is a simple yet powerful Java rules engine framework; a rule has
• Name: a unique rule name within a rules namespace
• Description: a brief description of the rule
• Priority: rule priority with regard to other rules
• Conditions: the set of conditions that should be satisfied to apply the rule
• Actions: the set of actions to perform when the conditions are satisfied
Note: the Easy Rules framework will handle conflict resolution and fire the final rules based on priority

UI/UX Authoring Tool – Overall Concept / 342 Adaptation Layer – Adaptation Engine

[Figure: data and UI adaptive rules flow into the Adaptive UI Rules Engine, whose output drives the UI Generator to produce the adapted UI]

 The Adaptive UI Rule Engine uses the user, environmental, and technology information together with the UI adaptive rules to fire the UI rules
 The UI Generator uses the fired UI rules to generate the final adapted UI

UI/UX Authoring Tool – Overall Concept / 343 Adaptation Layer – Adaptation Engine

Easy Rules: a simple yet powerful Java rules engine framework

UI adaptive rules:
• Rules are taken from the literature
• An object-oriented approach is used for rule creation
• Note: in Mining Minds we will create a rule authoring tool, so that a UX expert will be able to create new rules and update or delete the existing rules (see the sketch below)
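A minimal sketch of one UI adaptive rule in Easy Rules, assuming a recent version of the framework (org.jeasy packages); the condition and the 60dp action mirror the sample adaptive rule shown earlier, and UiModel is a placeholder for whatever the GUI generator consumes (classes shown in one file for brevity):

import org.jeasy.rules.annotation.Action;
import org.jeasy.rules.annotation.Condition;
import org.jeasy.rules.annotation.Fact;
import org.jeasy.rules.annotation.Rule;
import org.jeasy.rules.api.Facts;
import org.jeasy.rules.api.Rules;
import org.jeasy.rules.core.DefaultRulesEngine;

// Placeholder UI model the fired rule mutates; the GUI generator would read it afterwards.
class UiModel { int textViewSizeDp = 14; }

@Rule(name = "low vision", description = "Enlarge text for low-vision users", priority = 1)
class LowVisionRule {
    @Condition
    public boolean when(@Fact("vision") String vision) {
        return "low".equalsIgnoreCase(vision);
    }

    @Action
    public void then(@Fact("ui") UiModel ui) {
        ui.textViewSizeDp = 60; // mirrors the sample rule: adaptTextViewSize = 60dp
    }
}

class AdaptationEngineDemo {
    public static void main(String[] args) {
        Facts facts = new Facts();
        facts.put("vision", "low");       // retrieved by the Capabilities Retriever
        facts.put("ui", new UiModel());
        Rules rules = new Rules();
        rules.register(new LowVisionRule());
        new DefaultRulesEngine().fire(rules, facts); // fires by priority; handles conflicts
        UiModel ui = facts.get("ui");
        System.out.println(ui.textViewSizeDp);       // 60
    }
}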

http://www.easyrules.org/index.html

UI/UX Authoring Tool – Overall Concept / Adaptation Layer – Adaptation Engine / 344

Adaptation Engine: performs reasoning based on pre-defined adaptation and navigation rules; it recommends the user interface components and their styles to render the adaptive UI

Algorithm 2: Adaptation Engine
1. function reasoning(x, y)
   Input: two kinds of information, where x is the retrieved information and y is the adaptation rule sets
   Output: fire the final rule so that the GUI generator can render the GUI based on that rule
2. if user is not valid then
3.   return(x) ← error
4. else
5.   set(x) ← (user profile, environmental information, device information)
6.   set(y) ← load all rules from the adaptation and navigation rules in the local DB
7.   foreach input ϵ y do
8.     register all rules to the rule engine
9.   end
10.  fire_rule ← fire rules based on matching
11.  return (fire_rule)
12. end

[DCL: Intermediate Database – life-log data, user profiles]

UI/UX Authoring Tool – Overall Concept / 345 Adaptation Layer – Adaptation Engine

Adaptive User Interface (AUI)
• The Adaptation Engine performs reasoning based on pre-defined adaptation and navigation rules and recommends the user interface components and styles with which to render the adaptive UI.

GUI Generator / UI & style selector
• The GUI generator receives the fired rules and renders the adapted UI accordingly.

Note: The Easy Rules framework handles conflict resolution and fires the final rules based on priority.

[Workflow: Step 1 – the default UI is presented; Step 2 – the user profile (User ID: 1, Full Name: Jamil Hussain, Age: 31 years, Vision: Low, Color Blind: False), device, and context characteristics are retrieved from the DCL (intermediate database: life-log data, user profiles) over a RESTful API and matched against the UI adaptive rules (Rule 1: Color blind, Rule 2: Low vision, Rule 3: Low light, …); Step 3 – the Adaptive UI Rules Engine selects a theme (Theme 1: Color blind, Theme 2: Low vision, Theme 3: Low light, …); Step 4 – the UI Generator renders the adaptive UI. A GUI-generator sketch follows.]
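As a sketch of the Step 3–4 hand-off, the GUI generator on Android could map the selected theme onto a concrete style before inflating the layout. The theme resources, the layout, and the way the reasoner's result is obtained are all illustrative assumptions:

    import android.app.Activity;
    import android.os.Bundle;

    public class AdaptiveActivity extends Activity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            // Hypothetical accessor: the UiConfig the reasoner produced for this user.
            UiConfig uiConfig = AdaptationEngine.lastResultFor("U_001");

            // setTheme must be called before setContentView for the style to apply.
            switch (uiConfig.getTheme()) {
                case "Theme 1: Color blind": setTheme(R.style.ColorBlindTheme); break;
                case "Theme 2: Low vision":  setTheme(R.style.LowVisionTheme);  break;
                case "Theme 3: Low light":   setTheme(R.style.LowLightTheme);   break;
                default: break; // keep the default UI
            }
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main_screen); // hypothetical layout resource
        }
    }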

Example Scenario 1: Task Success Metric / 346 UX Measurement – Performance metrics

Performance metrics: task success, errors, learnability, time on task, efficiency

[Steps 1-3: application events are logged by the Analytics Collector; Step 4: the data is sent to Google's server.]

User ID   Screen         Events         Exceptions
U_001     Home           click
U_001     activity       view graph
U_001     Set goal       submit value   1 error – integers
U_001     Edit profile   …              …

Task success per user (Step 5, reported with a confidence interval; a computation sketch follows):

User     Task 1   Task 2   Task 3   Average
U_001    1        0        0        33 %
U_002    1        1        1        100 %
U_003    0        1        1        80 %
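The confidence interval shown with the task-success averages can be sketched with the normal (Wald) approximation; for samples this small an adjusted method would normally be preferred, so treat this purely as an illustration of the computation:

    import java.util.Arrays;

    // Sketch: task-success rate for U_001 plus a 95 % normal-approximation
    // (Wald) confidence interval; the outcome data mirrors the table above.
    public class TaskSuccessCi {
        public static void main(String[] args) {
            int[] outcomes = {1, 0, 0};                   // U_001: tasks 1-3, 1 = success
            double n = outcomes.length;
            double p = Arrays.stream(outcomes).sum() / n; // 0.33 -> the "33 %" above

            double z = 1.96;                              // z-score for 95 % confidence
            double half = z * Math.sqrt(p * (1 - p) / n);
            System.out.printf("success = %.0f %%, 95 %% CI = [%.0f %%, %.0f %%]%n",
                    100 * p, 100 * Math.max(0, p - half), 100 * Math.min(1, p + half));
        }
    }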

Result: U_001 – Task Success 33 %.

// Set the dispatch period in seconds.
GoogleAnalytics.getInstance(this).setLocalDispatchPeriod(15);

// Send an event to Google Analytics
tracker.send(new HitBuilders.EventBuilder()
        .setCategory("Update weight actions")
        .setAction("Update weight")
        .setLabel(updateweight)
        .build());

// Send an event to Google Analytics
tracker.send(new HitBuilders.EventBuilder()
        .setCategory("Activities graphs")
        .setAction("View activities graphs screen")
        .setLabel(activitiesgraphs)
        .build());

Example Scenario 2: Learnability Metric / 347 UX Measurement – Performance metrics

Performance metrics: task success, errors, learnability, time on task, efficiency

LEARNING CURVES

[Application events are logged as in Scenario 1 and the data is sent to Google's server.]

Time on task per trial (learnability; a sketch follows):

Task     Trial 1   Trial 2   Trial 3
Task 1   55 sec    45 sec    38 sec
Task 2   60 sec    50 sec    40 sec
Task 3   80 sec    100 sec   200 sec
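One simple way to read the learning curves above is the ratio of first-trial to last-trial time per task (a ratio above 1 means the task got faster with practice); the ratio itself is an illustrative choice, not a formula given in the slides:

    // Sketch: summarize the learning curves from the table above.
    public class Learnability {
        public static void main(String[] args) {
            double[][] timeOnTask = {      // seconds per trial, from the table
                    {55, 45, 38},          // Task 1 - times fall: learning
                    {60, 50, 40},          // Task 2 - times fall: learning
                    {80, 100, 200},        // Task 3 - times grow: no learning effect
            };
            for (int t = 0; t < timeOnTask.length; t++) {
                double[] trials = timeOnTask[t];
                double ratio = trials[0] / trials[trials.length - 1];
                System.out.printf("Task %d: trial-1/trial-3 time ratio = %.2f (%s)%n",
                        t + 1, ratio, ratio > 1 ? "improving" : "not improving");
            }
        }
    }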

Result: U_001 – Task 1 good, Task 2 good, Task 3 average.

[Google Analytics dispatch and event-tracking code as in Scenario 1.]

Example Scenario 3: Efficiency Metric / 348 UX Measurement – Performance metrics

Performance metrics: task success, errors, learnability, time on task, efficiency

LOSTNESS

[Application events are logged as in Scenario 1 (users U_001 … U_00n) and the data is sent to Google's server.]

Efficiency per task:

Task     Task Completion Rate   Task Time (min)   Efficiency (%)
Task 1   65 %                   1.5               71
Task 2   8 %                    1.2               48
Task 3   40 %                   2.1               19

Result: Task 1 – very good design; Task 2 – above-average design; Task 3 – bad design.

[Google Analytics dispatch and event-tracking code as in Scenario 1.]

LOSTNESS (sketch below)
L = sqrt[ (N/S − 1)² + (R/N − 1)² ]
N: the number of different screens visited while performing the task
S: the total number of screens visited while performing the task, counting revisits to the same screen
R: the minimum (optimum) number of screens that must be visited to accomplish the task
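A direct implementation of the lostness measure with an illustrative navigation trace (the N, S, R values are examples, not slide data); by common convention a score near 0 means the user was not lost, while values above roughly 0.5 suggest the user was lost:

    // Sketch of the lostness measure L defined above.
    public class Lostness {
        /** n = distinct screens visited, s = total screens visited, r = optimum count. */
        static double lostness(int n, int s, int r) {
            double a = (double) n / s - 1.0;
            double b = (double) r / n - 1.0;
            return Math.sqrt(a * a + b * b);
        }

        public static void main(String[] args) {
            // Example: the user opened 10 screens, 7 of them distinct,
            // while 5 screens would have been enough for the task.
            System.out.printf("L = %.2f%n", lostness(7, 10, 5)); // prints L = 0.41
        }
    }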

Example Scenario 4: All UX Measurement Metrics / 349

All performance metrics are computed together from the collected analytics data and reported with confidence intervals:

Task success:
User     Task 1   Task 2   Task 3   Average
U_001    1        0        1        80 %
U_002    1        1        1        100 %
U_003    0        1        1        80 %

Time on task (sec):
User     Task 1   Task 2   Task 3
U_001    55       20       10
U_002    30       20       10
…        …        …        …

Errors:
User     Task 1   Task 2   Task 3
U_001    1        1        0
U_003    0        1        1
…        …        …        …

Task completion rate per trial (learnability):
Task     Trial 1   Trial 2   Trial 3
Task 1   60 %      70 %      80 %
Task 2   60 %      50 %      40 %
Task 3   80 %      80 %      80 %

Task Completion Rate, Task Time (min), Efficiency (%), and LOSTNESS (L = sqrt[(N/S − 1)² + (R/N − 1)²]) are computed as in Scenario 3.

UX Measurement
• Performance metrics: task success, errors, learnability, time on task, efficiency
• Issue-based metrics: issues by category, issues by task, frequency of unique issues
• Self-reported metrics: perceptions, emotions, rating scales, semantic differential scales
• Behavioral & physiological metrics: stress, cognitive load

User Satisfaction Model: pragmatic quality and hedonic quality; example user feedback: 1. Need a simple UI, 2. Change the task flow.

[Architecture: Application Layer – Analytics Collector (events, screens, sessions, crashes & exceptions, user ID logs), feedback collector, GUI, Observational Dispatcher → Adaptation Layer – Adaptation Engine (UI & Style selector, GUI generator, semantic reasoner, adaptation and navigation rules) → Modelling Layer – Capabilities Collector, Semantic Modeller, personalized information evolution (user profile, device characteristics, context characteristics, adaptive ontology).]

[DCL: Intermediate Database – life-log data, user profiles; pragmatic constructs are updated in the user profile.]

Contributions / 350

• Rules for adaptive UI
• User experience measurement toolkit development
• User experience (UX) can be improved with an adaptive UI
• An adaptive UI can improve accessibility
• An adaptive UI can improve users' performance and satisfaction
• Continuous evolution of the UI as contextual information changes

Thank you