Tangible User Interfaces in the Smart Home Environment

Exploring the User Experience of Instant Smart Lighting System Control

Iris Bataille

Interaction Design
One-year Master, 15 ECTS
Spring semester 2020
Supervisor: Maliheh Ghajargar

Abstract

Smart technologies are becoming ubiquitous and more complex in home environments, which challenges interaction designers to ensure a pleasant user experience for smart homes. Therefore, this thesis explores how different types of tangible user interfaces influence the user experience of smart lighting system control. After conducting research in this context, two different tangible user interfaces were designed: a counter device using tokens and a hand-held device using embodied metaphors. These devices were validated by testing their engagement, ease of task performance, meaningfulness of representation and controls, and richness of interaction and human skills. It was found that the counter device creates the best user experience when performing more complicated tasks in a frequent, long-term use scenario, while the hand-held device creates the most pleasant experience when performing less complicated tasks in an infrequent use scenario.


Table of Contents

1. Introduction
2. Background
   2.1 Theory
   2.2 Canonical Examples
3. Methodology
   3.1 Project Plan
   3.2 Methods
4. Design Process
   4.1 Analysis
   4.2 Ideation
   4.3 Conceptualization
   4.4 Realization
   4.5 Validation
5. Results
   5.1 Tangible User Interfaces for Instant Smart Home Control
   5.2 Results from Validating the Tangible User Interfaces
6. Discussion
   6.1 Tangible User Interfaces
   6.2 User Validation
   6.3 Other Results from the User Validation
   6.4 Future Work
7. Conclusion
8. Acknowledgments
9. References
Appendix
   1. Final Digital Context Prototype Code
   2. User Validation Study Set-up
   3. Informed Consent Form


1. Introduction

Smart homes are domestic architectural spaces in which devices and systems can be controlled automatically by using smart technologies (Allen et al., 2001). These homes are becoming bigger and more complex, for example through the increasing number of smart devices such as smart lamps. This brings an interesting challenge for interaction designers: to make sure that the user still has a pleasant experience while operating smart home systems (Luria, Hoffman & Zuckerman, 2017).

There is a trend toward increasing smart home system autonomy, which can be defined as the ability to automatically and remotely control basic home functions and features (Brennan, McCullagh, Galway & Lightbody, 2015). However, the downside of this is that users can lose the sense of control over the state of their home (Koskela & Väänänen-Vainio-Mattila, 2004). This is why this thesis project is not focused on home automation, but on tangible interfaces that enable users to control their smart homes.

Tangible interfaces can facilitate a richer way of interaction, which increases the understandability and trustability of smart home devices (Angelini, Mugellini, Abou Khaled & Couture, 2018). However, in a comparative study, it was found that the usability of tangible interfaces is still a significant design challenge (Luria et al., 2017). Therefore, the usability and user experience of different kinds of tangible user interfaces in the smart home environment will be further explored in this thesis project.

There are different ways to control smart homes. A distinction can be made between two types of smart home control. The first type is pattern control, which is a way of programming smart home automation (Koskela & Väänänen-Vainio-Mattila, 2004). Several interfaces have been designed with this goal in mind, such as the Smart Home Cards (Tada, Takahashi & Shizuki, 2016). The second type is to control and adapt the smart home instantly. Examples of such interfaces are the SensePod (Khan & Zualkernan, 2018) and Ikea's Shortcut Buttons (Ricker, 2019). Further descriptions of these examples can be found in chapter 2.2. In this project, the focus will be on the latter: instant smart home control.

This thesis thus aims to explore the user experience of tangible interfaces for instantly controlling smart homes. To narrow the research down, the use scenario of controlling a smart lighting system was chosen, focusing on just one sub-area of smart home devices.

This results in the research question: How can different types of tangible user interfaces influence the user experience of instant smart lighting system control? To answer this research question, two different types of tangible user interfaces will be analyzed by testing (1) their engagement, (2) ease of task performance, (3) meaningfulness of representation and controls, and (4) richness of interaction and human skills.

By answering this research question, this thesis aims to contribute to the field of interaction design by generating knowledge about the differences in user experience between different types of tangible user interfaces. This knowledge can be used for the future development of new tangible smart home user interfaces. Besides the goal of contributing to the field of interaction design, this topic was also chosen out of personal interest: my interests in tangible interaction, design for everyday life, user-centered design and IoT are combined in this thesis project.

In this thesis report, the background is explained first by introducing theory about user control and experience in the smart home, and about tangible interaction. In addition, several canonical examples are discussed. After that, the project plan and the methods used during this project are introduced. Then, the process of this thesis is described, followed by the results produced during the process. Lastly, the project process and results are discussed and a conclusion is drawn.


2. Background

In this section, an overview of the background of this thesis is given by discussing theory related to user control in smart homes, the user experience of smart homes and tangible interaction. In addition, several canonical examples related to this thesis are reviewed.

2.1 Theory

2.1.1 User Control in the Smart Home

Smart homes are domestic architectural spaces in which devices and systems can be controlled automatically by using smart technologies (Allen et al., 2001). This is an application of ubiquitous computing, the integration of multiple technical devices into the everyday physical world (Weiser, 1993). These devices are often interconnected, which makes smart homes an application of the Internet of Things (IoT) as well (Xia, Yang, Wang & Vinel, 2012). Smart homes offer a better quality of life by optimizing user comfort at home (Alam, Reaz & Ali, 2012). However, these homes are becoming bigger and more complex because of the increase in smart devices in the home environment. This brings an interesting challenge for interaction designers: to make sure that the user still has a pleasant experience while operating smart home systems (Luria et al., 2017).

There are different ways of controlling smart homes. Firstly, there is a trend toward increasing smart home system autonomy. Since smart homes are becoming bigger and more complex, automation is a suitable solution for not overwhelming the user. However, the downside of this is that users can lose the sense of control over the state of their home (Koskela & Väänänen-Vainio-Mattila, 2004). Therefore, there is another way of smart home control in which the user is given full authority, so that they regain a feeling of control. Another argument for this way of control is that life should be mentally and physically challenging, and that human effort should therefore be required for operating smart homes (Intille, 2002).

These different ways of smart home control can be viewed as two extremes of a bigger spectrum. The interaction-attention continuum (see figure 1) visualizes this spectrum very well (Bakker & Niemantsverdriet, 2016). An example of a way of smart home control that is in between these extremes is calm computing. With calm computing, the focus is on informing the user in the periphery of attention and only demanding focus when necessary (Weiser & Brown, 1997). This way, the user is still in control and the smart home supports the user when needed.

Figure 1. The interaction-attention continuum (Bakker & Niemantsverdriet, 2016).

Lastly, a division can be made into two different kinds of user control for smart homes. One way of control is pattern control, which focuses on controlling functions in advance. The other way is instant control, which focuses on managing functions instantly. It was found that these different ways of control should correspond to different kinds of interfaces (Koskela & Väänänen-Vainio-Mattila, 2004). In this thesis, the focus will be on instant smart home control.

2.1.2 The User Experience of Smart Homes

As mentioned before, there is a challenge for interaction designers to improve the user experience of smart homes (Luria et al., 2017). The user experience can be defined as the individual experience of someone using a system, which is influenced by prior experiences and related to its social and cultural context (Roto, Law, Vermeeren & Hoonhout, 2011). For a good user experience, the focus should not only be on the functionality and usability of a system, but also on autonomy, competence, relatedness and self-actualization (Eggen, van den Hoven & Terken, 2016).

Several studies have been conducted to compare the user experience of different kinds of user interfaces for smart home control. In 2004, researchers found that user interfaces with diverse input methods are most suitable for pattern control, while portable central interfaces fit instant control best (Koskela & Väänänen-Vainio-Mattila, 2004). More recently, a study was conducted to compare social robot, screen and voice interfaces. It was found that the social robot interface could provide high situation awareness, but that its usability is still a big design challenge (Luria et al., 2017). This social robot interface relied on physical human action by sharing objects with the user and can therefore be seen as a tangible interface. More information about this social robot can be found in section 2.2.2.

2.1.3 Tangible Interaction

In 1997, Ishii and Ullmer introduced the term "Tangible Bits": tangible interactive surfaces that allow for grasping and manipulation (Ishii & Ullmer, 1997). This has further developed into the field of Tangible Interaction. Tangible Interaction (TI) can be defined as a subfield of interaction design that focuses on tangibility and full-body interaction and materializes data and computational resources (Hornecker & Buur, 2006). Tangible interaction can be applied to smart homes by introducing tangible user interfaces (TUIs). Tangible user interfaces are interfaces that interlink the digital and physical world (Shaer & Hornecker, 2010). These interfaces facilitate a richer way of interaction, which increases the understandability and trustability of smart home devices (Angelini et al., 2018).

There are two other design areas related to tangible interaction and smart homes that are worth mentioning as well. Firstly, the field of tangible computing, which includes tangible user interfaces and ubiquitous computing, amongst other things. The characteristics of tangible computing are multiple input possibilities, no enforced order of action and the use of affordances to guide the user in their interactions (Dourish, 2001). This is closely related to the field of embodied interaction, which can be defined as interacting through involving people’s physical body with technology, for example through gestures and movements (Dourish, 2001).

Another field that is related to tangible interaction and smart homes is the Internet of Tangible Things (IoTT). In this field, tangible interaction is being applied to the Internet of Things. By applying tangible interaction to the Internet of Things, rich and meaningful interactions that use human skills can be created (Angelini et al., 2018).

It was found that there are several different types of tangible user interfaces. Firstly, a distinction can be made between counter devices and hand-held devices. In addition, four different types of interacting with tangible user interfaces were found.

The first type is a counter device that is operated through tokens. These tangible user interfaces often consist of two parts: tokens and a display that provides the constraints for the tokens. The tokens are physical objects that represent digital information (Ullmer, Ishii & Jacob, 2005). By changing the placement of the tokens within the constraints of the display, the output of the connected smart devices changes as well. The second type is interacting with a counter device through physical manipulation (Manches & O'Malley, 2012). This can be done by, for example, pressing a button or moving a slider to change the output. The third type is interacting with a hand-held device through embodied metaphors. Embodied metaphors are metaphorical extensions of embodied structures, i.e. metaphorical meanings added to movements (Bakker, Antle & Van Den Hoven, 2012). By moving while holding the hand-held device, changes are made to the output of the smart home system. Lastly, the fourth type is interacting through gestures. This interaction relies on moving, holding and touching hand-held devices to make changes (Angelini, Lalanne, Van den Hoven, Abou Khaled & Mugellini, 2015).

In the following chapter, examples of these different types of TUIs that are related to the smart home environment are discussed.

2.2 Canonical Examples

The examples discussed in this chapter have been selected because they show different types of tangible user interfaces and are all inspirational for the thesis project. The first interface is designed for pattern control and the others are designed for instant control. Examples 2.2.3 and 2.2.5 are not related to the smart home environment, but are still included since they are good examples of different types of tangible interaction.

2.2.1 Smart Home Cards

Figure 2. The Smart Home Cards. a) Paper cards, b) drawing parameters, c) putting cards, d) the result (Tada, Takahashi & Shizuki, 2016)

The Smart Home Cards are paper cards (see figure 2, a) that are used to control a smart home programming environment. They are designed by Tada, Takahashi and Shizuki. To change the settings, users can draw on the cards (figure 2, b). Then, the cards are placed on a rule board (figure 2, c). This way, a more expressive end-user tangible programming environment for controlling smart devices is created (Tada et al., 2016).

These cards are a good example of making data physical and of interaction through tokens. One of its strengths is the possibility for users to change the settings by drawing on the cards. This way, the user has a lot of freedom in controlling their smart home. However, these cards are meant for pattern control of smart homes and would therefore not be as useful and meaningful for instant control. Nevertheless, they are still inspirational for designing an interface that enables the user to adapt aspects of the interface to their needs.


2.2.2 Vyo

Figure 3. Vyo, the embodied social robot interface for smart-home control (Luria et al., 2017)

This social robot interface was part of a comparative study between different user interfaces for smart home control (see 2.1.2). The interface, designed by Luria, Hoffman and Zuckerman, combines tangible user interfaces with expressive robotics. Users can control their smart home by placing icons that represent the connected smart devices onto the base of the interface and moving them, using them as physical sliders. The output of this interface is the state of each connected smart device, expressed in physical gestures and icons that appear on a screen (Luria et al., 2017).

This social robot interface demonstrates the type of tangible user interfaces that uses tokens to enable users to give input to the device. It would be interesting to further explore the possibilities for using tokens and the experience that is created. The use of expressive robotics is not relevant for this thesis project, but still an interesting addition.

2.2.3 Marble Answering Machine

Figure 4. The Marble Answering Machine. Left: new messages have arrived. Middle: The user plays back a message. Right: The user stores a message. (Shaer & Hornecker, 2010)

The Marble Answering Machine was designed by Durrell Bishop in 1992 (Abrams, 1999). This device is an iconic example of a tangible user interface. It uses marbles (tokens) that represent messages. By placing a marble on the play-back area (figure 4, middle), the user can listen to the message; by placing it on a dish (figure 4, right), the user can store it. To delete the message, the marble can be put into the device again.

This interface is not related to smart homes, but is still relevant to this thesis project, as it is a well-known example of using tokens in a tangible user interface. This is an interesting type of tangible interaction that could be further explored.


2.2.4 Shortcut Buttons

Figure 5. The Shortcut Buttons with different icons (Ricker, 2019)

At the end of 2019, Ikea introduced their new Shortcut Buttons for smart home control. These buttons can be attached to the wall and have a changeable cover. By pressing the button, the user can change their smart devices to a pre-defined setting. These settings are connected to an event, such as leaving home (Ricker, 2019).

The Shortcut Buttons are an example of interacting through physical manipulation. In addition, they are a good example of tangible user interfaces that are not only used for research, but are actually designed to be sold to customers. The advantages of these buttons over controlling the smart home via an app are accessibility and situatedness. The buttons are easily accessible to guests and to children who do not have access to a smartphone. Additionally, the buttons can be placed at convenient places that correspond to the related event; for example, a 'dinner time' button could be placed in the kitchen. This makes their situatedness an advantage as well.

2.2.5 Moving Sounds

Figure 6. Three out of eight artifacts designed for Moving Sounds. These three artifacts embody manipulating the tempo of sound (Bakker, Antle & Van Den Hoven, 2012)

Moving Sounds is a tangible system for teaching children about abstract sound concepts, which is designed by Bakker, Antle and Van Den Hoven. Different artifacts were designed to embody different abstract sound concepts (see figure 6). This was done by using embodied metaphors in interactive objects. The Moving Sounds have been used to develop a design approach to designing tangible systems with embodied metaphor-based mappings (Bakker et al., 2012).

Even though the design context of Moving Sounds is completely different from that of this thesis project, it is still relevant here. It demonstrates a different type of interacting with tangible user interfaces, namely through embodied metaphors. It would be interesting to see this type of interaction applied to smart homes as well.


2.2.6 SensePod

Figure 7. A prototype of the SensePod (Khan & Zualkernan, 2018)

The SensePod is a hand-held wireless device, designed by Khan and Zualkernan. This device can be used to control a smart home using gestures such as rubbing, tapping or rolling the device on a surface. It has been designed to complement voice or smartphone-based interfaces (Khan & Zualkernan, 2018).

The SensePod is a good example of interacting with a hand-held device through gestures. The use of gestures creates an interesting new smart home user experience. However, the designers' focus was on the functionality of the device. Therefore, it would be interesting to look further into the user experience of a gesture-based smart home interface.

To conclude, several insights have been gained by analyzing these examples. Firstly, the examples showed different ways of making data physical, which are inspirational for the designs created during this thesis project. Secondly, several interesting examples of types of tangible user interfaces, such as interfaces using tokens and embodied metaphors, were found. These examples are very useful for defining the different types of tangible user interfaces.


3. Methodology

3.1 Project Plan

[Gantt chart: the phases Analysis, Ideation, Conceptualization, Realization and Validation plotted over weeks 1-8 of the project]

Figure 8. An overview of the different phases of this thesis, placed on a timeline from the start to the end of the thesis project (10 weeks)

This thesis project has been divided into five different phases (see figure 8). Each phase will be discussed below.

3.1.1 Analysis

During this phase, the focus is on analyzing the research domain by doing literature research and analyzing related work. By doing this, the research question will be refined and different types of tangible user interfaces will be defined.

3.1.2 Ideation

The ideation phase starts with defining a design scenario and the different types of tangible user interfaces that will be used for this thesis project. After that, design ideas are developed using the rapid ideation method. The goal of this phase is to come up with a variety of proposals for each type of tangible user interface.

3.1.3 Conceptualization

In this phase, one design idea per type of tangible user interface will be selected and further developed by creating customer journeys. In addition, low-fidelity prototypes will be created.

3.1.4 Realization

During the realization phase, experience prototypes will be produced of the design concepts. The focus will not be on making the prototypes completely functional, but on demonstrating the experience of the concepts using the Wizard of Oz method.

3.1.5 Validation

In this phase, the different types of tangible interfaces will be validated by conducting user tests. For these user tests, several design requirements will be defined using the literature from the analysis phase. The goal of these tests is to compare the different user interfaces and to draw conclusions about the user experience of these different interfaces.


3.2 Methods

3.2.1 Literature Research

The literature research for this project consisted of several different phases. Firstly, possibly relevant sources were searched for on several platforms and databases using specific keywords.

After having found the sources, each source was numbered, to keep a clear overview, and then read. Extra sources were found by looking at the references in the papers that were selected. Notes were made for each source, which were later used to divide the sources into different categories. Lastly, schematics were made of different fields and the sources belonging to these fields.

This method was chosen to get a better understanding of the different subfields of interaction design that are relevant for this thesis project. In addition, the knowledge gained from the literature research was used for defining a realistic sketch of use (see 3.2.2) and the different types of tangible user interfaces that could be designed (see 4.2.2).

3.2.2 Scenario-Based Design

The term Scenario-Based Design covers several techniques that describe the use scenario of a future system early in the design process (Rosson & Carroll, 2009). The focus of this design practice is on the use of a system to accomplish tasks. Such a scenario of user interaction is called a sketch of use (Rosson & Carroll, 2009).

This method was used during the ideation phase to make the design context more specific. By doing this, it was ensured that both devices could be used in the same scenario, with no differences in the details between the concept directions. This way, the different concepts are easily comparable without the influence of external factors.

3.2.3 Rapid Ideation

The goal of rapid ideation is to generate, evaluate and refine a wide range of designs in a short time period (Clark & Reinertsen, 1998). It can be defined as an active, high-speed idea generation session, during which notes, drawings and photographs are made.

This ideation method was chosen to make sure that good-quality ideas were generated within the tight time constraints of this thesis project.

3.2.4 User Journey Mapping

A user journey map, also known as a customer journey map, visualizes the user's journey of performing a certain task (Marquez, Downey & Clement, 2015). It is used to get a better understanding of the steps required to perform this task. The visualization can be done in several ways, one of which is making a storyboard.

This method was used during the conceptualization phase to further develop the selected tangible user interfaces. It was decided to make a customer journey storyboard of a certain use scenario, to further define the concepts.

3.2.5 Experience Prototyping

An experience prototype is a prototype that allows for experiencing existing or future conditions first-hand through interacting with it (Buchenau & Fulton Suri, 2000). This means that the prototype does not have to be completely functional, as long as it can still convey the experience that would be created by using a design concept. An experience prototype can be used for understanding existing experiences, exploring design ideas and communicating design concepts (Buchenau & Fulton Suri, 2000).

This method was used to develop prototypes that can communicate the design concepts for the two different tangible user interfaces to potential users during the validation phase. These prototypes are not fully functional since they aim to envision and explore the user experience.

3.2.6 Wizard of Oz

The Wizard of Oz method is a way of creating the experience of using a design concept without the prototype being completely functional (Dow, MacIntyre, Lee, Oezbek, Bolter & Gandy, 2005). In studies that use this method, which is inspired by the movie of the same name, a wizard operator plays a role in the system that would otherwise be performed by the system itself.

This method was used during the user validation to create the user experience of the different tangible user interfaces without making the prototypes fully functional. I took the role of wizard operator by operating the digital context prototype: instead of using technology to change the light settings when the user performed certain actions with the tangible user interfaces, I changed them manually. This way, I was able to test the devices without having to make them fully functional.

3.2.7 Design Requirements

Design requirements state the important characteristics that a design has to meet to be successful (Van Boeijen, Daalhuizen, van der Schoor & Zijlstra, 2014). Several resources found during the analysis phase discuss various design requirements related to the user experience (Luria et al., 2017; Koskela & Väänänen-Vainio-Mattila, 2004; Angelini et al., 2018).

The design requirements were used during the validation phase to find categories for comparing the two tangible user interfaces. Relevant design requirements were selected from the resources mentioned above and used to formulate interview questions. This way, well-founded questions were formulated that could be used for answering the research question.

3.2.8 Semi-structured Interviewing

A semi-structured interview is a conversation during which the interviewer asks questions, but the interviewee is also able to discuss topics and issues that they think are important (Longhurst, 2003). So even though the interviewer has prepared interview questions, the interviewee still has the opportunity to deviate from these pre-determined questions.

This method was chosen for the user validation to get a better understanding of the experience of the participants. Even though I prepared specific questions related to the design requirements (see 3.2.7), I also wanted to know about the general experience of the participants. Therefore, I chose to use the semi-structured interviewing method instead of a fully structured interview.

3.2.9 Affinity Diagram

An affinity diagram is a structured way of presenting information in groups based on their natural relationships. It is often used to analyze and organize data and ideas (Naylor, 2019).

This method was used during the validation phase to process the data gathered during the user validation sessions. By using this method, the qualitative data were organized in a clear and structured way.


4. Design Process

[Graph: a line rising and falling across the phases Analysis, Ideation, Conceptualization, Realization and Validation, with the peaks and valleys numbered 1-12 and the corresponding chapters (4.1-4.5.6) marked underneath]

Figure 9. An overview of the design process of this thesis project. For each phase, the corresponding chapters in this thesis report can be found underneath the graph.

The design process of this thesis project has been divided into five phases (see 3.1). Each of these phases is again divided into several sub-phases. A graph was made to show the process (see figure 9). An ascending line shows the process of broadening the project, while a descending line shows the process of narrowing it down. Each peak and valley is explained below:

1. Start of the project. I knew that I wanted to do something with tangible interaction in the smart home environment, but that was all I knew.
2. End of the analysis phase. Information was gathered about several ways of smart home control and several types of tangible interfaces.
3. Mid-ideation phase. I defined the sketch of use and the two tangible user interfaces that I was going to design for.
4. End of the ideation phase. Several ideas for the two tangible user interface concepts were developed.
5. Mid-conceptualization phase. One idea per concept was selected and further developed by making customer journey storyboards and low-fidelity prototypes.
6. Second iteration for the hand-held device. I ideated again about implementing a signifier for changing the color temperature into the hand-held device.
7. End of the conceptualization phase. For each tangible user interface type, a well-documented design concept was defined.
8. Mid-realization phase. A tangible experience prototype for each design concept was made. After this, the development of the digital context prototype started.
9. Mid-realization of the digital context prototype. Two extra iterations were conducted to develop an understandable way of showing the change of the color temperature of each lamp.
10. End of the realization phase. A final digital context prototype was made.
11. Mid-validation phase. New insights were gained by testing the prototypes with several possible users.
12. End of the validation phase. I formulated several insights gained and conclusions drawn from testing the prototypes.


4.1 Analysis

Figure 10. The schematic of the literature read during the analysis phase. Each circled number refers to a source that was read.

This thesis project started with exploring the context by doing literature research (see 3.2.1). First, a collection of papers was gathered using the keywords smart home, tangible interaction, interaction design, user experience design, Internet of Things, design for the everyday life and combinations of these words. The platforms and databases used for finding these sources are the ACM digital library, Google Scholar and the Malmö University library.

After that, these papers were read while taking notes. To avoid becoming overwhelmed by all the different kinds of literature, I divided the literature into different categories: smart home theory, user experience theory, tangible interaction theory and canonical examples (see figure 10). Since some theory sources belonged to multiple categories, a Venn diagram was made.

Figure 11. The schematic of the literature about tangible interaction and the subfields I encountered.

To get a better understanding of the subfields of tangible interaction that I encountered, another Venn diagram was made (see figure 11). All overviews were then used to write a coherent text about the background theory and canonical examples using the relevant literature. By going through this phase, I got a better understanding of the context that I am designing for, which helped me define a realistic sketch of use. In addition, I was able to define different types of tangible user interfaces by looking at the canonical examples.


4.2 Ideation

4.2.1 Sketch of Use

To get a better understanding of the context I was designing for, I made a sketch of use (see 3.2.2):

The context that I am designing for is a smart home, which has one room with two smart light bulbs, such as Philips Hue (Philips, 2020) or Ikea Trådfri (Ikea, 2020). There are no other smart devices in this room. For each lamp, the user can change:
- the brightness of the lamp
- the color temperature of the light
Both variables can be changed over a range and are therefore not binary (either on or off). These changes are also made instantly, which means that the user immediately controls and adapts the system.

The smart home will be operated by just one user, who has not had a smart home before and is therefore operating one for the first time. However, the user has prior experience with using technology and is therefore not technology illiterate.

The tools that are provided to the user to operate their smart home are the different tangible user interfaces that I will design. These artifacts will be used one at a time.

By doing this, I felt more confident in starting to ideate about different types of tangible user interfaces. It helped me to state exactly which devices and which settings should be controlled with the tangible user interfaces.

4.2.2 Types of Tangible User Interfaces

The next thing I did was to define the different types of tangible user interfaces that I could design for. First, I defined the type of TUI for each of the canonical examples explored during the analysis phase:

⋅ Smart Home Cards – counter device, interaction through tokens
⋅ SensePod – hand-held device, interaction through gestures
⋅ Vyo – counter device, interaction through tokens
⋅ Marble Answering Machine – counter device, interaction through tokens
⋅ Moving Sounds – hand-held device, interaction through embodied metaphors
⋅ Shortcut Buttons – counter device, interaction through physical manipulation (pushing)

From this, I noticed a clear difference between hand-held and counter devices. This led me to further investigate these two different types:


Counter device:
⋅ Interaction through tokens – interacting with a device by using movable physical parts that embody certain functions or parts of the smart home.
⋅ Interaction through physical manipulation (pushing) – interacting with a device by physically manipulating, in this case pushing, the interface.

Hand-held device:
⋅ Interaction through gestures – interacting with a device by gesturing over or on the surface of the device.
⋅ Interaction through embodied metaphors – interacting with a device by moving the body while holding the device.

So each type of TUI can be divided into two different types of TI.

The aim of this thesis project is to explore the user experience of different types of tangible user interfaces. To be able to make a clear comparison, and to keep this thesis feasible considering the time constraints and the scope of the project, two specific types of TUIs were chosen to be further developed. These two were chosen because of their interesting aspects and because of how different they are from each other.

The first TUI is the counter device using tokens. This TUI uses the concept of situatedness: the interaction possibilities with this device are limited to a specific area. In addition, the user mostly uses their hands to operate this device. Tokens were chosen as the type of interaction with the counter device, since they allow for rich interactions and can be applied in different ways.

The second TUI is the hand-held device using embodied metaphors. The use of this TUI is more expressive and enables the user to use their whole body. In addition, the device is not bound to a specific place. Embodied metaphors were chosen as the type of interaction with the hand-held device, since they give meaning to movements and therefore aim to help the user understand the device.

To summarize, these are the two types of TUIs that I am going to design for:

Counter Device – interaction through tokens: interacting with a device by using movable physical parts that embody certain functions or parts of the smart home. Examples: Marble Answering Machine, Vyo.

Hand-Held Device – interaction through embodied metaphors: interacting with a device by moving the body while holding the device, adding metaphorical meanings to movements. Example: Moving Sounds.


4.2.3 Rapid Ideation

After having defined the sketch of use and the two types of TUIs that I was designing for, it was time to start ideating. I did a rapid ideation session (see 3.2.3) for each TUI type.

Counter Device using Tokens

For the first type of tangible interface, I first defined the different things that the tokens can embody. Three categories were defined: a connected smart device, a setting, or a pre-set for a specific situation. After that, I explored the possibilities for different interfaces with different kinds of tokens by making sketches (see figure 12).

Figure 12. A few of the sketches made while ideating about the counter device using tokens. Left: the device that uses tokens that embody a connected smart device. Middle: the device that uses tokens that embody a setting. Right: the device that uses tokens that embody a pre-set for a specific situation.

Hand-Held Device using Embodied Metaphors

For this TUI, I first ideated about the possible metaphors that could be used to embody the brightness and color temperature settings (see figure 13). I chose the metaphor small vs big to embody the brightness of the light, since this can be associated with the lighted area around the lamp (small if the brightness is low and big if the brightness is high). The metaphor low vs high was chosen for the color temperature, since this can be associated with the number of degrees of the temperature (a low number means a cold temperature and a high number means a warm temperature).

Figure 13. Ideating about the possible embodied metaphors for the brightness and color temperature settings.

After that, some new ideas were developed for implementing these embodied metaphors in a hand-held device (see figure 14).


Figure 14. Sketches of ideas for implementing the embodied metaphors in a hand-held device.


4.3 Conceptualization

4.3.1 Idea Selection

For the counter device using tokens, the idea was selected in which the tokens embody a connected smart device (see figure 15). It was found that this concept makes the best use of the spatiality of tokens and best facilitates connecting the device to multiple smart devices. In this concept, the base is a display that is placed at an angle. By moving the token of one of the lamps vertically (and therefore higher or lower), the brightness is changed. The color temperature can be changed by moving the token horizontally.

This device differs from a screen-based solution in the ways the tokens can be moved. While graphical user interfaces only allow for 2D interaction, this device enables users not only to move the tokens horizontally and vertically, but also to lift them upwards or downwards and place them on top of each other.

Figure 15. A sketch of the selected idea for the counter device using tokens.

For the hand-held device using embodied metaphors, a harmonica-like structure was chosen to embody the brightness of the lamps (see figure 16). The color temperature can be changed by moving the device vertically. To select which lamp is being controlled, a twistable ring was chosen, since this best facilitates connecting the device to multiple smart devices and offers a more interesting way of interacting than pushing a button. To confirm the settings, the handles should be squeezed. This was chosen since it is another way of interacting without adding tangible parts.

Figure 16. A sketch of the selected idea for the hand-held device using embodied metaphors.

These two types of TUIs are interesting to compare, since the interactions with them are very different. The counter device is situated in one specific place, while the hand-held device can easily be moved somewhere else. In addition, the degree of bodily involvement differs between the two interfaces. The interaction with the counter device is largely limited to using a hand to reposition tokens, while the hand-held device relies more on bodily movement, such as moving your arms up and down.


Regarding the artifact behavior of the devices, both function as a means for giving input to the smart lighting system, since both were designed for instant control. The output is the change of the light settings. For the input, it was decided that adjustment of the settings is a linear process. This means that the settings can be adjusted on a scale from 0% to 100% with equal interim steps. Changing this linear process to a more irregular one would over-complicate the use of the TUIs and therefore worsen the user experience.
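To illustrate this linear model, the state of one lamp could be captured as follows (a minimal sketch in Processing; the name LampState and the step size of 10% are illustrative choices of mine, not part of the actual prototypes):

// A sketch of the linear setting model: both settings run from 0% to 100%
// in equal interim steps. LampState and STEP are illustrative names only.
class LampState {
  int brightness = 0;  // 0-100%
  int colorTemp = 50;  // 0-100%, cold to warm
  final int STEP = 10; // equal interim steps

  // direction is +1 (one step up) or -1 (one step down)
  void adjustBrightness(int direction) {
    brightness = constrain(brightness + direction * STEP, 0, 100);
  }

  void adjustColorTemp(int direction) {
    colorTemp = constrain(colorTemp + direction * STEP, 0, 100);
  }
}

void setup() {
  LampState topLamp = new LampState();
  topLamp.adjustBrightness(1); // one step up: 0% -> 10%
  println(topLamp.brightness);
}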

4.3.2 User Journey Storyboard

To get a better understanding of what a use scenario of the TUIs would look like, and to be able to easily explain both concepts, a customer journey map (see 3.2.4) was made for each TUI (see figures 17 and 18).

Figure 17. The storyboard of the customer journey for the counter device using tokens.


Figure 18. The storyboard of the customer journey for the hand-held device using embodied metaphors.

4.3.3 Low-fidelity Prototyping

For each of the TUIs, a low-fidelity prototype was created to see whether any adjustments should be made to the concepts before making the prototypes that would be used during the validation phase. These prototypes can be seen in figures 19 and 20.

Figure 19. The low-fidelity prototype of the counter device using tokens.

Figure 20. The low-fidelity prototype of the hand-held device using embodied metaphors.


By doing this, the need for several adjustments was found. Firstly, since the display of the counter device was placed at an angle, I noticed that the tokens would slide down after being placed on the display. Therefore, a solution had to be found to make sure that the tokens would stay in place. This could be done by implementing magnets in the tokens and using metal for the display, but this would make the device appear very cold and mechanical. Since it was preferred to give the device a warmer and more inviting look, it was chosen to use MDF as the material for the prototypes used during the validation phase. Therefore, another solution had to be found for the sliding tokens, which resulted in attaching double-sided tape to the bottom of the tokens.

In addition, it was found that the interaction of squeezing the handles of the hand-held device did not feel smooth. The handles were made of a very stiff material and would therefore not react to the squeezing movement. It was decided to add a thin layer of foam to the handles, which changes shape slightly when being squeezed.

4.3.4 Second Iteration of the Hand-Held Device

After making the low-fidelity prototype of the hand-held device, it was found that the device did not indicate well enough how it should be used. The counter device had very clear signifiers, but the signifier for the color temperature in the hand-held device was missing (see figure 21). Signifiers are signs on a device that communicate where a certain action should take place (Norman, 2013).

Figure 21. Analyzing the signifiers in both devices.

After finding out that a signifier for the color temperature was missing, I ideated about possible signifiers. The signifier in the counter device makes use of a color spectrum from blue to red, which is associated with cold to warm. This signifier is also used on, for example, showers, water faucets and thermostats. It was decided to apply this color spectrum to the hand-held device as its color temperature signifier as well. Several possibilities for this were found (see figure 22).

Figure 22. Ideating about the color temperature signifier for the hand-held device.


It was chosen to apply the color spectrum vertically to the harmonica structure. This corresponds best to the movement that should be made with the device, which is also vertical. By making a low-fidelity prototype and acting out the movements with this prototype (see figure 23), it was predicted that this would be a good addition to improve the usability of the device. To make sure that the user would use the device with the colors in the right direction, the colors were placed so that the red side would be on the same side as the mark to change which lamp was adjusted (see figure 24). This way, when the user looks at the icon of the lamp that is being adjusted, they automatically hold the device with the red side up.

Figure 23. Testing out the color temperature signifier for the hand-held device.

Figure 24. A sketch to explain the placement of the colors on the hand-held device.


4.4 Realization

4.4.1 Tangible Prototypes

After making the last adjustments to the tangible user interface concepts, improved prototypes were made. These can be defined as experience prototypes (see 3.2.5) and were made using MDF for the main parts and paper and foam for the detailing. First, all the MDF parts needed were drawn in an Adobe Illustrator file (see figure 25). These parts were then cut out using a laser cutter.

Figure 25. The drawings made in Adobe Illustrator used for laser cutting the MDF parts.

After that, the paper parts were designed and printed (see figure 26). Lastly, everything was assembled. This resulted in the final prototypes, which can be seen in figures 27 and 28.

Figure 26. The paper parts for the prototypes.

Figure 27. The final counter device prototype.

Figure 28. The final hand-held device prototype.

4.4.2 Digital Context Prototype for Validation

In the ideal situation for the user validation, I would have built a test living room with two smart lamps and invited participants to come over to test the tangible user interface prototypes in a real-life situation. However, because of the pandemic and a lack of access to smart lighting systems, I was unable to test the tangible prototypes with users in a room equipped with smart lights. Therefore, I had to think of a way to create the use context online.

The solution was to make a digital context prototype in Processing, an open-source graphical library and development environment based on the Java programming language (Processing, 2020). The program shows an image of a living room with two lamps. For each lamp, the brightness and color temperature can be controlled by pressing keys on the keyboard of the laptop. To show the brightness of a lamp, a white circle is drawn around each lamp. The opacity of this circle changes, which represents the brightness changing (see figure 29).

Figure 29. Changing the brightness of the top lamp in the first digital context prototype. The background picture is retrieved from Unsplash, a website that provides free stock photos (Pal, 2019).
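The core of this first iteration can be sketched as follows (a simplified illustration rather than the actual code; the image file name, lamp position and key bindings are placeholders of mine):

// Simplified sketch of the first digital context prototype: a white glow
// whose opacity encodes the brightness (0-100%) of one lamp.
// "livingroom.jpg", the lamp position and the keys are placeholders.
PImage room;
int brightness = 0;

void setup() {
  size(800, 600);
  room = loadImage("livingroom.jpg"); // background photo (Pal, 2019)
}

void draw() {
  image(room, 0, 0, width, height);
  noStroke();
  fill(255, map(brightness, 0, 100, 0, 255)); // opacity follows brightness
  ellipse(250, 180, 200, 200);                // glow around the top lamp
}

void keyPressed() {
  if (key == 'q') brightness = constrain(brightness + 10, 0, 100);
  if (key == 'a') brightness = constrain(brightness - 10, 0, 100);
}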

The color temperature was harder to show, since this setting is quite subtle and should be experienced in real life. It was chosen to draw a little ellipse over each lamp to show the color temperature, since this would be a subtle adjustment (see figure 30).

Figure 30. Changing the color temperature of the bottom lamp in the second digital context prototype.


However, I was not satisfied with this way of showing the color temperature of the lamps. It was not very realistic and could also be understood as changing the color of the light. Therefore, a second iteration was conducted, in which the color temperature was shown by changing the color temperature of the area around the lamps (see figure 31). In addition, the background image was edited so that the lamps were further away from each other. However, since this prototype used a lot of images, the program quickly ran out of memory and stopped working. Thus, a third iteration of the prototype was conducted.

Figure 31. Changing the color temperature of both lamps in the second digital context prototype.

In the third iteration of the digital context prototype, the color temperature was indicated by showing a colored ring around the lamps (see figure 32). This was a suitable solution, since it still showed that the surroundings change without using too much memory. The code for this final digital context prototype can be found in appendix 1. A sketch was made to show how the prototype should be operated, which can be seen in figure 33.

Figure 32. Changing the brightness and color temperature of both lamps in the final digital context prototype: 1) both lamps turned off, 2) brightness of the top lamp increased, 3) color temperature of the top lamp increased, 4) brightness of the bottom lamp increased, 5) color temperature of the bottom lamp decreased, 6) color temperature decreased even more.

Figure 33. A sketch to show how the final digital context prototype should be operated using the keyboard of my laptop.
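To illustrate how the final prototype works, a simplified, self-contained version of the sketch is shown below (the complete code can be found in appendix 1; the key bindings and lamp coordinates here are placeholders, not the actual ones on my stickers):

// Simplified version of the final digital context prototype (complete code
// in appendix 1). Key bindings and lamp coordinates are placeholders.
PImage room;
int[] brightness = {0, 0};   // 0-100% per lamp
int[] colorTemp  = {50, 50}; // 0-100% per lamp, cold to warm
int[][] lampPos = {{250, 160}, {550, 420}};

void setup() {
  size(800, 600);
  room = loadImage("livingroom.jpg"); // background photo (Pal, 2019)
}

void draw() {
  image(room, 0, 0, width, height);
  for (int i = 0; i < 2; i++) {
    // color temperature: a ring blending from cold blue to warm orange
    noFill();
    strokeWeight(8);
    stroke(lerpColor(color(170, 200, 255), color(255, 170, 60),
                     norm(colorTemp[i], 0, 100)));
    ellipse(lampPos[i][0], lampPos[i][1], 120, 120);
    // brightness: a white glow whose opacity encodes the setting
    noStroke();
    fill(255, map(brightness[i], 0, 100, 0, 255));
    ellipse(lampPos[i][0], lampPos[i][1], 100, 100);
  }
}

void keyPressed() {
  // Wizard of Oz controls: one key pair per setting per lamp
  if (key == 'q') adjust(brightness, 0, 1);
  if (key == 'a') adjust(brightness, 0, -1);
  if (key == 'w') adjust(colorTemp, 0, 1);
  if (key == 's') adjust(colorTemp, 0, -1);
  if (key == 'e') adjust(brightness, 1, 1);
  if (key == 'd') adjust(brightness, 1, -1);
  if (key == 'r') adjust(colorTemp, 1, 1);
  if (key == 'f') adjust(colorTemp, 1, -1);
}

void adjust(int[] setting, int lamp, int direction) {
  setting[lamp] = constrain(setting[lamp] + direction * 10, 0, 100);
}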


4.5 Validation

4.5.1 Design Requirements

The literature read during the analysis phase described several design requirements and goals. For each paper, an overview of the requirements is given below, and the requirements relevant for validating the tangible user interfaces I designed were selected:

1. Comparing Social Robot, Screen and Voice Interfaces for Smart-Home Control (Luria et al., 2017)

For their social robot interface design, Luria, Hoffman and Zuckerman defined five design goals:

⋅ Engaging – evoke engagement and bring back the "excitement of interaction"
⋅ Unobtrusive – do not disturb the user and stay in the periphery of attention
⋅ Device-like – resemble a device and not a human or pet
⋅ Respectful – be polite and aware of social situations
⋅ Reassuring – express reliability and reassure the user during use

The last four design goals are focused on designing a social robot and therefore not relevant for this thesis project, but the engaging goal is an interesting requirement to use for the user validation. Therefore, the following question was formulated based on the design requirement:

Which device was most exciting/enjoyable to use? Why?

2. Evolution towards smart home environments: empirical evaluation of three user interfaces (Koskela & Väänänen-Vainio-Mattila, 2004)

In their paper, Koskela and Väänänen-Vainio-Mattila discuss the differences between pattern control and instant control. For each, they formulated several UI requirements. Since the focus of this thesis is on instant control, the requirements for instant control are stated here:

⋅ Simple task performance – only a few action steps to get something done
⋅ Centralized control device – one centralized means to control all the different devices

The second design requirement is already fulfilled in both concepts designed for this thesis. However, the simple task performance requirement is interesting to test. Therefore, the following question was formulated:

Do you feel like it was easy to perform a task by using the device? Why?

3. Internet of Tangible Things (IoTT): Challenges and Opportunities for Tangible Interaction with IoT (Angelini et al., 2018)

As one of the results of their systematic IoTT review, Angelini, Mugellini, Abou Khaled and Couture have formulated eight tangible properties to reflect on:


⋅ Meaningful representations and controls – the function of the object can be understood from its form
⋅ Rich interactions and human skills – natural human skills and senses are exploited through rich interactions
⋅ Persistency – the ability to control the system during a power or connectivity outage
⋅ Spatial interaction and collaboration – support collaborative setups with multiple IoT objects
⋅ Immediacy and intuitiveness – users need minimal learning time to understand and control the device
⋅ Peripheral interaction – interactions that are integrated in daily routines and do not disrupt attention
⋅ Reflection and memories – support for reflection and for associating and sharing memories
⋅ Long-lasting interactions and emotional bonding – durable designs that avoid electronic waste due to the technology becoming outdated

From this list, the properties meaningful representations and controls, rich interactions and human skills, and immediacy and intuitiveness can be used as relevant design requirements for this thesis project as well. However, the requirement immediacy and intuitiveness is closely related to the simple task performance requirement gained from the second paper, so the question used for that requirement can also be used here. Therefore, the following extra questions were formulated:

Did you easily understand how to use the device? Why (not)?

How have you used your body while using the devices?

The analysis of the design requirements mentioned in the literature above resulted in the following list of design requirements relevant for this thesis project with the corresponding interview questions:

⋅ Engaging – Which device was most exciting/enjoyable to use? Why?
⋅ Simple task performance / Immediacy and intuitiveness – Which device was easiest to use to perform a task? Why?
⋅ Meaningful representation and controls – Did you easily understand how to use the device? Why (not)?
⋅ Rich interaction and human skills – How have you used your body while using the devices?

4.5.2 User Experience Questionnaire

In order to gain more insight into the user experience of the two devices, I decided to use the User Experience Questionnaire. The original User Experience Questionnaire was developed by Laugwitz, Held and Schrepp to measure the user experience in a simple and immediate way (Laugwitz, Held & Schrepp, 2008). For the user validation, the short version of the User Experience Questionnaire, developed by Schrepp, Hinderks and Thomaschewski, was used (see figure 34) (Schrepp, Hinderks & Thomaschewski, 2017). The short version was chosen to gain the desired insights without taking up too much of the participants' time.

Figure 34. The short version of the User Experience Questionnaire (Schrepp, Hinderks & Thomaschewski, 2017).

4.5.3 User Validation Plan

As mentioned before, the user validation had to be executed online because of the social distancing restrictions. Therefore, the prototypes and consent forms were brought to the participants' homes, after which the user validation was conducted through Zoom. After the interview, the prototypes and the filled-in consent forms were picked up again.

Goal of the User Validation

The goal of this user validation is to gain insights about the use of each TUI separately and about the comparison of both TUIs. The topics of the insights include, but are not limited to, the design requirements and user experience categories mentioned in 4.5.1 and 4.5.2.

Participants

For this user validation, five people who currently do not own a smart home participated. This number of participants was chosen since research has shown that it is sufficient for gaining enough insights (Nielsen, 2000). People who do not own smart homes were chosen since they have no experience with smart home control and therefore also have few preconceptions about it.

Study Set-up

Below, I will briefly explain each phase of the study. The complete study set-up can be found in appendix 2.

1. Introduction
I started by explaining the goal and set-up of this study. I also asked the participants to fill in the consent form.

2. Using the Tangible Prototypes
After that, I introduced the digital context prototype and the tangible prototypes. I explained which functions both devices have (adjusting the brightness and color temperature of the two lamps). However, I did not explain how to adjust these settings, since I wanted to test how easy and intuitive (Norman, 2013) it was to use the devices without instructions. Then, I gave the participants a scenario of coming home and wanting to turn on the lights, which they had to act upon. I also asked them to think out loud, to make it easier for me to operate the digital context prototype as the Wizard of Oz.

3. Interview questions about each device

After using each device, I asked the specific questions about the experience of using that device that were derived from the design requirements (see 4.5.1). Next to that, the questions from the short User Experience Questionnaire (see 4.5.2) were asked.

4. Interview questions to compare the devices

Lastly, after both devices were used, some final questions based on the design requirements (see 4.5.1) were asked to compare both devices.

Informed Consent Form

An informed consent form was made, which can be found in appendix 3.

4.5.4 Execution

The user validation sessions were conducted on May 14 and 15 over Zoom. My roles during these user validation sessions were Wizard of Oz operator, interviewer and observer. The sessions were video-recorded through Zoom for data processing purposes. Next to that, notes were taken on remarkable actions and quotes. To make the role of Wizard of Oz operator easier for myself, I added stickers to the keyboard of my laptop (see figure 35).

Figure 35. The keyboard of my laptop with stickers to make the process of operating the digital context prototype easier.
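To illustrate how such a key-operated digital context prototype can work, a minimal sketch in Processing (the environment used for the prototype) is shown below. The key mappings, lamp rendering and step sizes here are illustrative assumptions, not the final prototype code (see appendix 1):

float brightness1 = 0, brightness2 = 0;        // 0 = off, 1 = full brightness
float temperature1 = 0.5, temperature2 = 0.5;  // 0 = warm white, 1 = cold white

void setup() {
  size(800, 450);
}

void draw() {
  background(40);
  drawLamp(250, brightness1, temperature1);  // lamp 1
  drawLamp(550, brightness2, temperature2);  // lamp 2
}

// A lamp is drawn as a glow whose opacity follows the brightness and whose
// tint shifts from warm (yellowish) to cold (bluish) white with temperature.
void drawLamp(float x, float b, float t) {
  noStroke();
  fill(lerp(255, 210, t), lerp(225, 230, t), lerp(170, 255, t), b * 255);
  ellipse(x, 225, 160, 160);
}

// Wizard of Oz control: each stickered key nudges one setting, so the
// operator can mirror the participant’s actions with single key presses.
void keyPressed() {
  if (key == 'q') brightness1 = constrain(brightness1 + 0.1, 0, 1);
  if (key == 'a') brightness1 = constrain(brightness1 - 0.1, 0, 1);
  if (key == 'w') temperature1 = constrain(temperature1 + 0.1, 0, 1);
  if (key == 's') temperature1 = constrain(temperature1 - 0.1, 0, 1);
  if (key == 'e') brightness2 = constrain(brightness2 + 0.1, 0, 1);
  if (key == 'd') brightness2 = constrain(brightness2 - 0.1, 0, 1);
  if (key == 'r') temperature2 = constrain(temperature2 + 0.1, 0, 1);
  if (key == 'f') temperature2 = constrain(temperature2 - 0.1, 0, 1);
}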

4.5.5 Data Processing

During the user validation sessions, both qualitative and quantitative data were gathered. The qualitative data was processed by making an affinity diagram (see 3.2.9). First, all quotes, observations and other insights were written down on post-it notes. After that, I categorized the post-it notes into the requirement categories mentioned in chapter 4.5.1 (see figure 36). Then, sub-categories were made within each requirement category. These sub-categories were then named, formulating some preliminary insights gained from the user validation sessions. The final affinity diagram can be found in figure 37.

Figure 36. The first step of categorizing the qualitative data gathered during the user validation sessions.

Figure 37. The affinity diagram of the qualitative data gathered during the user validation sessions.

The quantitative data was processed by making overviews in Microsoft Excel. First, two tables were made showing the ratings of the counter device and the hand-held device by each participant, together with the average (see figure 38).

Figure 38. An overview of the ratings of the counter device and the hand-held device by the participants.

After that, another table was made to compare the counter device and the hand-held device (see figure 39).

Figure 39. An overview of the average ratings of the counter device and the hand-held device.

By making these overviews, I was able to draw conclusions from the quantitative data.
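The averaging behind figures 38 and 39 is simple enough to sketch in code. Below is a minimal Processing sketch of the same calculation, using invented placeholder ratings rather than the study’s data (which was processed in Microsoft Excel):

String[] items = { "Supportive", "Easy", "Efficient", "Clear", "Exciting",
                   "Interesting", "Inventive", "Leading edge" };

// One row per participant, one column per item; placeholder 1-5 ratings.
int[][] ratings = {
  { 4, 5, 4, 5, 3, 4, 2, 2 },
  { 5, 5, 4, 4, 4, 3, 3, 2 },
  { 4, 4, 5, 5, 3, 4, 2, 3 },
  { 5, 4, 4, 5, 4, 4, 3, 2 },
  { 4, 5, 5, 4, 3, 3, 2, 2 } };

void setup() {
  for (int i = 0; i < items.length; i++) {
    float sum = 0;
    for (int p = 0; p < ratings.length; p++) {
      sum += ratings[p][i];
    }
    println(items[i] + ": " + sum / ratings.length);  // average over participants
  }
}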

4.5.6 Drawing Conclusions

Lastly, I wrote down all preliminary insights gained while making the affinity diagram and the conclusions that could be drawn from analyzing the quantitative data. It was found that some of the quantitative categories could be connected to the requirements formulated in 4.5.1:

⋅ Engaging – Boring/Exciting & Not interesting/Interesting

⋅ Simple Task Performance – Obstructive/Supportive & Inefficient/Efficient

⋅ Meaningful Representation and Controls – Complicated/Easy & Clear/Confusing

⋅ Rich Interaction and Human Skills – none of the quantitative categories correspond to this requirement

⋅ Innovation – the categories Conventional/Inventive and Usual/Leading edge do not belong to one of the requirements mentioned above, but represent their own new category: innovation.

A summary of the results (see figure 40) was written, which can be found in chapter 5.2. The conclusions that have been drawn from these results can be found in chapter 7.

Figure 40. The overview of results from the qualitative and quantitative analyses. The results have been divided vertically according to the different requirements. Next to that, a division was made between the counter device (left) and the hand-held device (right).

5. Results

This thesis project has produced two different types of results. Firstly, two new tangible user interfaces for instant smart home control were designed. Next to that, several insights were gained from exploring the user experience of these tangible user interfaces by validating them with users. These insights can be used for future development of tangible smart home user interfaces.

5.1 Tangible User Interfaces for Instant Smart Home Control

Two different types of tangible user interfaces for instant control of a smart lighting system were designed. With these devices, the user can adjust the brightness and color temperature of two smart lamps. These devices show new applications of tangible user interfaces to the smart home. They were designed as a tool for answering the research question of this thesis.

The first tangible user interface is the counter device using tokens (see figure 41) (Ullmer et al., 2005). This device allows for interaction by using movable physical parts that embody the smart lamps. The lamp to adjust can be selected by picking up the corresponding token. By moving the token vertically over the display, the brightness is adjusted. The color temperature can be adjusted by moving the token horizontally.

Figure 41. The experience prototype of the counter device using tokens.
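A minimal sketch of this position-to-setting mapping is given below. The surface dimensions and the warm-to-cold direction are assumptions for illustration:

// Token position on the counter surface (hypothetical dimensions in mm).
float surfaceWidth = 300, surfaceHeight = 200;

// Vertical token position controls brightness: higher token = brighter.
float brightnessAt(float y) {            // y measured from the bottom edge
  return constrain(y / surfaceHeight, 0, 1);
}

// Horizontal token position controls color temperature (assumed left = warm).
float temperatureAt(float x) {           // x measured from the left edge
  return constrain(x / surfaceWidth, 0, 1);
}

// Picking up a token selects the corresponding lamp; moving it updates both
// settings at once from its new (x, y) position.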

The second tangible user interface is the hand-held device using embodied metaphors (see figure 42) (Bakker et al., 2012). This device allows for interaction by moving the body while holding the device, adding metaphorical meanings to movements. The lamp to adjust can be selected by turning a ring to the corresponding image of the lamp. By opening up the device through moving the handles away from each other, the brightness is adjusted. The color temperature can be adjusted by moving the device vertically. To confirm the settings, the handles have to be squeezed.

Figure 42. The experience prototype of the hand-held device using embodied metaphors.
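The embodied mapping of the hand-held device can be sketched in the same way. The sensor ranges and the squeeze threshold below are assumptions for illustration:

float maxOpening = 120;                   // mm between the handles when fully opened (assumed)
float minHeight = 600, maxHeight = 1800;  // mm above the floor (assumed)

// Opening up the device controls brightness: wider open = brighter.
float brightnessFromOpening(float opening) {
  return constrain(opening / maxOpening, 0, 1);
}

// The height at which the device is held controls the color temperature.
float temperatureFromHeight(float height) {
  return constrain(map(height, minHeight, maxHeight, 0, 1), 0, 1);
}

// Squeezing the handles past a force threshold confirms the settings.
boolean confirmed(float squeezeForce, float threshold) {
  return squeezeForce > threshold;
}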

5.2 Results from Validating the Tangible User Interfaces

The experience prototypes shown in figures 41 and 42 were used to test the user experience of the two different types of tangible user interfaces (see chapter 4.5). The insights gained from these tests are divided into six sections. First, insights about the engagement are discussed. After that, several results related to the simple task performance are mentioned. Next, insights about meaningful representation and controls are presented. Then, results regarding rich interaction and human skills follow. After that, the innovativeness of both devices is discussed. Lastly, insights regarding the best use scenarios are presented.

5.2.1 Engagement

By evoking engagement with the user, excitement of interaction is created (Luria et al., 2017). Therefore, to create a better user experience, the tangible user interfaces should engage the user. Both interfaces were perceived as quite exciting and interesting, but for different reasons.

Four out of five participants mentioned that the counter device was easy to understand and logical, which made this interface very inviting to use. One participant stated: “I was more excited to use the device with the tokens because I immediately knew what I could do with it.”

The hand-held device was exciting to use because of the need to move the body to change the settings. Next to that, the use of this device was described as playful, since the participants felt like it was a challenge to understand how to use the hand-held device. As one participant mentioned: “It had this playing element. You could grab it and move it around.” However, users sometimes also felt demotivated when it was too hard to understand how the device should be used.

5.2.2 Simple Task Performance

A task is simple to perform when only a few action steps have to be taken to get something done (Koskela & Väänänen-Vainio-Mattila, 2004). This makes the device efficient to use. When it is easy to perform a task with a device, the user will have a better experience. Therefore, the devices were tested on ease of task performance, efficiency and supportiveness.

The counter device was perceived as quite efficient. As two participants mentioned, the use of the device would get easier over time. One participant explained: “You can visualize a pattern of where you would put the token. That way, it is very easy to remember what setting you like. So the next time I would use this device, I would know exactly where to put it.” However, it was also found that users had to constantly switch between looking at the device and looking at the lamps to see what they changed, which is not very efficient. Next to that, the counter device was perceived as quite supportive, since users only needed one hand to operate it.

The hand-held device was perceived as less efficient and supportive. Even though the participants predicted that the use of this device would become easier over time, they also remarked that it would be harder to remember settings that they liked. One participant stated: “It is harder to remember what height for the color temperature I like most. I would have to refer to my body parts, like my nose or shoulders to remember the exact height.” Next to that, the actions that have to be performed with the hand-held device are more time-consuming compared to the counter device. An advantage of the hand-held device is that it can be moved around so that the device and the lamps are in the same field of view. This way, users get immediate feedback on their actions.

5.2.3 Meaningful Representation and Controls

The representation and controls of a device are meaningful when its function can be understood from its form (Angelini et al., 2018). This increases the usability of the device, which is needed to create a good user experience (Shin & Wang, 2015). The representation and controls of the tangible user interfaces were perceived very differently.

The counter device was perceived as quite clear and easy to use. The participants easily understood what to do, either intuitively or by trying out and seeing what happens. One participant commented while using the counter device: “I don’t have any instructions, but this is intuitively the first thing I would do. So I will just do it and see what happens.” So the immediate feedback contributed to a great extent to the user’s understanding of the counter device. However, two things should be changed to improve the understandability of this device. Firstly, it was found that it was unclear how to turn off the lights. This could be solved by adding a token station where the tokens are placed to turn the lights off. Next to that, there is a need for an extra signifier for the brightness. Two participants associated the counter device with a DJ mixing console and therefore thought that the lower the token would be placed, the brighter the light would be. This could be solved by adding, for example, brightness icons to signify that up is brighter and down is less bright.

On the contrary, the hand-held device was perceived as quite complicated and a bit confusing, since it is multidimensional and consists of a lot of different elements. The participants easily understood how to hold the device, but often did not hold it with the red side on top and the blue side on the bottom. So the exact way to hold it was not indicated well enough. Next to that, the ring to switch between lamps was not very noticeable, but when the participants noticed it, they immediately knew how to use it. On the other hand, it was unclear to the participants how to confirm the chosen settings.

When interacting for the first time with the hand-held device, all participants easily understood which action should be performed, namely opening up the device. One participant mentioned: “I did not know what to expect when doing this, but my first thought was to open the device. When I saw the brightness of a lamp changing, I understood that I could change the brightness by opening the device.” However, the participants had a hard time figuring out how to change the color temperature of a lamp. There are two reasons for this. Firstly, the participants often did not notice the signifier, the colors on the middle part of the device. Next to that, the movement was not intuitive or logical to the participants: most participants tried modifying the device itself by, for example, turning, twisting or bending it. It was not intuitive to them to change the position of the device.

In conclusion, the control of the brightness using the hand-held device was meaningful and easily understandable, while the control of the color temperature was unclear and not intuitive. This shows that embodied metaphors should be chosen very carefully.

5.2.4 Rich Interaction and Human Skills

By exploiting natural human skills and senses, rich interactions are created (Angelini et al., 2018). Rich interactions increase the understandability and trustability of smart home devices, which also improves the user experience. It was found that the two devices differed considerably in the human skills and senses needed to use them.

The participants mostly used their hands and arms when interacting with the counter device. One participant mentioned that it felt like playing a chess game. The use of the hand-held device, on the other hand, required moving the whole upper body. All participants associated the use of this device with exercising, which indicates that they were far more active when using the hand-held device compared to the counter device.

5.2.5 Innovation

By testing the perceived innovativeness, the hedonic quality of a design is assessed. This quality aspect addresses the human need for novelty or change (Hassenzahl, 2001). If a design is perceived as new and inventive, a more positive user experience is created.

The hand-held device was perceived as more inventive than the counter device, since the users had not used something similar before to operate technology. The counter device, on the other hand, was perceived as more ‘traditional’. The participants were excited to use something new when using the hand-held device, which therefore also improved their experience of using it.

5.2.6 Use Scenarios

By analyzing the different results related to the user experience, it was found that the tangible user interfaces create the best user experience in different situations.

The counter device creates the most pleasant user experience in a frequent long-term use scenario. This device is easy to understand when using it for the first time, but also enables users to remember their favorite settings for future use. Next to that, it was found that the counter device can facilitate the execution of more complicated tasks, like changing multiple settings for multiple lights, without confusing the user. Lastly, the counter device is easier to expand by, for example, adding more tokens that represent new lamps.

The hand-held device creates the best user experience in an infrequent use scenario. The device is exciting to use because of the need for bodily movements. It can also be described as playful, since the process of trying to understand how to use the device was perceived as a game. However, it takes more time to operate the hand-held device, which makes it less efficient. Next to that, users cannot easily remember their favorite settings with this device. Lastly, the use of the hand-held device can be perceived as too complicated when performing more complicated tasks. Therefore, this device best fits simple tasks that have to be performed infrequently.

6. Discussion

In this section, I will first reflect upon the development of the tangible user interfaces and the user validation process. After that, several opportunities for future work will be discussed.

6.1 Tangible User Interfaces

Reflecting on the development process of the tangible user interfaces, I can say that this project would have benefitted from an extra design iteration. During the user validation sessions, it was found that there is room for improvement in the usability of the devices. Therefore, if I could have done things differently, I would have conducted an extra user validation during the conceptualization phase, focused on the usability of the devices. This way, I could have optimized the usability before testing the user experience.

Figure 43. An adaptation of Aaron Walter’s hierarchy of user needs (Shin & Wang, 2015), with the levels functional, reliable, usable and pleasurable corresponding to functionality, usability and user experience. This figure shows how the user experience, usability and functionality of a device build upon each other.

It is important to ensure good usability before testing the user experience, since a good user experience can only be created when an interface is usable (see figure 43).

6.2 User Validation

During the user validation, a digital context prototype was used together with the Wizard of Oz method (Dow et al., 2005) to create the experience of a functioning prototype without actually having a fully functional prototype. This way, the ‘reliable’ and ‘functional’ needs of figure 43 were covered. However, it was found that the digital context prototype did not completely create the same user experience as a real-life setting. Firstly, the visual feedback that users got through the digital context prototype was only two-dimensional, while the feedback would be three-dimensional in real life. When changing the brightness or the color temperature of a lamp, the whole environment changes, which is a different experience than seeing something change on a screen. Next to that, it was found that there was a slight delay in the feedback, since an online video tool was used to communicate and the lamp settings had to be changed manually. During the user validation sessions, participants used the feedback as a way to understand how to use the devices. Therefore, it was very important to give immediate feedback on their actions.

So to conclude, the results from the user validation would have benefitted from testing the devices in a real setting instead of using the digital context prototype, since a more realistic user experience would have been created that way. This would also make the feedback of the lights changing more immediate, since no online video tool would be needed to communicate.

Next to that, I would like to reflect on the ethical considerations of the user validation. For the user validation sessions, the online video tool Zoom was used, since this tool was provided by the university. By using a third-party tool to record the user validation sessions, I was not completely able to guarantee that the recordings would not be used for any other purposes than this thesis project. I still felt that this was the best way to conduct the user validation sessions, because of the social distancing constraints and the fact that no ethically sensitive data was gathered during the sessions. However, if the situation were different, it would have been better to not use any external resources for gathering data.

6.3 Other Results from the User Validation

Two interesting insights were found during the user validation sessions that are not related to the user experience, but to the limitations of the tangible user interfaces. Since these insights are still important to consider when designing new tangible user interfaces for smart home control, I want to discuss them here.

One of the participants noticed that there is a limit to the number of lamps that can be used with both devices, since there is limited space available for tokens on the counter device and icons on the ring of the hand-held device. This would limit users from expanding their smart homes. For the counter device, this could be solved by allowing users to stack the tokens on top of each other. For the hand-held device, another way should be found to choose which smart device is being controlled.

Next to that, two other participants questioned how to store the hand-held device. While they would imagine the counter device to always be in the same place, they did not know where they should place the hand-held device. Because of its size, it is also not as easily put away as, for example, a television remote control. So this is also something that should be further investigated when designing a tangible user interface for long-term use.

6.4 Future Work

This thesis project focused on the use of tangible user interfaces for smart lighting control. I chose to focus on only one category of smart home devices to make this project more feasible within its time constraints. However, it would be very interesting to design and test new tangible user interfaces that combine several categories of smart home devices, such as smart lighting, smart entertainment and smart security. The process of this thesis can be used as an example of how to do this.

Next to that, the focus of this thesis was on one single user for the smart home interfaces. But often, households consist of more than just one person. Therefore, it would be interesting to design and test tangible user interfaces for smart home control that are being used by multiple users.

Also, the tangible user interfaces that were designed for this thesis project have currently only been tested once. Since smart home control interfaces are meant to be used more frequently, it would be interesting to test the user experience over a longer time period. Now, some participants commented on what they expect it would be like to use the devices in the long term. But this can only be validated by testing over the long term, for example by using the UX Curve method (Kujala, Roto, Väänänen-Vainio-Mattila, Karapanos & Sinnelä, 2011).

7. Conclusion

This thesis project aimed to explore how different types of tangible user interfaces influence the user experience of instant smart lighting system control. First, literature and canonical examples related to user control in smart homes, the user experience of smart homes and tangible interaction were reviewed. This way, a better understanding of the thesis context was created and different types of tangible user interfaces were defined. Two types of tangible user interfaces were selected: the counter device using tokens and the hand-held device using embodied metaphors. For each of these types, a new interface was designed to control the brightness and color temperature of two smart lamps. Of each interface, an experience prototype was made, which was used for validating the tangible user interfaces with participants who had not used smart home control interfaces before. Five user validation sessions were conducted, during which the engagement, ease of task performance, meaningfulness of representation and controls as well as richness of interaction and human skills of both devices were tested.

From these user validation sessions, several conclusions could be drawn. First of all, it was found that the counter device using tokens was easy to understand when using it for the first time. Furthermore, this device enables users to easily remember their favorite settings. Therefore, this type of tangible user interface creates the best user experience for frequent long-term use. The hand-held device using embodied metaphors, on the other hand, creates the best user experience for infrequent use. This device requires bodily movements, which was perceived as exciting. Next to that, it is more playful, since users felt like it was a game to understand how to use it. However, it was also found that this device is less efficient, since it takes more time to operate. Also, it is harder to remember favorite settings with this device.

Since it was found that using the hand-held device can become too complicated as tasks become more complex, the hand-held device fits simple tasks, such as changing only one lighting setting at a time. The counter device fits more complicated tasks, such as changing multiple settings for multiple lights, because it is easy to understand and therefore inviting to use. Next to that, it is also easier to expand this device by, for example, adding more tokens that represent new lamps.

Lastly, it was found that immediate feedback is very important for users to understand what they are doing, which will therefore also improve their experience. This should be considered while testing new smart home control interfaces, but should also be taken into account while designing new smart home control interfaces.

8. Acknowledgments

I would like to thank my supervisor, Maliheh Ghajargar, for the great guidance and feedback throughout this thesis project. Next to that, I would like to thank my friends, family and fellow classmates for their support and encouragement. Lastly, I would like to thank the participants of my user validation study for their help.

9. References

Abrams, R. (1999). Adventures in tangible computing: The work of interaction designer ‘Durrell Bishop’ in context. Master’s thesis Royal College of Art, London.

Alam, M. R., Reaz, M. B. I., & Ali, M. A. M. (2012). A review of smart homes—Past, present, and future. IEEE transactions on systems, man, and cybernetics, part C (applications and reviews), 42(6), 1190-1203.

Allen, B., van Berlo, A., Ekberg, J., Fellbaum, K., Hampicke, M., & Willems, C. (2001). Design guidelines on smart homes. COST 219bis guidebook.

Angelini, L., Lalanne, D., Hoven, E. V. D., Khaled, O. A., & Mugellini, E. (2015). Move, hold and touch: a framework for tangible gesture interactive systems. Machines, 3(3), 173-207.

Angelini, L., Mugellini, E., Abou Khaled, O., & Couture, N. (2018). Internet of Tangible Things (IoTT): challenges and opportunities for tangible interaction with IoT. In Informatics (Vol. 5, No. 1, p. 7). Multidisciplinary Digital Publishing Institute.

Bakker, S., Antle, A. N., & Van Den Hoven, E. (2012). Embodied metaphors in tangible interaction design. Personal and Ubiquitous Computing, 16(4), 433-449.

Bakker, S., & Niemantsverdriet, K. (2016). The interaction-attention continuum: considering various levels of human attention in interaction design. International Journal of Design, 10(2), 1-14.

Brennan, C. P., McCullagh, P. J., Galway, L., & Lightbody, G. (2015). Promoting autonomy in a smart home environment with a smarter interface. In 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (pp. 5032-5035). IEEE.

Buchenau, M., & Suri, J. F. (2000). Experience prototyping. In Proceedings of the 3rd conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 424-433).

Clark, B., & Reinertsen, D. G. (1998). Rapid Ideation in Action: Getting Good Ideas Quickly and Cheaply. Design Management Journal (Former Series), 9(4), 47-52.

Dourish, P. (2001). Where the action is. Cambridge: MIT Press.

Dow, S., MacIntyre, B., Lee, J., Oezbek, C., Bolter, J. D., & Gandy, M. (2005). Wizard of Oz support throughout an iterative design process. IEEE Pervasive Computing, 4(4), 18-26.

Eggen, J. H., van den Hoven, E. A. W. H., & Terken, J. M. B. (2016). Human-centered design and smart homes: How to study and design for the home experience?. In Handbook of smart homes, health care and well-being (pp. 83-92). Springer.

Hassenzahl, M. (2001). The effect of perceived hedonic quality on product appealingness. International Journal of Human-Computer Interaction, 13(4), 481-499.

Hornecker, E., & Buur, J. (2006). Getting a grip on tangible interaction: a framework on physical space and social interaction. In Proceedings of the SIGCHI conference on Human Factors in computing systems (pp. 437-446).

Ikea. (2020). Smart lighting. Retrieved 2020 April 20 from https://www.ikea.com/nl/en/cat/smart-lighting-36812/

Intille, S. S. (2002). Designing a home of the future. IEEE pervasive computing, 1(2), 76-82.

Ishii, H., & Ullmer, B. (1997). Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human factors in computing systems (pp. 234-241).

Khan, W. M., & Zualkernan, I. A. (2018). SensePods: A ZigBee-Based Tangible Smart Home Interface. IEEE Transactions on Consumer Electronics, 64(2), 145-152.

Koskela, T., & Väänänen-Vainio-Mattila, K. (2004). Evolution towards smart home environments: empirical evaluation of three user interfaces. Personal and Ubiquitous Computing, 8(3-4), 234-240.

Kujala, S., Roto, V., Väänänen-Vainio-Mattila, K., Karapanos, E., & Sinnelä, A. (2011). UX Curve: A method for evaluating long-term user experience. Interacting with computers, 23(5), 473-483.

Laugwitz, B., Held, T., & Schrepp, M. (2008). Construction and evaluation of a user experience questionnaire. In Symposium of the Austrian HCI and Usability Engineering Group (pp. 63-76). Springer, Berlin, Heidelberg.

Longhurst, R. (2003). Semi-structured interviews and focus groups. Key methods in geography, 3(2), 143-156.

Luria, M., Hoffman, G., & Zuckerman, O. (2017). Comparing social robot, screen and voice interfaces for smart-home control. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 580-628).

Manches, A., & O’Malley, C. (2012). Tangibles for learning: a representational analysis of physical manipulation. Personal and Ubiquitous Computing, 16(4), 405-419.

Marquez, J. J., Downey, A., & Clement, R. (2015). Walking a mile in the user's shoes: Customer journey mapping as a method to understanding the user experience. Internet Reference Services Quarterly, 20(3-4), 135-150.

Naylor, Z. (2019). How to create an affinity diagram for UX research. Retrieved May 14 2020 from https://medium.com/@zacknaylor/how-to-create-an-affinity-diagram-for-ux-research-cdc08489952d

Nielsen, J. (2000). Why you only need to test with 5 users. Nielsen Norman Group.

Norman, D. (2013). The design of everyday things: Revised and expanded edition. Basic books.

Pal, T. (2019). Black book on brown wooden coffee table photo – Free furniture image. Retrieved 2020 May 4 from https://unsplash.com/photos/WUC_u5yoD2w

Philips. (2020). Hue. Retrieved 2020 April 20 from https://www2.meethue.com/en-us/products#filters=BULBS_SU&sliders=&support=&price=&priceBoxes=&page=&layout=

Processing. (2020). Processing. Retrieved 2020 May 21 from https://www.processing.org

Ricker, T. (2019). Ikea previews its improved 2020 smart home experience. Retrieved 2020 April 6 from https://www.theverge.com/2019/12/18/21025798/ikea-home-smart-scenes-shortcut-button-onboarding-upgrade-software-price

Rosson, M. B., & Carroll, J. M. (2009). Scenario-based design. In Human-computer interaction (pp. 161-180). CRC Press.

Roto, V., Law, E., Vermeeren, A. P. O. S., & Hoonhout, J. (2011). User experience white paper: Bringing clarity to the concept of user experience. In Dagstuhl Seminar on Demarcating User Experience (p. 12).

Schrepp, M., Hinderks, A., & Thomaschewski, J. (2017). Design and Evaluation of a Short Version of the User Experience Questionnaire (UEQ-S). IJIMAI, 4(6), 103-108.

Shaer, O., & Hornecker, E. (2010). Tangible user interfaces: past, present, and future directions. Foundations and Trends® in Human–Computer Interaction, 3(1–2), 4-137.

Shin, D., & Wang, Z. (2015). The Experimentation of Matrix for Product Emotion. Procedia Manufacturing, 3, 2295-2302.

Tada, K., Takahashi, S., & Shizuki, B. (2016). Smart home cards: tangible programming with paper cards. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct (pp. 381-384).

Ullmer, B., Ishii, H., & Jacob, R. J. (2005). Token+ constraint systems for tangible interaction with digital information. ACM Transactions on Computer-Human Interaction (TOCHI), 12(1), 81-118.

Van Boeijen, A., Daalhuizen, J., van der Schoor, R., & Zijlstra, J. (2014). Delft design guide: Design strategies and methods.

Weiser, M. (1993). Hot topics-ubiquitous computing. Computer, 26(10), 71-72.

Weiser, M., & Brown, J. S. (1997). The coming age of calm technology. In Beyond calculation (pp. 75-85). Springer, New York, NY.

Xia, F., Yang, L. T., Wang, L., & Vinel, A. (2012). Internet of things. International journal of communication systems, 25(9), 1101.

Appendix

1. Final Digital Context Prototype Code

2. User Validation Study Set-up

1. Introduction

⋅ Explain that this study is part of my thesis project and that I want to test the use of different interfaces to control smart homes.

⋅ Ask to fill in the consent form

⋅ Explain the following events: 1. The participant will use the first device, after which some questions will be asked. 2. The same happens with the second device. 3. Lastly, some extra questions will be asked to compare the devices.

2. Using the Tangible Prototypes

⋅ Show the digital context prototype

⋅ Ask the participant to get the first tangible prototype (the counter device)

⋅ With both devices, the brightness and color temperature of the two lamps in the digital context prototype can be adjusted

⋅ Instructions: Pretend that this digital living room is your living room. Imagine that you have just come home from work/university and want to turn on the lights. Try to do this by using the prototype. Please think out loud.

3. Interview questions about each device

⋅ Did you easily understand how to use the device? Why (not)?

⋅ How have you used your body while using the devices?

⋅ For the following categories, rate the use of this device on a scale from 1-5:

Obstructive – Supportive
Complicated – Easy
Inefficient – Efficient
Clear – Confusing
Boring – Exciting
Not interesting – Interesting
Conventional – Inventive
Usual – Leading edge

Repeat steps 2 and 3 for the second tangible prototype (the hand-held device)

4. Interview questions to compare the devices

⋅ Which device was most exciting/enjoyable to use? Why?

⋅ Which device was easiest to use to perform a task? Why?

⋅ Do you have any other comments on using these two devices?

3. Informed Consent Form

Informed Consent Form

Tangible User Interfaces in the Smart Home Environment

Exploring the User Experience of Instant Smart Lighting System Control

Study by Iris Bataille, student of Interaction Design at Malmö University. Contact: [email protected]

About this study: during this study, the user experience of two smart home control devices that have been designed by the student will be explored. The data gained from this study will be used for the student’s thesis project. The participant will use both devices and answer several questions. The study will be video recorded for data processing purposes. The video footage will not be published and will be deleted after finishing the thesis project. Next to that, all data will be processed anonymously.

I have been verbally informed about the study and have read the accompanying written information stated above. I am aware that my participation is voluntary and that I can, at any time and without explanation, withdraw my participation. The person leading the study will strive to guarantee confidentiality, so that no unauthorized person has access to the material. The gathered material will be stored properly and used for research purposes only.

I hereby submit my consent to participate in the above study:

Name participant ......

Date ......

Participant’s signature ......
