An Improved Modular Framework for Developing Multi-Surface Interaction

AN IMPROVED MODULAR FRAMEWORK FOR DEVELOPING MULTI-SURFACE INTERACTION

A Paper
Submitted to the Graduate Faculty
of the
North Dakota State University
of Agriculture and Applied Science

By

Jed Patrick Limke

In Partial Fulfillment of the Requirements
for the Degree of
MASTER OF SCIENCE

Major Program:
Software Engineering

December 2015

Fargo, North Dakota

North Dakota State University
Graduate School

Title
AN IMPROVED MODULAR FRAMEWORK FOR DEVELOPING MULTI-SURFACE INTERACTION

By
Jed Patrick Limke

The Supervisory Committee certifies that this disquisition complies with North Dakota State University’s regulations and meets the accepted standards for the degree of MASTER OF SCIENCE.

SUPERVISORY COMMITTEE:
Dr. Jun Kong, Chair
Dr. Juan Li
Dr. Jin Li

Approved: 12/21/2015 by Dr. Brian Slator, Department Chair

ABSTRACT

Advances in multi-touch capabilities have led to their use in a vast array of devices, but usable interactions that span across devices are less frequently encountered. To address this concern, a framework was created for the development of interaction spaces, giving developers the tools to write applications that united tabletop and handheld devices and allowed each device to use its inherent interaction style while still communicating effectively. However, while this framework provided proof that such interactions were possible, it failed to prove itself easily reusable by subsequent developers.

We have therefore created an improved framework that both fulfills the goals of the original framework and confronts its shortcomings. Our improved framework features a new intra-component communication system, mobile device independence, configurable user interfaces, and automatic exposure of available interactions. All of these features coalesce to fulfill the goals of the original framework while improving its reusability.

TABLE OF CONTENTS

ABSTRACT
LIST OF FIGURES
1. INTRODUCTION
2. PROBLEM STATEMENT
3. APPROACH OVERVIEW
4. BUILDING THE COMMUNICATION LAYER
5. SOLVING THE MOBILE ECOSYSTEM PROBLEM
6. CONSTRUCTING MOBILE INTERFACES REMOTELY
7. GENERATING TOPOLOGIC REPRESENTATIONS OF THE TABLETOP’S INTERACTION POINTS
8. CONCLUSION AND FUTURE WORK
REFERENCES

LIST OF FIGURES

1. Improved framework overall structure
2. High-level message bus example
3. Abstract MessageBase class
4. Message bus service interface, IMessageBusService
5. Message bus client interface, IMessageBusClient
6. MessageReceivedEventArgs
7. Service implementation
8. Client implementation
9. MessageBase and its descendants
10. Mobile interaction sequence
11. Experience class
12. Passive classes
13. Interaction classes
14. Screenshot of sample user interface
15. Experience object and corresponding mobile interface
16. The ResponseCollection class
17. Screenshot of sample auto-generated topological interface
18. A hand-held token
19. The virtual rectangle representing the capture area
20. Topologic generation when Bravo, Foxtrot, Hotel, and India are captured

1. INTRODUCTION

Since the advent of telecommunications, physical proximity has become less and less necessary to satisfy the basic needs of interaction; however, in-person, face-to-face interaction is still important [Wu03], particularly in collaborative endeavors. While handheld devices let users interact virtually, co-located users can also interact, together, with tabletop computers. Given their large screens, tabletops are especially suited to such collaborative work and play. However, when collaborating in a shared space, focus becomes a concern, especially when working together on a single large screen [Sch12].

This poses many challenges for multi-user simultaneous interaction. For example, on tabletop computers only a single virtual keyboard can be used at one time, given the personal nature of computers even in collaborative environments, which limits the ability of multiple users to interact directly and simultaneously with a single system.
Furthermore, a shared tabletop provides a large enough screen for multiple users, but there is still only one speaker, allowing at most a single audio source to play for the collaborators. While this may generally be preferred, it is unfortunate that such a limitation exists. In addition, there may be instances where users wish, or are required, to interact with the collaborative space privately, and in an environment with multiple users and one shared collaborative tabletop, privacy is challenging if not impossible.

Though the use of a large tabletop screen makes information accessible to multiple users, there are also physical considerations to be contemplated aside from the aforementioned challenges, such as physical reachable distance and content orientation [Shen06], given that users will have disparate physical orientations while seated around a tabletop. These differing orientations preclude multiple users from viewing the tabletop’s contents at an optimal orientation (e.g., the orientation of text on the surface that is suitable for users seated on one side of the tabletop cannot simultaneously be suitable for users seated on the opposite side). Even if the orientation issues are somehow overcome, the minimum size necessary for a tabletop screen to support collaboration may itself preclude a smaller user from being able to reach different points on the tabletop without moving.

Previous studies have developed methods and interaction strategies to overcome these challenges. For example, Schmidt et al. [Sch12] proposed a cross-device interaction style for mobile devices and tabletops which uses the mobile device itself to provide tangible input on the tabletop in a stylus-like fashion. Manipulating the mobile device through different predefined gestures and movements allowed the user to interact with the tabletop to perform different tasks, all based upon data collected simultaneously from not only the tabletop’s internal cameras but also the physical orientation sensors on the mobile device (e.g., accelerometers and gyroscopes). A drawback of this interaction style is that selection of objects on the tabletop screen sometimes erred due to collisions of simultaneous touch events between mobile devices [Sch12]. Our approach associates each device with a unique token or ByteTag to ensure the correct association between a user’s mobile device and their interactions with the tabletop.

Another approach developed to overcome these challenges was a cross-device framework (called MobiSurf) between the tabletop, a Microsoft PixelSense computer, and
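To illustrate the token-based association described above, the following C# fragment is a minimal sketch under assumed names (DeviceSession, TokenRegistry, Register, and Resolve are illustrative, not the framework’s actual API): each mobile device registers the value of the ByteTag it carries, and the tabletop’s tag-recognition handler resolves a recognized tag value back to the owning device’s session.

    using System;
    using System.Collections.Concurrent;

    // Hypothetical stand-in for whatever per-device state the framework keeps.
    public sealed class DeviceSession
    {
        public Guid SessionId { get; private set; }
        public string DeviceName { get; private set; }

        public DeviceSession(string deviceName)
        {
            SessionId = Guid.NewGuid();
            DeviceName = deviceName;
        }
    }

    // Hypothetical registry mapping a ByteTag value to the device that presented it.
    public sealed class TokenRegistry
    {
        private readonly ConcurrentDictionary<byte, DeviceSession> sessions =
            new ConcurrentDictionary<byte, DeviceSession>();

        // Called when a mobile device joins the interaction space with its assigned tag value.
        public void Register(byte tagValue, DeviceSession session)
        {
            sessions[tagValue] = session;
        }

        // Called from the tabletop's tag-recognition handler; returns the owning
        // device session, or null if the tag value has not been registered.
        public DeviceSession Resolve(byte tagValue)
        {
            DeviceSession session;
            return sessions.TryGetValue(tagValue, out session) ? session : null;
        }
    }

Because the tag value, rather than the touch point, identifies the actor, simultaneous interactions from several devices can be attributed correctly even when their touch events overlap.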
