Domain Task Force Final Agenda ver.1.0.4 robotics/2006-04-04

OMG Technical Meeting - St. Louis, MO, USA -- April 24-28, 2006
(Time / TF or SIG host / Agenda item and purpose / Room)

Monday (April 24) - WG and Committee Activities
8:30-12:00   SDO, Robotics - Robot Technology Components RFP submitter's meeting (closed) - St. Peters
12:00-13:00  LUNCH - Grand Ballroom ABC
13:00-18:00  Architecture Board Plenary - Grand Ballroom D
13:00-14:20  Robotics - WG (Service) discussion
14:20-15:40  Robotics - WG (Profile) discussion
15:40-17:00  Robotics - WG (Tool) discussion - Gateway3
17:00-18:00  Robotics, SDO - Steering Committee of Robotics DTF; volunteer recruitment (including Publicity Subcommittee discussion) - Gateway3

Tuesday (April 25) - Robotics Plenary
8:30-9:00    MARS, SDO, Robotics - Progress report of the Robot Technology Components RFP revised submission - Gateway2
10:05-10:20  Robotics - Welcome and review agenda; Robotics/SDO joint meeting kick-off - Poplar
10:20-11:20  Robotics, SDO - Informative: "Real-Time ORB Middleware: Standards, Applications, and Variations" - Prof. Chris Gill (Washington University)
11:20-12:00  Robotics (SDO) - RFI response: "Communication protocol for the URC robot and server" - Hyun-Sik Shim (Samsung Electronics)
12:00-13:00  LUNCH - Grand Ballroom ABC
13:00-14:00  Robotics (SDO) - Informative: "URBI: a Universal Platform for Personal Robotics" - Prof. Jean-Christophe Baillie (ENSTA/UEI Lab)
14:00-14:40  Robotics (SDO) - RFI response: "Fujitsu's robotics research and standardization activities" - Toshihiko Morita
14:40-15:20  Robotics (SDO) - RFI response: "Standardization of device interfaces for home" - Ho-Chul Shin (ETRI) - Poplar
15:20-15:40  Break (20 min)
15:40-16:20  Robotics (SDO) - RFI response: "Voice interface standardization items network robot in noisy environments" - Soon-Hyuk Hong (Samsung Electronics)
16:20-17:40  Robotics, SDO - WG (Infrastructure) discussion

Wednesday (April 26) - Robotics Plenary
8:30-9:10    Robotics (SDO) - RFI response: "Home in SAIT" - Seok Won Bang and Y. H. Kim (Samsung Advanced Institute of Technology)
9:10-9:50    Robotics (SDO) - RFI response: "ITR - Internet Renaissance - The world's first to be harmonized with the family -" - Hiroyuki Nakamoto (Systems Engineering Consultants)
9:50-10:00   Break (10 min)
10:00-11:40  Robotics (SDO) - WG Reports and Roadmap Discussion; chartering of WGs (Infrastructure, Service, Profile) - Poplar
11:40-11:50  Robotics (SDO) - Contacts Report (informative)
11:50-12:00  Robotics - Publicity, Next Meeting Agenda Discussion, etc.; Robotics/SDO closing session
12:00        Adjourn (SDO session)
12:00-14:00  LUNCH and OMG Plenary - Grand Ballroom ABC
14:00-18:00  Robotics - Infrastructure WG: follow-up discussion on the current component model RFP and other topics of interest to the members, including possible additional areas for standardization - Poplar
16:00-18:00  Robotics - Service WG: follow-up discussion - Missouri
18:00-20:00  OMG Reception - Grand Ballroom CD

Thursday (April 27)
8:30-10:00   SDO, Robotics - Robot Technology Components RFP submitter's meeting (open) - Grand Ballroom F
10:00-12:00  Robotics - Service WG follow-up discussion - Grand Ballroom F
12:00-13:00  LUNCH - Grand Ballroom ABC
13:00-18:00  Architecture Board Plenary - Grand Ballroom D
13:00-14:00  MARS, SDO, Robotics - RFI Summary Report - Blanchette2
14:00-15:00  MARS, SDO, Robotics - "Real-Time ORB Middleware: Standards, Applications, and Variations" - Prof. Chris Gill (Washington University) - Blanchette2

Friday (April 28)
8:30-12:00   AB, DTC, PTC - Grand Ballroom DEF
12:00-13:00  LUNCH - Grand Ballroom AB

Other Meetings of Interest
Monday     8:00-8:45    OMG New Attendee Orientation - Poplar
           9:00-12:00   OMG MOF Metamodeling Tutorial - Discovery
           13:00-17:00  OMG Architecture-Driven Modernization Concepts and Task Force Update - Discovery
           18:00-19:00  OMG New Attendee Reception (by invitation only) - Posh's Dining
Tuesday    13:00-17:30  OMG MDA - Where it Came From and Where It's Going - Discovery
Wednesday  9:00-12:00   OMG Introduction to UML 2.0 - Discovery
           14:00-17:00  OMG Introduction to the Data Distribution Service - Discovery

robotics/2006-04-05

Robotics DTF Steering Committee Meeting

April 24, 2006 St. Louis, MO, USA Hilton St. Louis Airport Gateway 3

Agenda

• Agenda Review
• Publicity
• Working Group Discussion
• Roadmap Discussion
• Next meeting schedule

Review Agenda - Tuesday, April 25, 2006 (Poplar)

10:05-10:20 Welcome and Review Agenda
10:20-11:20 Special Talk: Prof. Chris Gill
11:20-12:00 RFI response presentation

13:00-14:00 Special Talk: Prof. Jean-Christophe Baillie
14:00-16:20 RFI response presentation
16:20-17:40 WG (Infrastructure)

Joint Meeting with MARS/RTESS - Thursday, April 27, 2006, 13:00-14:00 (Blanchette2)

Review Agenda - Wednesday, April 26, 2006 (Poplar)

08:30-09:50 RFI response presentation
10:40-11:40 WG Reports and Roadmap Discussion
11:40-11:50 Contacts Report
11:50-12:00 Next meeting, etc.
12:00 Adjourn

14:00-18:00 WG (Infrastructure)
16:00-18:00 WG (Service)

Joint Meeting with MARS/RTESS - Thursday, April 27, 2006, 13:00-14:00 (Blanchette2)

Publicity Activities

• 4-page fly sheet: draft by Abheek@ADA Software (Abheek@ADA Soft, Olivier@AIST, Chung@ETRI, Yokomachi@NEDO)

Action: Send each organization's logo to Abheek.

The 4-page fly sheet will be authorized in Boston.

Publicity Activities

• Korea-Japan RSW2006 - Friday, June 16, 2006, Jeju Island, Korea (Chung@ETRI)
• RoboBusiness2006 - June 20-21, 2006, Pittsburgh, PA, USA, http://www.robobusiness2006.com/ (Jon Sigel and Bruce@Systronix)
• IROS2006 Workshop - October 9-15, Beijing, China, http://www.iros2006.org/ (Kotoku@AIST, Chung@ETRI, Mizukawa@Shibaura-IT)
• SICE-ICASE International Joint Conference - October 18-21, Pusan, Korea, http://sice-iccas.org/ (Mizukawa@Shibaura-IT)

Roadmap (WG organization)

4 Discussion Groups:
• Infrastructure: Rick@RTI, Ando@AIST
• Service: Chi@ETRI, Lemaire@AIST
• Tool: Abheek@ADA Soft
• Profile: Lee@ETRI, Bruce@Systronix

Next Meeting Agenda - June 26-30, 2006 (Boston, MA, USA)
Monday: RTCs RFP revised submission review [MARS]; Steering Committee
Monday-Tuesday, Thursday: WG activities
Wednesday: Robotics-DTF Plenary Meeting
  • Guest Presentation(s)
  • WG reports & Roadmap discussion
  • Contact reports
  • Resolution

Robotics-DTF Meeting Minutes - Tampa, FL, USA - approved (robotics/2006-04-06)

Overview and votes

The first plenary of the Robotics Domain Task Force was held, following the chartering of the Robotics DTF at the previous Burlingame meeting. We had one special talk and 14 RFI response presentations. We decided to start four working group activities, which will be chartered officially at the upcoming St. Louis meeting. With the sponsorship of MARS, the deadline of the Robotic Systems RFI was re-extended to April 3rd, 2006 (3 weeks before the St. Louis meeting). At the proposal of ADA Software, the Publicity Sub-Committee has been chartered.

OMG Documents Generated

robotics/2006-02-04  Robotics-DTF Final Agenda (Tetsuo Kotoku)
robotics/2006-02-05  Steering Committee Presentation (Tetsuo Kotoku)
robotics/2006-02-06  Burlingame Meeting Minutes [approved] (Olivier Lemaire)
robotics/2006-02-07  Opening Presentation (Tetsuo Kotoku)
robotics/2006-02-08  Robotics-DTF Roadmap (Tetsuo Kotoku)
robotics/2006-02-09  "Hitachi's needs for robotic system standards" (Saku Egawa, Hitachi)
robotics/2006-02-10  "Towards Plug and Play Robotics" (Abheek Bose, ADA Software Group)
robotics/2006-02-11  "SEC's Approach to the Standardization of Robotic Systems" (Masayuki Nagase and Hiroyuki Nakamoto, SEC)
robotics/2006-02-12  "Development of Food Robots and Meat Processing Robots, and Request for Standardization of RTC" (Tomoki Yamashita, Mayekawa MFG)
robotics/2006-02-13  "RT service framework using IT infrastructure" (Wonpil Yu, ETRI)
robotics/2006-02-14  "A Software System Architecture with Unified Sensory Data Integration" (Takashi Tsubouchi, Tsukuba Univ.)
robotics/2006-02-15  "Navigation of mobile robots" (Wonpil Yu, ETRI)
robotics/2006-02-16  "OMG Robotics Systems RFI response from AIST" (Olivier Lemaire, AIST)
robotics/2006-02-17  "Current State of Robotics Script and Control Languages [and Standards]" (Lloyd Spencer, CoroWare)
robotics/2006-02-18  "Special Talk: Lessons Learned About Software for Rescue Robots" (Matt Long, Univ. of South Florida)
robotics/2006-02-19  "Development Framework for Mobile Robot based on JAUS and RT-Middleware" (Wataru Inamura, IHI)
robotics/2006-02-20  "Applicable SWRadio Spec Concepts for Robotics Domain" (Jerry Bickle, PrismTech)
robotics/2006-02-21  "COMPARE Response to the Robotics RFI" (Virginie Watine, THALES)
robotics/2006-02-22  "Robot Server Middleware" (Seung-Ik Lee, ETRI)
robotics/2006-02-23  "Toshiba's Approach to RT Standardization and Where the Standardization is Needed" (Fumio Ozaki, Toshiba)
robotics/2006-02-24  OMG Robotics Task Force RFI Survey Result (Olivier Lemaire, AIST)
robotics/2006-02-25  Query Report (Olivier Lemaire, AIST)
robotics/2006-02-26  Publicity Activity Proposal (Abheek Bose, ADA Software)
robotics/2006-02-27  Summary of Activity Robotics TF - Tampa (Olivier Lemaire, AIST)
robotics/2006-02-28  DTC Report Presentation (Tetsuo Kotoku)
robotics/2006-02-29  Meeting Minutes - DRAFT (Saku Egawa, Soo-Young Chi)

Agenda

13 February, Monday
15:00-17:00 - Steering Committee of Robotics DTF

14 February, Tuesday
08:40-09:00 - Welcome and Review Agenda
09:30-10:30 - Robot Technology Components RFP initial submission - Noriaki Ando (AIST)
10:30-11:30 - Robot Technology Components RFP initial submission - Hung Pham (RTI)
13:00-13:40 - Hitachi's needs for robotic system standards - Saku Egawa (Hitachi)
13:40-14:20 - Towards Plug and Play Robotics - Abheek Kumar Bose (ADA Software Group)
14:20-15:00 - SEC's approach to the standardization of robotics systems - Hiroyuki Nakamoto and Masayuki Nagase (SEC)
15:20-16:00 - Development of Food Robots and Meat Processing Robots, and Request for Standardization of RTC - Tomoki Yamashita (Maekawa MFG)
16:00-16:40 - RT service framework using IT infrastructure - Wonpil Yu (ETRI)
16:40-17:20 - A mobile system architecture with unified sensory data integration - Takashi Tsubouchi (Tsukuba Univ.)
17:20-18:00 - Navigation of mobile robots including mapping, localization, and motion - Wonpil Yu (ETRI)

15 February, Wednesday
08:10-08:50 - Response from AIST - Olivier Lemaire (AIST)
08:50-09:30 - Current State of Robotics Script/Control Language Standards - Lloyd Spencer (CoroWare)
09:30-10:20 - Lessons Learned About Software for Rescue Robots - Matt Long (iSSRT, Univ. of South Florida)
10:40-11:20 - Development Framework for Mobile Robot based on JAUS and RT-Middleware - Wataru Inamura (IHI)
11:20-12:00 - An overview of PIM & PSM for SWRadio Components specification - Jerry Bickle (PrismTech)
14:00-14:40 - Response from Compare Project - Virginie Watine (THALES)
14:40-15:20 - Robot Server Middleware: CAMU - Seung-ik Lee (ETRI)
15:40-16:20 - Toshiba's approach to RT standardization and where the standardization is needed - Fumio Ozaki (Toshiba)
16:20-17:40 - Summary of RFI responses and working group discussion
17:40-17:50 - Publicity Activity
17:40-17:50 - Next Meeting Agenda Discussion
18:00 Adjourn

Minutes

14 February, Tuesday

AM Plenary (Tetsuo Kotoku, presiding co-chair) - Meeting Week Kick-off

The meeting was called to order at 08:40. Saku Egawa (Hitachi) volunteered to take minutes of the Tampa meeting. Tetsuo Kotoku provided a brief overview of the Burlingame minutes (robotics/2006-02-06). Action: The Burlingame minutes were unanimously approved.

Tetsuo Kotoku reviewed today's agenda. (robotics/2006-02-07)

Joint Meeting with MARS-PTF and SDO-DSIG (Char Wales, presiding co-chair): "Review of the initial submissions of Robot Technology Components RFP"

Introduction – What is important for RTC specification (mars/2006-02-09) Takashi Suehiro (AIST) provided background on the RTC RFP and the goals they hope to achieve with these submissions.

Presentation - Robot Technology Components RFP Initial Submission - Noriaki Ando (AIST) (mars/2006-02-10)

Noriaki Ando (AIST) presented the first initial submission. It is based on the existing SDO specification, extending and modifying it, and it seeks to achieve a pseudo-real-time capability for distributed components.

Presentation - Robot Technology Components RFP Initial Submission - Hung Pham (RTI) (mars/2006-02-11)

Hung Pham (RTI) presented the second initial submission. It looks at how a robotic system behaves ("behavioral layering in robotic systems") and at the reactive nature of robotic systems, with a range of requirements running from, e.g., tightly coupled to loosely coupled interactions. Is there a common component model that can bridge between these characteristics? The submission proposes to establish a PIM that uses PSMs to address the various "quadrants" of requirements. The PIM needs to satisfy three basic requirements (detailed in the presentation). The PIM is based on Lightweight CCM (LW CCM), drawing on UML, COMPARE, and Constellation (an RTI product), with PSM mappings based on connector implementations such as DDS and CORBA/IIOP.
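As a rough illustration of the connector idea only (every name below is hypothetical and not taken from the RTI submission or from Lightweight CCM): the component is written once against a PIM-level abstraction, and a PSM-specific connector such as DDS or CORBA/IIOP is plugged in at deployment time.

// Illustrative sketch: a PIM-level component publishes through an abstract
// "connector"; DDS- or IIOP-based connectors are interchangeable PSM bindings.
#include <iostream>
#include <memory>
#include <string>

struct Connector {                                  // PSM-specific transport binding
    virtual ~Connector() = default;
    virtual void send(const std::string& topic, const std::string& data) = 0;
};

struct DdsConnector : Connector {                   // stand-in for a DDS binding
    void send(const std::string& topic, const std::string& data) override {
        std::cout << "DDS publish  " << topic << ": " << data << "\n";
    }
};

struct IiopConnector : Connector {                  // stand-in for a CORBA/IIOP binding
    void send(const std::string& topic, const std::string& data) override {
        std::cout << "IIOP invoke  " << topic << ": " << data << "\n";
    }
};

class RangeSensorComponent {                        // PIM-level component, transport-agnostic
public:
    explicit RangeSensorComponent(std::unique_ptr<Connector> c) : out_(std::move(c)) {}
    void publish_scan(const std::string& scan) { out_->send("RangeData", scan); }
private:
    std::unique_ptr<Connector> out_;
};

int main() {
    RangeSensorComponent over_dds(std::make_unique<DdsConnector>());
    over_dds.publish_scan("[0.8, 0.9, 1.2]");       // same component code, DDS PSM
    RangeSensorComponent over_iiop(std::make_unique<IiopConnector>());
    over_iiop.publish_scan("[0.8, 0.9, 1.2]");      // same component code, CORBA/IIOP PSM
}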

Discussion: Char Wales (MITRE) discussed what the next step could be: if desired, an Evaluation Team could be formed so that the submitters and users can work together to help craft the revised submission. Victor Giddings (OIS) stated that such an action would be premature and should not happen until the first revised submissions are received; in addition, it would be better, for now, for the submitters themselves to resolve any differences between their submissions. Action: Make a progress review presentation at the upcoming St. Louis meeting.

PM Plenary (Yun-Koo Chung, presiding co-chair)

Presentation - RFI Response: "Hitachi's needs for robotic system standards" - Saku Egawa (Hitachi) (robotics/2006-02-09)

Saku Egawa talked about the needs for standardization within Hitachi Ltd. and showed two examples of their robotic systems, a cleaner robot and a workmate robot. Their first priority for standardization is in the field of real-time control systems, where a platform for integrating multiple domains of knowledge is especially needed. Their cleaner robot is a small-scale system with only a 28 MHz RISC processor, while their workmate robot is a more complicated system with four processors and a message-based software platform. They expect the Robotics TF to build scalable standards that can cover the wide range of products in the company.

Presentation - RFI Response: "Towards Plug and Play Robotics" - Abheek Kumar Bose (ADA Software Group) (robotics/2006-02-10)

In robotics, ADA Software is primarily focusing on developing "intelligent components," meaning a distributed set of robotic components with basic intelligence and adaptability. Abheek Kumar Bose emphasized the need for standardization by comparing the current situation of robot development, where interoperability is very low, with the highly standardized automobile industry. He introduced his experience in the Volksbot project at the Fraunhofer Institute for Autonomous Intelligent Systems (AIS). In that project, modular approaches are used in both hardware and software. By combining the behavior modeling system "Dual Dynamics Designer" with the visual programming system "IConnect", model-based development of autonomous robots became possible. His current project is to develop the Robot Modeller by extending the Control Designer Framework, an Eclipse-based system developed by Fraunhofer AIS.

Presentation - RFI Response: "SEC's approach to the standardization of robotics systems" - Hiroyuki Nakamoto and Masayuki Nagase (SEC) (robotics/2006-02-11)

SEC has experience in developing on-board computers for spacecraft, software for cellular phones, and Internet systems. For robotic systems, the company's strengths are in technologies such as firmware on robotic systems, robot monitoring server applications, and robotic content service systems. Their motivation for standardization is to make a cooperative system of the robot and its surrounding devices and services, such as neighboring robots, RFID tags, GPS, and Internet services. The presenters listed points for standardization concerning methods for detecting the environment, detecting position, and adapting the robot to the environment. As an example, they introduced the OMA Download Architecture Specification from the cellular phone domain.

Presentation - RFI Response: "Development of Food Robots and Meat Processing Robots, and Request for Standardization of RTC" - Tomoki Yamashita (Maekawa MFG) (robotics/2006-02-12)

Mayekawa's robot divisions have been developing various meat processing robots. From 2003 to 2006, they participated in a joint research project to develop food handling robots that can pack foods into a lunch box. To handle foods of various shapes, they developed an easily changeable hand-tool system. They also developed a harvest transportation robot using the RT component system developed by AIST, "OpenRTM-aist." Based on their experience, the presenter stated four requests for the RTC system: user-friendly development tools, support for non-PC controller devices including programmable logic controllers (PLCs), a "plug-n-play" function for RTCs, and safety management for errors in hardware, communication, and software.

Presentation - RFI Response: "RT service framework using IT infrastructure" - Wonpil Yu (ETRI) (robotics/2006-02-13)

To realize RT services in the real world, technologies such as perception of objects, behavior control (navigation, localization), and connectivity to IT infrastructure should be developed. Their approach to implementing an RT service is to decompose it into three conceptual spaces: physical space, semantic space, and virtual space. Standardization of the interfaces (APIs) between the three spaces is needed.

Presentation - RFI Response: "A mobile robot software system architecture with unified sensory data integration" - Takashi Tsubouchi (Tsukuba Univ.) (robotics/2006-02-14)

The Intelligent Robot Laboratory of the University of Tsukuba has been developing the "Yamabico family" of autonomous mobile robots. Tsubouchi pointed out that not only the software architecture and data flow but also the data structures should be discussed. He proposed a data structure for free-space description, the Unified Sensory Data (USD). USD is a 2D scrolling ring buffer that contains an occupancy grid map of the vicinity of the robot and holds abstracted free-space information independent of the specific type of sensor. By using USD, behavior description programs can be written independently of sensor types.
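To make the ring-buffer idea concrete, here is a minimal sketch (grid size, cell size, and all names are assumptions for illustration, not taken from the USD proposal): the grid covers only the robot's vicinity, and recentering it on the robot wraps indices around instead of copying cells.

// Minimal sketch of a 2D "scrolling" occupancy grid kept as a ring buffer.
#include <array>
#include <cmath>
#include <cstdint>
#include <iostream>

class ScrollingGrid {
public:
    static constexpr int    N    = 200;     // cells per side (assumed)
    static constexpr double CELL = 0.05;    // meters per cell (assumed)

    // Center the grid window on the robot; indices wrap, so no cells are copied.
    void recenter(double robot_x, double robot_y) {
        origin_x_ = robot_x - N * CELL / 2.0;
        origin_y_ = robot_y - N * CELL / 2.0;
        // (Clearing the rows/columns that re-enter the window is omitted here.)
    }

    void mark_occupied(double wx, double wy)           { cells_[index(wx, wy)] = 255; }
    std::uint8_t occupancy(double wx, double wy) const { return cells_[index(wx, wy)]; }

private:
    std::size_t index(double wx, double wy) const {
        int ix = static_cast<int>(std::floor((wx - origin_x_) / CELL)) % N;
        int iy = static_cast<int>(std::floor((wy - origin_y_) / CELL)) % N;
        ix = (ix + N) % N;                   // wrap negative indices -> ring buffer
        iy = (iy + N) % N;
        return static_cast<std::size_t>(iy) * N + ix;
    }
    double origin_x_ = 0.0, origin_y_ = 0.0;
    std::array<std::uint8_t, N * N> cells_{};
};

int main() {
    ScrollingGrid grid;
    grid.recenter(0.0, 0.0);
    grid.mark_occupied(1.0, 0.5);            // e.g. a range return 1 m ahead of the robot
    std::cout << int(grid.occupancy(1.0, 0.5)) << "\n";   // prints 255
}

Because the indices wrap, behavior code can query occupancy around the robot without caring which sensor produced the data, which is the point Tsubouchi made.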

Presentation - RFI Response: "Navigation of mobile robots including mapping, localization, and motion" - Nakju Doh / Wonpil Yu (ETRI) (robotics/2006-02-15)

Navigation is a key capability of mobile robots. The navigation technology includes three sub-techniques: mapping, localization, and motion. To solve the current problems of stand-alone robots, high cost and low performance, a client-server system is used in the Ubiquitous Robotic Companion (URC) project. The mobile robot (client) sends information such as relative position, range data, and image data to the server over a TCP/IP network, and the server returns the robot path and location. As a candidate standard for navigation, the speaker presented data structures for mapping, localization, and motion.
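The exchange described above could be captured by message structures along these lines (field names and types are purely illustrative; the actual data structures are those presented in robotics/2006-02-15):

// Hypothetical layout of the client/server navigation exchange.
#include <cstdint>
#include <utility>
#include <vector>

struct NavigationUpdate {                         // robot -> server, over TCP/IP
    double odom_x, odom_y, odom_theta;            // relative position estimate
    std::vector<float> range_scan;                // range data
    std::vector<std::uint8_t> image_jpeg;         // compressed camera image
};

struct NavigationResult {                         // server -> robot
    double pose_x, pose_y, pose_theta;            // localized pose in the map
    std::vector<std::pair<double, double>> path;  // planned waypoints
};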

15 February, Wednesday

AM Plenary (Tetsuo Kotoku, presiding co-chair)

Presentation - RFI Response: "Response from AIST" - Olivier Lemaire (AIST) (robotics/2006-02-16)

Olivier Lemaire introduced the use cases of robotics technology in the Ubiquitous Functions Research Group, AIST, and showed the needs for standardization of robotics technology. "Robotic Space" is a robotic room for daily life support, which uses distributed wireless active RFID. The "Librarian Robot" has a minimal on-board processing system and takes full advantage of infrastructure, such as ID tags, to do many usual robotic tasks. The system is constructed by integrating an RT Middleware infrastructure and a Web service infrastructure. Standardization is needed to re-use technology, so that new robots can be developed faster and cheaper by integrating COTS technologies, and to manage the complexity of the broad and complex field of robotics. Standards are needed at all levels of abstraction: the model layer, the platform layer, the implementation layer, and the hardware layer. He also introduced Japan's joint national project on Robot Technology Middleware.

Presentation - RFI Response: "Current State of Robotics Script/Control Language Standards" - Lloyd Spencer (CoroWare) (robotics/2006-02-17)

Robotics interfaces can be categorized as programmatic interfaces, protocol interfaces, and scripting and control interfaces. All of these interfaces can be defined using XML Web Services. XML Web Services should be considered a viable model for local resource interfaces (arms, motors, sensors), local robotic function interfaces (localization, mapping, vision recognition), and remote robotic function interfaces (higher-level navigation functions and tactical objectives). The presenter requested that OMG create an RFI for a Robotics Markup Language.

Special Talk: “Lessons Learned About Software for Rescue Robots” – Matt Long (iSSRT, Univ. of South Florida) (robotics/2006-02-18) Matt Long of USF talked about a robotic software architecture design based on their experience on rescue robots. He listed desirable characteristics of an architecture of a distributed field robot, including incorporation of reactive and deliberative components, fault tolerance, adaptability in the face of changing operating conditions. To provide a consistent programming model, their Distributed Field Robot Architecture uses Java and Jini. DFRA has benefits of flexibility and critical capabilities owing to the robust middleware employed, while it has tradeoffs in complexity, performance overhead, and steeper learning curve.

Hung Pham, presiding co-chair

Presentation - RFI Response: "Development Framework for Mobile Robot based on JAUS and RT-Middleware" - Wataru Inamura (IHI) (robotics/2006-02-19)

IHI's approach to a standardized robotic architecture is to use two existing open architectures, JAUS and OpenRTM-aist. JAUS defines application-level specifications for a mobile robot, such as the software structure, the assignment of functions to components, the messages between components, and the behavior of a component, while OpenRTM-aist implements the components and the message communication. A JAUS component is mapped to an RT-Component and JAUS messages are sent through the RTC InPort and OutPort objects. The state machine of a JAUS component is mapped to the RTC state machine. Two examples of mobile robot systems using the proposed framework were presented.
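Schematically, the mapping could look like the sketch below (the class names, the port template, and the on_execute hook are hypothetical stand-ins for illustration, not the actual OpenRTM-aist or JAUS APIs): a JAUS message is carried as the data type flowing through an RT-Component's OutPort.

#include <cstdint>
#include <iostream>
#include <vector>

struct JausMessage {                     // one serialized JAUS message
    std::uint16_t command_code = 0;
    std::vector<std::uint8_t> payload;
};

template <typename T>
struct OutPort {                         // stand-in for an RT-Component data port
    void write(const T& m) { std::cout << "OutPort: message 0x" << std::hex << m.command_code << "\n"; }
};

class JausAsRtComponent {                // one JAUS component wrapped as one RT-Component
public:
    void on_execute() {                  // periodic execution step of the RTC
        JausMessage report;
        report.command_code = 0x4002;    // illustrative value only, not a real JAUS ID
        out_.write(report);              // JAUS message travels through the OutPort
    }
private:
    OutPort<JausMessage> out_;
};

int main() { JausAsRtComponent c; c.on_execute(); }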

Presentation – RFI Response: “An overview of PIM & PSM for SWRadio Components specification” – Jerry Bickle (PrismTech) (robotics/2006-02-20) SWRadio specifications include UML profile for SWRadio and SWRadio facilities. The UML Profile for SWRadio consists of the Generic Component Framework, a rich set of semantics for component and container based development, and Communication Channel & Equipment, which defines basic types and properties for the communication channel and for each specific equipment. The SWRadio Facilities are optional interfaces for component behavior extensions, such as common layer for PDUs, error control, or flow control, networking layer, physical layer, and radio control. Robotics may use these facilities as appropriate and supplement these facilities with their own that are specific to the robotics domain.

Presentation - RFI Response: "Response from Compare Project" - Virginie Watine (THALES) (robotics/2006-02-21)

Virginie Watine gave a talk on a component/container model for real-time/embedded software defined by the Compare project. The model is based on Lightweight CCM with adaptations specific to RT/E systems. To isolate timing properties, the concept of an Activity, which represents an execution path, is introduced. For interaction between components, Connectors, which capture interaction logic in a fully reusable manner, are used. In the robotics domain, a profiling for robotics could be defined, such as dedicated Connectors, Container Services, and Predefined Components.

Presentation - RFI Response: "Robot Server Middleware: CAMU" - Seung-ik Lee (ETRI) (robotics/2006-02-22)

The basic concept of the URC consists of a ubiquitous sensor network, a high-performance computer, and a hardware robot. The URC server expands the robot's functions and services, improves context-awareness, and enhances the robot's intelligence. The server system, robots, and devices communicate through the PLANET framework. PLANET is based on remote method invocation and has a lightweight protocol to minimize the communication load.

Presentation - RFI Response: "Toshiba's approach to RT standardization and where the standardization is needed" - Fumio Ozaki (Toshiba) (robotics/2006-02-23)

Toshiba has started to adopt object-oriented technology for robot software development. The idea of the Open Robot Controller Architecture is to use standard IT, concentrate on robot control, and define the framework and standard APIs. Based on lessons learned from their experience, they use Python as the scripting language, HORB as the networking middleware, and UPnP for connecting home appliances. The presenter pointed out several issues that need to be discussed. He also stated that high-level technologies that should be developed in the future, for example AI, are not candidates for standardization, because a standard should not prevent the development of new technologies.

Tetsuo Kotoku, presiding co-chair

Summary of RFI responses and working group discussion (robotics/2006-02-24, -25)

Olivier Lemaire reported the results of the survey on possible working groups, gathered by email just before this meeting. There were answers from 5 organizations in Japan, 5 in Korea, 2 in the US, 1 in France, and 1 in India. The survey showed that the need for standards and the intention to participate are high, and that robotic system infrastructure received the highest score.

Based on the survey, four working groups were defined, with the volunteers for each potential working group listed as:

Robotic Infrastructure WG: Rick Warren (RTI), Noriaki Ando (AIST)
Robotic Service WG: Soo-Young Chi (ETRI), Olivier Lemaire (AIST)
Robotic Tools WG: Abheek Bose (ADA Software)
Robotic Profiles WG: Seung-Ik Lee (ETRI), Bruce Boyes (Systronix)

The working groups should start discussion by email, and the founding chairs should report on the mission statements and the permanent chairs at the next technical meeting.

Motion: To recommend an extension of the submission date for the Robotics Systems RFI (mars/05-06-12) to 3 April 2006, 3 weeks before the St. Louis meeting.
Moved by: Yun-Koo Chung (ETRI)
Second: Olivier Lemaire (JARA)
Established Task Force quorum of 3.
Discussion: none
WB proposed by: Masayoshi Yokomachi (NEDO). No objection to WB.
PASSED

Publicity Activity (robotics/2006-02-26)

Abheek Bose proposed to start a publicity activity to promote OMG robotics standardization, especially in Japan, the EU, India, and Southeast Asia. A motion to set up the publicity subcommittee was approved. The volunteer members of the subcommittee are:
Abheek Bose (ADA Software)
Olivier Lemaire (AIST)
Yun-Koo Chung (ETRI)
Masayoshi Yokomachi (NEDO)
The initial mission of the subcommittee is to make a 4-page flyer and web pages to invite people to the OMG Robotics DTF.

Volunteers to attend the RoboBusiness event being held in June were requested, but no organizations responded.

It was decided to propose a standardization workshop for IROS 2006 in October. Tetsuo Kotoku (AIST), Yun-Koo Chung (ETRI), and Makoto Mizukawa (Shibaura Institute of Technology) volunteered to become co-organizers of the workshop.

ETRI will host the URC Technical Cooperation Forum on March 9 in Seoul. Hung Pham (RTI) and an OMG staff member will give a talk on the robotics standardization activity.

Motion: To charter the Publicity Sub-Committee.
Moved by: Abheek Bose (ADA Software)
Second: Yun-Koo Chung (ETRI)
Established Task Force quorum of 3.
Discussion: none
WB proposed by: Masayoshi Yokomachi (NEDO). No objection to WB.
PASSED

Next Meeting Agenda Discussion

The tentative agenda of the next technical meeting was decided as follows:
Monday: WG meetings & Steering Committee
Tuesday-Wednesday: Plenary meeting
  - RFP progress report (MARS joint meeting)
  - RFI response presentations
  - Roadmap discussion
  - Contact reports

ADJOURNED @ 18:00

Participants (Sign-in)

Steering Committee: 13 February, Monday (23 participants) • Tetsuo Kotoku (AIST) • Yun Koo Chung (ETRI) • Hung Pham (RTI) • Yung-Jo Cho (ETRI) • Makoto Mizukawa (Shibaura Institute of Technology) • Soo-Young Chi (ETRI) • Masayoshi Yokomachi (NEDO) • Saku Egawa (Hitachi) • Tomoki Yamashita (Mayekawa MFG) • Fumio Ozaki (Toshiba) • Seung-Ik Lee (ETRI) • Wonpil Yu (ETRI) • Rick Warren (RTI) • Ho Chul Shin (ETRI) • Roy Bell (Raytheon) • Wataru Inamura (IHI) • Hiroyuki Nakamoto (SEC) • Masayuki Nagase (SEC) • Hideo Shindo (NEDO-DC) • Olivier Lemaire (JARA) • Takashi Suehiro (AIST) • Noriaki Ando (AIST) • Abheek Bose (ADA Software)

Plenary: 14 February, Tuesday (26 participants) • Tetsuo Kotoku (AIST) • Yun Koo Chung (ETRI) • Hung Pham (RTI) • Claude Baudoin (Schlumberger) • Takashi Suehiro (AIST) • Soo-Young Chi (ETRI) • Masayuki Nagase (SEC) • Hiroyuki Nakamoto (SEC) • Masayoshi Yokomachi (NEDO) • Takashi Tsubouchi (University of Tsukuba) • Roy Bell (Raytheon) • Ho Chul Shin (ETRI) • Mitola (MITRE) • Hideo Shindo (NEDO-DC) • Tom Anderson (Objective Interface) • Rick Warren (RTI) • Olivier Lemaire (JARA) • Noriaki Ando (AIST) • Wonpil Yu (ETRI) • Abheek Bose (ADA Software) • Wataru Inamura (IHI) • Saku Egawa (Hitachi) • Fumio Ozaki (Toshiba) • Yung-Jo Cho (ETRI) • Seung-Ik Lee (ETRI) • Roger Burkhart (Deere & Company)

Plenary: 15 February, Wednesday (34 participants) • Tetsuo Kotoku (AIST) • Yun Koo Chung (ETRI) • Hung Pham (RTI) • Saku Egawa (Hitachi) • Fumio Ozaki (Toshiba) • Abheek Bose (ADA Software) • Ho Chul Shin (ETRI) • Takashi Suehiro (AIST) • Masayoshi Yokomachi (NEDO) • Jerry Bickle (Prismtech) • Matt Long (iSSRT) • Juergen Boldt (OMG) • Duame Clarkson (Deere & Company) • Gerardo Pardo (RTI) • Feed Waskviewicz (OMG) • Roger Burkhart (Deere & Company) • Claude Baudoin (Schlumberger) • Shinobu Koizumi (Hitachi) • Noriaki Ando (AIST) • Hideo Shindo (NEDO-DC) • Takashi Tsubouchi (University of Tsukuba) • Wataru Inamura (IHI) • Yung-Jo Cho (ETRI) • Seung-Ik Lee (ETRI) • Hiroyuki Nakamoto (SEC) • Masayuki Nagase (SEC) • Tomoki Yamashita (Mayekawa MFG) • Lloyd Spencer (CoroWare) • Rick Warren (RTI) • Soo-Young Chi (ETRI) • Tom Anderson (Objective Interface) • Olivier Lemaire (JARA) • Jeff Simith (IDTS) • Virginie Watine (Thales)

Prepared and submitted by Saku Egawa with the assistance of Soo-Young Chi and Olivier Lemaire. robotics/06-04-07

Robotics-DTF/SDO-DSIG Plenary Meeting

April 24, 2006 St. Louis, MO, USA Hilton St. Louis Airport Poplar

Approval of Tampa Minutes

• Ask for a volunteer (minutes taker) – Hung Pham – Yun-Koo Chung

• Tampa Minutes review
  [Robotics] The first plenary of the Robotics-DTF. The deadline of the Robotic Systems RFI was re-extended. We had 1 special talk (Matt Long, USF) and 14 RFI response presentations. Four groups were made for WG activities.
  [SDO] Review of the initial submissions of the Robot Technology Components RFP.

Review Agenda - Tuesday, April 25, 2006 (Poplar)

10:05-10:20 Welcome and Review Agenda
10:20-11:20 Special Talk: Prof. Chris Gill
11:20-12:00 RFI response presentation

13:00-14:00 Special Talk: Prof. Jean-Christophe Baillie
14:00-16:20 RFI response presentation
16:20-17:40 WG (Infrastructure)

Joint Meeting with MARS/RTESS - Thursday, April 27, 2006, 13:00-14:00 (Blanchette2)

Review Agenda - Wednesday, April 26, 2006 (Poplar)

08:30-09:50 RFI response presentation
10:40-11:40 WG Reports and Roadmap Discussion
11:40-11:50 Contacts Report
11:50-12:00 Next meeting, etc.
12:00 Adjourn

14:00-18:00 WG (Infrastructure) 16:00-18:00 WG (Service)

Joint Meeting with MARS/RTESS - Thursday, April 27, 2006, 13:00-14:00 (Blanchette2)

Document Number

robotics/2006-04-04  Final Agenda
robotics/2006-04-05  Steering Committee presentation
robotics/2006-04-06  Tampa Meeting Minutes [approved]
robotics/2006-04-07  Opening presentation
robotics/2006-04-08  Robotics-DSIG Roadmap
robotics/2006-04-09  Special Talk: Chris Gill presentation
robotics/2006-04-10  Hyun-Sik Shim presentation
robotics/2006-04-11  Special Talk: Jean-Christophe Baillie presentation
robotics/2006-04-12  Toshihiko Morita presentation
robotics/2006-04-13  Ho-Chul Shin presentation
robotics/2006-04-14  Soon-Hyuk Hong presentation
robotics/2006-04-15  Seok Won Bang and Y. H. Kim presentation
robotics/2006-04-16  Hiroyuki Nakamoto presentation
robotics/2006-04-17  Service WG activity report
robotics/2006-04-18  Profile WG activity report
robotics/2006-04-19  Infrastructure WG activity report
robotics/2006-04-20  Contact Report: KIRSF
robotics/2006-04-21  Contact Report: ORiN
robotics/2006-04-22  Robot Technology Components RFP Progress Report in MARS
robotics/2006-04-23  Summary Report of Robotic Systems RFI
robotics/2006-04-24  DTC Report Presentation
robotics/2006-04-25  Meeting Minutes - DRAFT

Publicity Activities

• 4-page fly sheet: draft by Abheek@ADA Software (Abheek@ADA Soft, Olivier@AIST, Chung@ETRI, Yokomachi@NEDO)

Action: Send each organization's logo to Abheek.

The 4-page fly sheet will be authorized in Boston.

Publicity Activities

• Korea-Japan RSW2006 - Friday, June 16, 2006, Jeju Island, Korea (Chung@ETRI)
• RoboBusiness2006 - June 20-21, 2006, Pittsburgh, PA, USA, http://www.robobusiness2006.com/ (Jon Sigel and Bruce@Systronix)
• IROS2006 Workshop - October 9-15, Beijing, China, http://www.iros2006.org/ (Kotoku@AIST, Chung@ETRI, Mizukawa@Shibaura-IT)
• SICE-ICASE International Joint Conference - October 18-21, Pusan, Korea, http://sice-iccas.org/ (Mizukawa@Shibaura-IT)

Next Meeting Agenda - June 26-30, 2006 (Boston, MA, USA)
Monday: RTCs RFP revised submission review [MARS]; Steering Committee
Monday-Tuesday, Thursday: WG activities
Wednesday: Robotics-DTF Plenary Meeting
  • Guest Presentation
  • WG reports & Roadmap discussion
  • Contact reports
  • Resolution

Roadmap for Robotics Activities (robotics/2006-04-08)

Planned milestones by meeting (Tampa Feb-2006, St. Louis Apr-2006, Boston Jun-2006, Anaheim Sep-2006, DC Dec-2006, then TBA Mar-2007 and Jun-2007):

• Robot Technology Components RFP (SDO model for the robotics domain) - In process: initial submission (Tampa), pre-review (St. Louis), revised submission (Boston), issue.
• SDO model for xxx domain - No plan: discussion, draft RFP, RFP.
• Charter on Robotics WG in SDO - Done, Oct-2004.

• Robotic Systems RFI [Robotics: Initial Survey] - In process: response presentations (Tampa, St. Louis), review of whitepaper, whitepaper.
• Flyer of Robotics-DTF [Publicity Sub-Committee] - In process: discussion, discussion, issue ver. 1.0.
• Localization Service RFP [Service WG] - In process: discussion, draft RFP, review RFP, RFP, initial submission.
• User Identification RFP [Service WG] - Planned: discussion, draft RFP, review RFP, RFP.
• Programmers API: typical device abstract interfaces and hierarchies RFP [Profile WG] - Planned: topic discussion, draft RFP, review RFP, RFP.
• Hardware-level Resources: define resource profiles RFP [Profile WG] - Planned: topic discussion, draft RFP, review RFP, RFP.
• Deployment and Configuration RFP [Infrastructure WG] - Planned: outline discussion, draft RFP, review RFP, RFP.
• etc. - Future: topic discussion, draft RFP, review RFP, RFP.
• Charter on WGs [Service, Profile, Infrastructure] - Done: grouping (Feb-2006), charters issued Apr-2006.
• Charter on Robotics TF - Done, Dec-2005.
• Charter on Robotics SIG - Done, Feb-2005.
• Robotics Information Day [Technology Showcase] - Done, Jan-2005.

[Presentation slides, robotics/2006-04-09: "Real-Time ORB Middleware: Standards, Applications, and Variations", Prof. Chris Gill (Washington University), 44 slides dated 4/25/2006. Most of the slide text did not survive extraction; the legible fragments are kept below.]

- WSOA example: an image server and client cockpit displays connected over a low-bandwidth radio link through transmission adaptation middleware. Collaborative research with Boeing, BBN, and Honeywell Technology Center, supported by Boeing/AFRL contract F33615-97-D-1155/0005 (WSOA).

- RT CORBA thread pool with lanes, code example:

// Define two lanes
RTCORBA::ThreadpoolLane high_priority =
  {10 /*Prio*/, 3 /*Static Threads*/, 0 /*Dyn Threads*/};
RTCORBA::ThreadpoolLane low_priority =
  {5 /*Prio*/, 2 /*Static Threads*/, 2 /*Dyn Threads*/};

RTCORBA::ThreadpoolLanes lanes(2);
lanes.length(2);
lanes[0] = high_priority;
lanes[1] = low_priority;

RTCORBA::ThreadpoolId pool_id =
  rt_orb->create_threadpool_with_lanes(1024 * 10,    // Stacksize
                                       lanes,        // Thread pool lanes
                                       false,        // No thread borrowing
                                       false, 0, 0); // No request buffering

- Figure: structure with embedded or bonded piezoelectric transducers, acoustic waves (kHz range).

- ORB memory footprint comparison (KB): Node - ACE 376, TAO 1800, nORB 567, optimized TAO 1738, optimized nORB 509; NodeRegistry - ACE 324, TAO 1778, nORB 549, optimized TAO 1725, optimized nORB 492. ACE costs 212 KB; nORB+ACE costs 345 KB; TAO+ACE costs ~1.7 MB; the Node application code alone costs 164 KB.

- Avionics scenario figure: Rate Generator, GPS, Airframe, Heads-Up Display.

- Component deployment and configuration measurements comparing CIAO and PRISM (home creation, component creation/activation, and connection setup times, bounded by a few milliseconds) and Static/Dynamic RT-CIAO vs. CIAO configuration-time plots.

- RT CORBA 2.0 dynamic scheduling and distributable threads, including the scheduling points:
  1. BSS - RTScheduling::Current::begin_scheduling_segment() or RTScheduling::Current::spawn()
  2. USS - RTScheduling::Current::update_scheduling_segment()
  3. ESS - RTScheduling::Current::end_scheduling_segment()
  4. send_request() interceptor call
  5. receive_request() interceptor call
  6. send_reply() interceptor call
  7. receive_reply() interceptor call
  plus thread-specific storage (TSS) emulation benchmarks and distributable-thread cancellation.

- Final slide: references and URLs (not recoverable from this extraction).

Robotics/2006-04-10

Communication protocol for the URC robot and server
A Response to the Robotic Systems RFI - April 2006

Hyun-Sik Shim, Telecommunication R&D Center, Applied Technology Lab., SAMSUNG ELECTRONICS CO., LTD.

Table of Contents
- Introduction to URC Infra System
- URC Protocol
- URC Application / Field Test
- Further Study

Basic Concept of URC

[Figure: a conventional robot integrates Sensing, Processing, and Action on board.]

The main approach of URC is to distribute these functional components through the network, and moreover, to fully utilize external sensors and external processing servers.

URC : Ubiquitous Robotic Companion

Structure of URC system

The URC server plays an important role with the URC infrastructure in providing functions of various technical components required by URC robots or clients.

URC system structure

[Figure: URC server system architecture. URC robots/clients (Linux, Windows, Win CE, JAVA APIs) connect to the URC Main server over the URC protocol; the server coordinates profiles and services such as ASR, TTS, dialog, face recognition, motion detection, action/emotion, authentication, SMS transmission, remote monitoring, the URC contents service and contents manager (RSS, sync, robot control/monitoring), a clustering service, e-mail (POP3), SIP/VoIP call service, CAMUS (Context Aware Middleware for URC System), and RTP/RTCP streaming with mobile encoding.]

URC Protocol

- The URC protocol is the protocol adopted by the component named "URC Main" within the URC server.
- It is mainly used by URC robots and URC clients.
- Other server systems in the URC server group (e.g., CAMUS) must also use the URC protocol to access the main functions.
- The URC client/server communication protocol is an application-level protocol on top of TCP.

[Figure: the URC Protocol sits at the application layer of the TCP/IP stack, alongside Ping, HTTP, Telnet, FTP, traceroute, DNS, SNMP, and NFS, above TCP/UDP, IP (ICMP, IGMP), ARP/RARP, and the data link/media layers.]

URC protocol mechanism (1)

- URC framing: URC messages adopt a binary format for efficiency, and the data items contained in URC messages borrow the data format of UDR (URC protocol Data Representation).
- URC encoding: URC messages are encoded in "little-endian" format (a small encoding sketch follows the table below).

Table 1. UDR definition

TYPE        Corresponding C++ type   DESCRIPTION
byte        char                     1-byte integer or character
short       short                    2-byte integer
integer     int                      4-byte integer
float       float                    4-byte single-precision floating point
double      double                   8-byte double-precision floating point
string[N]   char[N]                  Fixed-length string data
string      char[]                   Variable-length char array
opaque[N]   char[N]                  Fixed-length opaque data (binary data)
opaque      char[]                   Variable-length opaque data
T[N]        T[N]                     Array[N] of type T
T           T[]                      Variable-length array of type T
structure   struct                   structure
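As a small illustration of the little-endian convention above (the field values and their order below are made up; the real message layouts are defined by the URC profiles, not reproduced here):

#include <cstdint>
#include <string>
#include <vector>

// Append a fixed-size unsigned integer in little-endian byte order.
template <typename T>
void put_le(std::vector<std::uint8_t>& buf, T value) {
    for (std::size_t i = 0; i < sizeof(T); ++i)
        buf.push_back(static_cast<std::uint8_t>(value >> (8 * i)));
}

// Append a fixed-length string[N] field, padded with '\0'.
void put_string(std::vector<std::uint8_t>& buf, const std::string& s, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        buf.push_back(i < s.size() ? static_cast<std::uint8_t>(s[i]) : 0);
}

int main() {
    std::vector<std::uint8_t> msg;
    put_le<std::uint16_t>(msg, 0x0102);      // "short"  -> bytes 02 01 on the wire
    put_le<std::uint32_t>(msg, 178);         // "integer" (4-byte)
    put_string(msg, "ROBOT01", 16);          // "string[16]" fixed-length field
}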

URC protocol mechanism (2)

- URC authentication: Every URC robot and URC client passes through an authentication process to be identified as a robot or a user and to be granted the necessary rights. A pre-registered ROBOT ID (MAC address) identifies URC robots, while URC clients authenticate themselves with a user ID and password.
- URC robot ACK (acknowledgement): Most jobs done by robots, such as moving, gestures, or TTS speaking, are relatively time-consuming. The server must be notified of the start and the end of each job in the form of events, so that the URC robot stays synchronized with the other functions. The URC server then matches the synchronization of the jobs through these ACKs.

[Figure: robot authentication process]

URC protocol mechanism (3)

- HB (Heartbeat) mechanism: URC robots start their connection and keep it for as long as the robot is connected to the URC server. QoS is not guaranteed on a commercial network, so the connection can be lost in abnormal network environments. In that case it is important to detect the situation quickly and react in the right direction. The URC protocol therefore defines an HB (Heartbeat) protocol between the URC robot/client and the URC server to manage abnormal situations, i.e., to monitor the connection between the URC server and the URC robots on a consistent basis (a rough sketch of the robot-side loop follows below).
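A minimal sketch of the robot-side heartbeat loop described above (the 5-second period, the message content, and send_heartbeat() are assumptions; the actual HB message is defined by the URC protocol specification):

#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

std::atomic<bool> connected{true};

void send_heartbeat() {                  // placeholder: would write a URC HB message to the socket
    std::cout << "HB ->\n";
}

void heartbeat_loop() {
    while (connected) {
        send_heartbeat();                                      // report liveness periodically
        std::this_thread::sleep_for(std::chrono::seconds(5));  // period is an assumption
    }
}

int main() {
    std::thread hb(heartbeat_loop);
    // The server side would declare the robot disconnected after several missed beats.
    std::this_thread::sleep_for(std::chrono::seconds(12));
    connected = false;
    hb.join();
}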


URC message structure

- A URC message consists of a URC common header and a URC message body (payload).
- URC message formats: URC Heartbeat Message, URC Request Message, URC Response Message, URC Event Notification Message.
- The URC heartbeat message is periodically exchanged between the URC robot and the URC server to handle abnormal situations found in both wireless and wired connections.

URC Profile

- URC Profile (total: 178 messages): the URC profile is the unit of function of the URC robot and the URC server in the URC infrastructure. A profile consists of the functions supplied by the profile and its events.
- URC Server Profiles (61 messages) provide the interface for server-side functions such as the URC server's recognition functions (e.g., speech or image recognition): System and Authentication profiles; Remote and Event profiles; ASR (Auto Speech Recognition) and TTS profiles; Face Recognition and Motion Detection profiles; Localization and Contents profiles; SMS and Real-time Recording profiles; Mpeg4 Play and VOD Play-list profiles; Reservation Recording profile.
- URC Common Robot Profiles (68 messages) support the interface of functions that all URC robots should provide: System and Move profiles; Navigation profile; EPD (End Point Detection) and ASR profiles; Sound and Vision profiles; Motion and Sensor profiles.
- URC Robot Specific Profiles (49 messages) provide the interface of a robot's characteristic functions: Text Display and Possible Output Notification profiles; Action-Emotion and Service Notification profiles.

URC robots using the protocol

APIs using the URC protocol are provided on Linux, Windows, Win CE, and JAVA platforms.

[Figure: URC hardware and software robots on JAVA, Linux, Windows, and Win CE platforms connected to the URC Main and CAMUS servers, with watchdog processes.]

URC Field Test

2005.10.1-12.31: 64 households in the Seoul/Kyung-gi provinces.

[Figure: field-test network topology - URC servers reached through KOREN and the BcN access networks (Octave BcN, Gwanggaeto BcN, UbiNet BcN, and Cable BcN).]

• Three types of robot systems using the URC protocol, as well as the server, have been installed and are carrying out services over BcN.
• BcN network: VDSL, Cable, and FTTH networks.
• 5 companies: Samsung Elec., Yujin Robotics, Hanwool Robotics, IOTeK, Izirobotics.

t–ŒGOXPGT |yjGmŒ“‹G{Œš›

‰ mŒ“‹G{Œš›Gt–ŒGG

ͥ͢ t–ŒOYPGT {Œš›šGœš•ŽG|yjG™–‰–›š

~piyvG– ~™Œ“ŒššGi™–ˆ‹‰ˆ•‹Gp•›Œ™•Œ› ͦ͢

|yjGw™–›–Š–“Gz›ˆ•‹ˆ™‹¡ˆ›–•

‰ There are two URC protocol standardizations, which were submitted to TTA (Telecommunication Technology Association of Korea) standardization workgroup PG413 (Intelligent Service Robot Project Group), as standardizations. „ URC Client/Server transport protocol „ Payload message format of URC Client/Server transport protocol

ͧ͢ mœ™›Œ™Gz›œ‹

‰ We plan to improve URC protocol based on URC experimental field services results, and expand the profiles for increasing number of new services and robots. „ UDP/HTTP „ Cooperation protocol for multi-robots „ Fault-tolerant Communication ‰ We plant to develop standard Interface between Server and Robot supported by various languages (JAVA, C/C++, C#)


Thank you

Universal Real-time Behavior Interface (Robotics/2006-04-11)

URBI: a Universal Platform for Personal Robotics
Jean-Christophe Baillie, ENSTA / Gostai
Aldebaran Robotics

Personal Robotics in 2006

We are at the very beginning of « Personal Robotics »: leisure robots, companion-robots, assistant-robots, vacuum cleaner robots, medical support or assistance to senior citizens…

Strong similarities exist with the beginning of the Personal Computers industry in the 1980's: huge potential. No universal platform currently exists as a standard to control these robots. There is a need.

PLAN
- What is URBI?
- Technical Part: inside URBI
- Components as Objects
- Usage examples
- Current status & Conclusion


WHAT IS URBI?

URBI Key Features

URBI is a complete solution to control robots. The base of the system is a new Interface Language.

Simplicity

Easy to understand, but with advanced capabilities for demanding applications. URBI is used by research labs as well as by 12-year-old kids as a hobby.

Flexibility

Independent of the robot, OS, platform, interfaced with many languages (C++, Java, Matlab…), Client/Server architecture.

Modularity

Software components can be transparently plugged into the language to extend it, as new internal objects or as external objects running on different computers (DOM). The user does not see the difference.

Parallelism

Parallel processing of commands, concurrent variable access policies, event-based programming, task scheduling, ... many new powerful concepts oriented towards parallel programming and ...

URBI Engine: based on a client/server approach

(Diagram: the URBI engine (server) runs on Mac OS X, Windows, Linux, …, either onboard the robot or on a separate machine such as a super calculator; clients send URBI commands and receive messages, and modules can be plugged in or run remotely.)

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 6/32 Frequently Asked Questions

Why a script language? C++ is better and faster!

C++ (or Java, Matlab…) is strongly interfaced with URBI with C++ components. URBI orchestrates several objects in a flexible and dynamic reconfigurable way, at runtime, with almost no speed overhead. CPU demanding code should be in C++ objects, not in URBI. The script language is the glue between components, and that’s the way of modern modular programming.

Why yet another script language? What about python?

URBI is different from Python or Lua, because it integrates parallelism and event-based programming in the language semantics. This is a fundamental innovation and a need for complex AI & Robotics. It also trivially allows distributed remote objects with its client/server architecture.

Why a central server? What if it fails?

There are already many central systems that the system depends on: the OS, the hardware drivers. The key point is to have a robust and bullet-proof system, plus a recovery mechanism in the (rare) case of kernel panic. Think about apache or X. Being central (+script based) brings powerful ways to control your objects and to have them cooperate in a flexible manner.

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 7/32

TECHNICAL PART: Inside URBI

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 8/32 Objects to control hardware

(Diagram: robot devices such as micro, headPan, legRF3, legR3 and gripL.)

URBI can be seen as a driver: every sensor, motor, camera or physical piece of hardware in the robot is an object.

An URBI object is similar to a C++ object: it has methods and properties. By convention, the val property is related to the device value

(telnet session, port 54000, with Aibo ERS7)
camera;
[145879854:notag] BIN 5347 jpeg 208 160
############ 5347 bytes ############
headPan = 15;    // or headPan.val = 15
headPan;
[136901543:notag] 15.1030265089
speaker = bin 54112 wav 2 16000 16;
###### 54112 bytes ######
accelX;
[136901543:notag] 0.002938829104

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 9/32

Messages and tags

All messages from the server are composed with the same standard structure:

the command to the server: headTilt;
the message from the server: [136901543:notag] 15.1030265089

Time stamp (ms) Command tag Message content

mytag: headTilt;
[136901543:mytag] 15.1030265089

custom tag: useful to know who is sending what and to control running commands. Any command or group of commands can be prefixed by a tag. This is one of the most powerful features in URBI, crucial to handle parallelism properly
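If a client needs to handle these replies programmatically, a small helper like the hypothetical C++ sketch below can split a line of the form [timestamp:tag] content; it is an illustration only and not part of liburbi.

// Hypothetical parser for an URBI server reply such as "[136901543:mytag] 15.1030265089".
#include <cstdint>
#include <optional>
#include <string>

struct UrbiReply {
    uint64_t timestampMs;  // server time stamp in milliseconds
    std::string tag;       // command tag ("notag" if none was given)
    std::string content;   // message content (value, binary header, ...)
};

std::optional<UrbiReply> parseReply(const std::string& line) {
    if (line.empty() || line.front() != '[') return std::nullopt;
    const auto colon = line.find(':');
    const auto bracket = line.find(']');
    if (colon == std::string::npos || bracket == std::string::npos || colon > bracket)
        return std::nullopt;
    UrbiReply r;
    r.timestampMs = std::stoull(line.substr(1, colon - 1));
    r.tag = line.substr(colon + 1, bracket - colon - 1);
    r.content = line.substr(bracket + 1);
    if (!r.content.empty() && r.content.front() == ' ') r.content.erase(0, 1);
    return r;
}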

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 10/32 Advanced Tagging

stop / block / freeze

mytag: {
  command1;
  if (distance < 50) sometag: command2;

  while (index < 10) { ping; index++; };
  ...
};

From another client or from other commands running in parallel:
stop mytag;       // stops the commands with tag "mytag"
block mytag;      // kills any new command with tag "mytag"
unblock mytag;
freeze mytag;     // freezes any running or new command
unfreeze mytag;
[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 11/32

Parallelism

Commands can be executed in serial or parallel mode:
headPan = 15 & neck = 30;    Set headPan to 15 and neck to 30 at the same time
headPan = 15 | neck = 30;    Set headPan to 15 and, after, set neck to 30.

Operators , and ; are also available and have semantics identical to & and | except that they have looser constraints:

A | B : B.start == A.end       A ; B : B.start >= A.end (a gap is allowed)
A & B : B.start == A.start     A , B : B.start >= A.start (a gap is allowed)

NB: Brackets can be used to group commands, like in C:

{ headPan = 15 | headTilt = 23 time:1000 } & neck = 10;
[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 12/32 Complex Assignments

Simple assignment:

headPan = -2;

Numerical assignments can be specified via “modifiers”:
headPan = 15 time:5s;
headPan = 15 speed:0.34;

(Plots: trajectories of headPan from -2 to 15 under the time:5s, speed:0.34 and accel:0.02 modifiers; with accel the command never terminates on its own.)

headPan = 15 accel:0.02;
headPan = -2 sin:1s ampli:3,    // the trailing comma puts the command in background

Any function can be assigned as time parameterized trajectory with the function modifier (v2.0):

headPan = function(t) : sqr(t)+sin(3*t+pi);
[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 13/32

Blending modes

Conflicting assignments can occur from several clients or inside the same program. x=1 & x=5

How to handle it? Blending modes:

variable->blend = …

variable "property"

NB: this is also true for sound devices => simple multiplexer
neck.val->blend = add;       Each assignment occurs at the same time and is added to the others => used to superimpose sinuses in a Fourier decomposition
neck.val->blend = mix;       Like add, but the assignments are averaged instead of added
neck.val->blend = queue;     Each assignment occurs only when the others are finished
neck.val->blend = discard;   Each conflicting assignment is ignored
neck.val->blend = cancel;    Each new assignment terminates any other pending assignment
neck.val->blend = normal;    The latest assignment has the focus, but the others run in background
[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 14/32 Objects: OOP and Broadcasting

Usual OOP features are available; subclassing and multiple inheritance are possible:

class motor {
  var val;
  function switchon();
  event overheat;
};
legRF1 = new motor("xx24");
legRF2 = new motor("xx27");
legRF3 = new motor("xx789");

// broadcast grouping
group legRF { legRF1, legRF2, legRF3 };
function motor.switchon() { echo "on "+val; };

legRF.switchon();    // broadcast switchon(): URBI multiple parallel launch going downward,
                     // equivalent to legRF1.switchon() & legRF2.switchon() & legRF3.switchon();
legRF2.switchon();   // gets motor.switchon(): usual virtual method search going upward

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 15/32

C-like Features

Function definition: functions can be defined on devices or virtual devices. Control structures: standard control structures are available, and some more specific to URBI.

function robot.walk (x,y) { … /* walk code */ };
function add (x,y) { return x+y };      // no semicolon
function fibo(n) {
  if (n<2) return 1
  else {
    a = fibo(n-1);
    b = fibo(n-2);
    return a+b
  }
};

usage:
robot.walk (14,255);
myresult = fibo(10);
robot.process_mystring("bonjour");

// the classical for loop
for (i=0;i<10;i++) echo i;
// soft tests: must be true for 3ms
while (headsensor > 0) { instructions… }
// loop 10 times
loopn (10) legLF1 = legRF1;
// Funny average calculation with for&
avg = 0;
avg->blend = mix;
for& (i=0;i<10;i++) avg = tab[i];

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 16/32 Event catching

Several event catching mechanisms are available:

at (test) { instructionsA; };
at (test) { instructionsA; } onleave { instructionsB; };
(instructionsA run when test becomes true; the onleave block runs instructionsB when it becomes false again)

whenever (test) { instructionsA; };
whenever (test) { instructionsA; } else { instructionsB; };
(instructionsA keep running while test is true; the else block runs instructionsB while it is false)

waituntil (test);    Terminates only when test becomes true.
usage: waituntil (test) | instructions…
=> If given a number, the wait command pauses for this number of ms.

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 17/32

Event catching (2)

You can control the lifespan of a command:

timeout (time) command;     The command will be executed until time is over.
// example (runs in the background):
timeout(10s) robot.walk();

stopif (test) command;      The command will be executed until the test becomes true.
// example (runs in the background):
stopif(headSensor) loop legRF1 = legLF1;

freezeif (test) command;    The command will be executed until the test becomes true, then it is frozen;
                            when the test becomes false again, it is unfrozen.
// example (also runs in the background):
freezeif(!ball.visible) balltracking();

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 18/32 Event catching(3)

You can emit your own events with the emit function:

emit myevent;                  This creates a spiking event, with or without parameters.
emit myevent (1,"hello");

at (myevent) ...               You can catch events with a simple test. If there are parameters,
whenever (myevent (x,y)) echo x+y;   you get them together with the event and you can filter on
at (myevent (1,s)) echo s;           the basis of those parameter values.
...

emit(2s) myevent;              This adds a duration to the event. Possibly, no time limit:
emit() myevent (1,"hello");

every(2s) commands;            The every command starts the command at given time intervals.
every(2s) emit myevent;        It can be used to create "pulsing events".

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 19/32

Multicore Integration
Automatic load balancing of parallel commands on a variable number of cores and threads, with real-time scheduling capabilities (currently in development, v.2 only).

(Diagram: URBI code from several clients becomes URBI commands, i.e. micro-threads (logical threads); the URBI kernel scheduler maps them through the network layer onto the physical threads and cores of the OS/hardware, with synchronisation between them.)

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 20/32 COMPONENTSCOMPONENTS

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 21/32

URBI as a Middleware

The creation of modules extends the objects available in the robot by plugging external C++ URBI classes into the URBI language (Java and other language plugins are in progress), with URBI as a central hub.
Examples of URBI modules:
ball  => ball.x, ball.y, ball.visible
voice => voice.say(“hello”), voice.hear(x)
ball = new objectDetector(100,23,123, …);

Offboard: URBI modules, CORBA modules or other modules running off the robot, connected to the URBI server.

Onboard: URBI modules, CORBA modules or other modules running on the robot next to the URBI server.

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 22/32 URBI as a Middleware (2)

Proxy: a proxy server mirrors the URBI server, so that URBI modules, CORBA modules or other modules connect through the proxy.

Kernel plugins: integrated URBI modules plugged directly into the URBI server.

With the UObject Architecture: the same C++ code for all integration possibilities [email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 23/32

How to use URBI?

telnet or urbilab client:
headPan.val = 15;
headPan.val;
[136901543:notag] 15.1030265089
...

C++ client:
// C++ code with liburbi C++
main() {
  UClient * client = new UClient("myrobot.ensta.fr");
  int pos;
  pos = complex_calculation(x,y);
  client->send("headPan.val = %d;", pos);
}

Java client:
// Java code with liburbi Java
import liburbi.UClient;
robotC = new UClient(robotname);
robotC.send("motor on;");
robotC.setCallback(image, "cam");

Remote/plugged C++ module (other integrated clients: matlab, python, …):
// C++ object inheriting from UObject
UStart(ball);
class ball : UObject {
  ball(string);
  UVar x, y;
  ...
};

The URBI server receives simple commands, function definitions and complex scripts through liburbi; URBI.INI holds the onboard scripts.

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 24/32 Plugged components extend the objects available in the system, and URBI is used to control and coordinate them in a parallel, event-driven way.

The architecture is open, we can integrate CORBA or other DOM, interface objects from many languages (C++, Java, Matlab, python…), adapt to existing standards or push towards their creation.

URBI aims at being a unifying tool bringing flexibility

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 25/32

USAGE EXAMPLES

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 26/32 Examples

(Diagram: camera, ball detection component, headPan, legR3.)

// Ball tracking program:
whenever (ball.visible) {
  headPan  = headPan  + camera.xfov * ball.x &
  headTilt = headTilt + camera.yfov * ball.y
};

// Get up on the Aibo
getup: {
  { leg2 = 90 time:2s & leg3 = 0 time:2s } |
  leg1 = 90 time:1s |
  leg2 = 10 time:1s |
  { leg1 = -10 time:2s & leg3 = 90 time:2s }
};

// Event detection
at (headSensor ~ 2s) speaker.play("hello.wav");
at (distance < 40) emit collision;
...

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 27/32

Behavior example

This example shows how to write behavior graphs with URBI:

(State diagram: Track ball and Search ball states; the robot switches to Search when ball.visible == false (speaker = lost) and back to Track when ball.visible == true (speaker = found).)

// Tracking state
function tracking() {
  whenever (ball.visible) {
    headPan  = headPan  + camera.xfov * ball.x &
    headTilt = headTilt + camera.yfov * ball.y
  }
};

// Searching state
function searching() {
  period = 10s;
  { headPan'n = 0.5 smooth:1s & headTilt'n = 1 smooth:1s } |
  { headPan'n = 0.5 sin:period ampli:0.5 &
    headTilt'n = 0.5 cos:period ampli:0.5 }
};

// Transitions
track_transition:
at (ball.visible ~ 400ms) {
  stop search;
  speaker = found;
  track: tracking();
};

search_transition:
at (!ball.visible ~ 400ms) {
  stop track;
  speaker = lost;
  search: searching();
};

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 28/32 Finite State Machines (another way to do behavior graphs)

// Integrated style
function state2.init() {
  state2.tag1:
  at (cond1) { action1 | emit go_state4; stop state2 };
  state2.tag2:
  at (cond2) { action2 | emit go_state3; stop state2 };
  freeze state2.tag2;
  state2.main:
  loop { ... };
};
at (go_state2) state2: state2.init();
at (go_state4) state4: state4.init();

// Separated style, with a local event gate
function state2.init() {
  emit() in_state2;
  state2.main:
  loop { ... }
};
state2.tag1:
at (in_state2 && cond1) { action1 | state4: state4.init(); stop state2 };
state2.tag2:
at (in_state2 && cond2) { action2 | state3: state3.init(); stop state2 };

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 29/32

CONCLUSION

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 30/32 Summary

Key Benefits

• Simple to use by non-experts and experts alike, yet very powerful • Extensible and flexible • Strong industrial commitment: backward compatibility with new versions, open interfaces and protocols, active partnership policy to increase the number of URBI-compatible components • Innovative technology to handle parallelism • Already seven compatible robots, and the number keeps increasing • Community of users and reusability of components between different platforms [email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 31/32

Current status A spin-off from the ENSTA Lab (Paris, France) has been created to promote URBI:

We are looking for partners developing components, tools, who are integrating systems or developing standards.

[email protected] „ OMG Meeting St-Louis, April 2006 - Gostai 32/32 robotics/2006-04-12

Fujitsu’s Robotics Research and Standardization Activities

OMG Technical Meeting – Robotics DTF

April 25, 2006

Toshihiko Morita Fujitsu Laboratories Ltd.

All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006

Outline

„ Use of robotics technology „ Need for standardization „ Standardization activities „ Robot Services Initiative „ RT vision component

2 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006 Use of Robotics Technology

All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006

Robotics Research at Fujitsu: 1980s FA robots; 1990s robots for hazardous environments and outer space; 2000s human-friendly robots

Micro-arm ('81) Dual-armed robot ('93)

Nursing robots Humanoid robot ('98) ('00)

ETS-VII ('96) M6 ('83) Robot for nuclear power plants “Uncle Touch” HRP project ('83-'91) ('99) ('98-'03) 4 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006 - Latest robots (1) - Humanoids for Research Purposes: HOAP (Humanoid for Open Architecture Platforms) HOAP-1 (2001.9)

HOAP-2 (2003.8)

HOAP-3 (2005.7)

Tai Chi

Standing on its head 5 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006

- Latest robots (2) - Home Robot: MARON-1 (Mobile Agent Robot Of the Next-generation) At a remote location

Cell phone Home

Home camera

Appliance control Remote operation Intruder detection Camera visuals MARON-1 (2002)

6 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006 - Latest robots (3) - Service Robot “enon” (exciting nova on network) Assisting people in offices and public facilities

Various applications based on a common platform Guidance and escort Transport of objects

Security patrol

7 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006

Hardware Specifications

Devices: LEDs, 6 cameras, 4 microphones, LCD monitor, speaker, sensors
Degrees of freedom: head 2 DOF, arm 5 DOF, hand 1 DOF, wheels 2 DOF
Height 1300 mm, width 560 mm, weight 50 kg; speed max. 3 km/hr, load max. 10 kg
Nickel-Hydride battery (non-contact)

8 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006

Product Features and Market

„ Features „ Autonomous navigation using 3D vision „ User-friendly information provision via network • Voice, touch panel LCD monitor, and gesture „ Safety • Approved by Safety Engineering Lab (NPO) „ Application fields „ Shopping malls „ Exhibition and tourist facilities „ Airports „ Internet data centers (IDCs)

9 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006

10 All Rights Reserved, Copyright Fujitsu Laboratories Ltd. 2006 robotics/2006-04-13

Standardization of Device Interfaces for Home Service Robot

2006. 04. 24.

Embedded hardware component research team Intelligent robot research division ETRI

H.C. Shin

-1/41-

Table of Contents

ඟ Introduction ඟ Mission of My Team ඟ Standardization of Robot Device Interface ඟ Conclusion

-2/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Introduction

ඟ We are … ඞ Embedded hardware component research team / Intelligent robot research division / ETRI

-3/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Introduction

ඟ We are developing … ඞ LEGO-type embedded systems for low cost, popularized home service robot – F/W, Control S/W, BSP on Embedded Linux System ඞ Robot core chipsets and SoC (System-on-a-Chip) ඞ Network robot system integration ඞ Robot service technology (Robot telephone, Robot Videophone, Robot TV, …etc.)

-4/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Mission of Our Team

ඟWe are developing embedded systems for low cost, popularized intelligent home service robot ඟMIM_T (Multi-modal Interface Module_Tiny) : ඞFor Robot Application – Brain: Low-end Embedded MPU(350MHz), embedded linux – Eye: MPEG4/H.263 H/W Encoder (CIF, max.30fps) – Mouth: Narrow/Wideband Speech I/O – Ear: Sound localization from 8 channel microphones – Network I/F : Wireless LAN, WiBro (Future) – Interfaces for robot hardware devices(RS232, USB, LCD, Zigbee, CAN, etc.) ඞFor Videophone Application – Acoustic Echo Canceller for Loud Speaker Phone

-5/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Mission of Our Team

ඟMRM (Multimedia Retrieval service Module) ඞFor Robot Application – Brain: Middle-end Embedded MPU (500MHz), embedded Linux – Eye: JPEG (max. 10fps) – Expression: MPEG-2/4, DivX3/4/5, WMV9, H.263 H/W Decoder (DVD quality), Stereo MP3 Decoder – Network I/F: Wireless LAN, WiBro (future) – Interfaces for robot hardware devices (RS232, USB, LCD, Zigbee, CAN, etc.) ඞFor Internet Phone Application – VoIP: G.711 A/µ-law PCM 64 kbps – Acoustic Echo Canceller for Loud Speaker Phone

-6/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Mission of Our Team

ඟURC (Ubiquitous Robotic Companion) ඞFor low cost, popularized home service robot

Tele-operation server (reached over the Internet): navigation, face recognition, voice recognition, text-to-speech, multimedia content handling, etc.
Mobile robot client (connected through access points over wireless LAN): minimum embedded processor, minimum sensors, minimum actuators.

-7/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Mission of Our Team

ඟWever C1 ඞ Home Security with sensor network (provides video & audio stream, sensor information to remote user) ඞ MIM_T (Zigbee sensor network,wireless LAN) + 2 wheels

-8/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Mission of Our Team

ඟWever Prototype 1 ඞTele-operated intelligent mobile robot with minimum embedded processors

-9/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Mission of Our Team

ඟWever Prototype 1 hardware configuration

Embedded main board (MIM_T / MRM) with: wireless LAN IEEE802.11g (USB2.0/Ethernet); LCD display (VGA); CMOS camera OV9650 (RS232); arm controller and motors (RS485); pan-tilt controller (RS232) with motors (RS485); speaker and microphone (standard 3.5 mm); mobile base controller (USB2.0) driving the locomotion motor drivers and an IR sensor array controller over CAN2.0B.
-10/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Standardization of Robot Device Interface

Now: Personal Computer = A company CPU + B company graphic card + C company memory + etc.
Future: Home Service Robot = X company arm + Y company eye + Z company brain + etc.

-11/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Standardization of Robot Device Interface

ඟRobot device integration is important & hard

ඟWe want … ඞ Standardized robot devices, like general PC devices ඞ A robot device manager, like the Plug & Play Device Manager of Microsoft Windows
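As a rough illustration of the "device manager" idea, and assuming nothing about ETRI's actual design, a plug-and-play style registry could look like this C++ sketch:

// Illustrative robot device manager sketch (hypothetical, plug-and-play style).
#include <map>
#include <memory>
#include <string>

class RobotDevice {
public:
    virtual ~RobotDevice() = default;
    virtual std::string kind() const = 0;   // e.g. "camera", "wheel", "ir-sensor"
    virtual bool probe() = 0;               // detect and initialize the hardware
};

class DeviceManager {
public:
    // Register a device under a unique name, as a plug-and-play manager would.
    bool attach(const std::string& name, std::unique_ptr<RobotDevice> dev) {
        if (!dev || !dev->probe()) return false;
        devices_[name] = std::move(dev);
        return true;
    }
    RobotDevice* find(const std::string& name) {
        auto it = devices_.find(name);
        return it == devices_.end() ? nullptr : it->second.get();
    }
private:
    std::map<std::string, std::unique_ptr<RobotDevice>> devices_;
};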

-12/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Standardization of Robot Device Interface

Actuators, from heavy to tiny: arm & leg, pan & tilt, display, wheel, on/off actuator
Sensors: proximity sensor, vision camera, microphone, touch sensor, on/off switch
A robot needs various devices!

-13/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Standardization of Robot Device Interface

ඟRobot hardware devices can be classified into ඞGeneral PC devices (vision camera, audio I/O, LCD display, etc.) → they already have de facto standards ඞSensors and actuators → IEEE 1451

IEEE 1451, “Standard for a Smart Transducer Interface for Sensors and Actuators”: ඞ IEEE 1451.0 Protocols & Format ඞ IEEE 1451.1 Object Model ඞ IEEE 1451.2 Interface (now being revised for RS-232, RS-485 and USB) ඞ IEEE 1451.3 Local Network ඞ IEEE 1451.4 Analog & TEDS (Transducer Electronic Data Sheets) ඞ IEEE 1451.5 Wireless ඞ IEEE 1451.6 CANopen-based transducer network -14/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Standardization of Robot Device Interface

ඟ We are trying to develop PMI (Physical Media independent Interface) ඞ PMI classifies device interfaces ඞ Upper layer applications can access hardware devices through each standardized interface

Upper Layer Applications
Physical Media independent Interface
Video In/Out Interface | Sound In/Out Interface | Sensor In Interface | Actuator Out Interface | …
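A hedged C++ sketch of what such per-class PMI interfaces might look like is shown below; the interface and method names are invented for illustration and are not the actual PMI definitions.

// Rough C++ sketch of PMI-style per-class device interfaces (hypothetical names).
#include <cstdint>
#include <vector>

class VideoInInterface {
public:
    virtual ~VideoInInterface() = default;
    virtual bool readFrame(std::vector<uint8_t>& encodedFrame) = 0;  // one encoded frame
};

class SensorInInterface {
public:
    virtual ~SensorInInterface() = default;
    virtual double readValue() = 0;          // normalized sensor reading
};

class ActuatorOutInterface {
public:
    virtual ~ActuatorOutInterface() = default;
    virtual bool writeCommand(double setpoint) = 0;  // e.g. wheel speed, joint angle
};

// Upper-layer application code depends only on these interfaces,
// never on the physical media (USB, RS232, CAN, ZigBee, ...).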

-15/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Standardization of Robot Device Interface

Upper Layer Applications
PMI: Device Data Management Layer over a Device Connection Management Layer
General PC devices (standard PC interfaces such as VGA, RS232, USB): LCD display, vision camera, microphone, etc.
Devices based on Standard Smart Sensor & Actuator interfaces (IEEE 1451 over USB, RS232, CAN, ZigBee): sensors and actuators
All connected to the robot main board.

-16/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Standardization of Robot Device Interface

(Diagram: upper-layer applications access standard smart sensors and actuators, for example arm controllers, through the PMI and IEEE 1451 interfaces.)

-17/18- 㫴⏙䝉⦐⸻㜤Ạ␜

Conclusion

ඟ Robot hardware devices can be classified into ඞ General PC devices → de facto standard interfaces ඞ Sensors and actuators → IEEE 1451 interface

ඟ The suggested Physical Media independent Interface for home service robot devices can help robot developers and users

-18/18- 㫴⏙䝉⦐⸻㜤Ạ␜ Voice Interface Standardization Items for Network Robot in Noisy Environments

A Response to Robotic System RFI, April 2006

Telecommunication R&D Center Applied Technology Lab. SAMSUNG ELECTRONICS CO., LTD.

robotics/2006-04-14

Contents

1 Purpose of Presentation

2 Network Robot

3 Needs for Standardization

4 Standardization items

5 Conclusion

Y Purpose of Presentation

• To propose the needs for Robotic System Standards with an emphasis on Voice Interface.

- Necessity of enactment of standardization (for both network and standalone type of robots) - Description of the item list that can be standardized accordingly.

Z

Network Robot

• Network robots distribute their functions between the robot and the server connected through the network. - A robot alone is limited in the techniques, cost and resources it can provide. - External signals that robots receive are analyzed and responded to by the network robot servers. - The robot itself acts as the interface between the system and its users.

[ Network Robot using voice interface

• The most desirable interface is a voice interface (speech recognition). - Robots filter the user's voice signals (suppressing noise) and transfer them to the server, which carries out the recognition processing.

(Diagram: the voice signal passes through the robot's Noise Suppressing Module, is transmitted wirelessly to the Speech Recognition Module on the network server, and the recognition results return as commands/responses.)

Network Robot using voice interface

• The factors affecting the recognition capability are - background noises from numerous directions - neighboring human voice - noise from radio, TV.

(Figure: noise sources around the network robot: radio and TV, neighboring human voices.)

] Needs for Standardization

• The standardization of voice interface for robotic applications would

1. Reduce the uncertainties of robot’s voice recognition performances in noisy environment. 2. Prevent investment overlap and cut down on the production cost. 3. Make system alterations and functional addition easy, when a new technique is added or the performance is enhanced. 4. Help establish an improved system in a short period of time for mass production of robots.

^

Standardization items- overview

• The standardization below are required in order to produce effective robots.

_ Standardization items 1

Item 1. Mic and Array Characteristics for network robot - Stipulations for a microphone’s capability and directional distinctions to input the voice signal. - Beam-pattern for beamforming process for acquiring user’s voice - Optimal numbers and the locations of internally fitted microphones

tŠUGˆ•‹Gh™™ˆ Gjˆ™ˆŠ›Œ™š›Ššf

uœ”‰Œ™Gˆ•‹Gw–š›–•G–GtŠUGf

`

Standardization items 2

Item 2. Speech recognition performance guideline - Speech recognition rate and SNR improvement

(Table: sample performance evaluation sheet in AURORA2, 3 DB)

XW Standardization items 3

Item 3. Input/Output parameter for communication between server and network robot • Parameter format of voice input/output - voice codec parameter applied to PCM voice data • The feature extracting methods: - sampling rate standard of voice signals - feature extraction parameters (frame size, filter coefficient, step size) • The method of transmission between server and robot terminal: - standardization of framing, bit-stream composition, error protection - decoding/error correction of transmitted data to the server

Standard of ETSI advanced Feature extraction
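To make Item 3 concrete, the following C++ sketch shows one possible feature-frame message exchanged between robot and server; the field names and default values are illustrative assumptions loosely following the parameters listed above, not an adopted standard.

// Illustrative (non-normative) voice feature frame exchanged between robot and server.
#include <cstdint>
#include <vector>

struct VoiceFeatureFrame {
    uint32_t sampleRateHz = 16000;    // sampling rate of the captured voice signal
    uint16_t frameSizeSamples = 400;  // analysis frame size (e.g. 25 ms at 16 kHz)
    uint16_t stepSizeSamples = 160;   // frame step size (e.g. 10 ms at 16 kHz)
    std::vector<float> features;      // extracted feature vector for this frame
    uint16_t errorProtectionCrc = 0;  // simple error-protection field for transmission
};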

XX

Standardization items 4

Item 4. Resource portion for network robot - Stipulate the processing speed of voice signals of robots - Suggest upper bound of memory occupied by voice signal processing

(Figure: microphone directivity polar patterns at 400Hz, 800Hz, 1600Hz and 3200Hz.)

XY Conclusion

• The benefits and effects of technology development based on these standardized guidelines:

1. Application of the best possible solution through regular upgrades of the system. 2. Application and immediate use of various novel noise reduction techniques. 3. Simple maintenance and repair based on a standardized production system. 4. Establishment of an improved system in a short period of time for mass production of robots. 5. Standardized robots from different companies will be compatible with each other's network servers. 6. Fewer inferior robots produced, and a supply of quality-guaranteed robots. 7. A guaranteed speech recognition rate for the produced robots.

XZ

Thank you for your attention. robotics/2006-04-15

Home Robot Navigation in SAIT

Seok-Won Bang & Yeon-Ho Kim

APRIL 2006 Interaction Lab SAIT (Samsung Advanced Institute of Technology)

Brief History of SAMSUNG Home Service Robot : 1999~2003

‰ 1999.11: BANGGAR ‰ Speech Recognition/Synthesis, Face Recognition, Navigation using a camera mounted on the ceiling ‰ 2000.11: BANGGAR II

‰ Camera-phone, remote monitoring, BANGGAR(1999)

Multi-face Detection SHR-00(2002) ‰ 2002. 4: SHR-00 ‰ Vacuum Cleaning, Tele-presence ‰ 2002. 10 : APRIL ‰ Emotional Motion, Sound Localization ‰ 2003. 1 : SAEBOM (Software System) ‰ Aiming Dialogue Skill of 4 Years Old Children APRIL(2003) Brief History of SAMSUNG Home Service Robot : 2003~Present

‰ 2003.10: SHR-50 ‰ Navigation using Markers on Ceiling ‰ 2003.5 : CRUBO (VACUUM ROBOT) ‰ Navigation using Markers on Ceiling

‰ 2004.7: MOBILE AIR-PURIFIER ‰ Navigation including wall-following motion ‰ 2004.11: SHR-100* ‰ Navigation using natural image features on the ceiling ‰ Call & Come using a sound localization technique ‰ User following using a structured light sensor ‰ 2005.9~: PREMIUM VACUUM ROBOT ‰ Navigation using range sensors ‰ Precise wall-following motion for corner cleaning (photos: SHR-50 (2003), CRUBO (2003), MOBILE AIR-PURIFIER (2004))

* SHR100 is developed with Samsung Mechatronics Center SHR-100(2004)

CONTEXT-AWARE TECHNOLOGIES FOR HOME SERVICE ROBOT AT SAIT

Recognizing Position of Robots and Humans

Robot’s Speaker’s User’s Position Position Position

Self- Call & Come User-Following Localization

Based on the work in 2004 SELF-LOCALIZATION

Structured Light

‹SLAM : Simultaneous Localization And Map-building •Use only natural image features on ceiling without any artificial markers •Feature extraction : robust to light condition •Distance estimation : structured Laser light, low cost ‹Localization accuracy : position error < 15cm, orientation error < 3 degree

CALL & COME

‹ Call & Come exploits video and audio signals together •Stop Position : 0.5~1.0m in front of user (95%) •Speaker’s Orientation Detection: Analysis of audio signals from 8 microphones ‹Detection Range < 5m, Voice Level > 7dB ‹Human Detection : AdaBoost algorithm with video signals USER FOLLOWING

‹Use of Front camera and structured light ƒMaximum following speed : 1.5m/sec ƒControl distance : 1m ƒUpper body Tracking : particle filter and mean shift algorithm ƒLeg detection : arc pattern extraction from structured light sensor

LOCALIZATION & NAVIGATION FOR HOME SERVICE ROBOT AT SAIT

(Diagram: image feature points from the upper camera image build a feature map; the front camera image and structured light build a wall map; the two are merged into the final map.)

Context-aware Technologies for Home Robot

Thank You! robotics/2006-04-16

ITR - Internet Renaissance

䌾 The world‘s first humanoid robot to be harmonized with the family 䌾

Robotics DTF, OMG Technical Meeting - St. Louis, MO, USA, April 26, 2006

Copyright © 2006 Systems Engineering Consultants Co., Ltd. All rights reserved.

Table of Contents

Concept of ITR System / ITR System Overview / RTML Specification / (RSi Activities) / Conclusion

What's ITR? Walking Internet Radio!!

Robot developer "Speecys Corporation" and "SEC Corporation" have developed the humanoid robot "ITR" and the contents download system "ITR System" together in April 2006. ITR downloads contents such as TV / radio programs from the server and presents them to a user by its voices and movements. We have defined RTML, a protocol for downloading "Robot Contents".

Basic Concept

ITR = Internet Renaissance and Internet Robot

Creation of the robot entertainment that anyone can enjoy in a home!!

The World of ITR

(Diagram: the ITR Server delivers RTML over the Internet to the ITR robot, which sings and dances; a mobile phone is used to choose programs.)

RTML: Robot Transaction Markup Language

Choose and play your favorite ITR programs!!

Demonstration

Movie1 - Comic Story with a Witty Ending http://pc.watch.impress.co.jp/docs/2006/0404/speecys01.wmv

Demonstration

Movie2 – English Education http://pc.watch.impress.co.jp/docs/2006/0404/speecys02.wmv

Demonstration

Movie3 – Music & Dance http://pc.watch.impress.co.jp/docs/2006/0404/speecys03.wmv

Specifications

Height: 33 cm (12 inches)
Weight: 1.5 kg (3.3 lbs)
Degrees of freedom: 22
Devices: 2 speakers, 168 LEDs, 5 switches
Servo motor: RS301CR (Futaba Corporation)
Power supply: rechargeable lithium polymer battery 7.4V 700mA; AC power supply
CPU: RPU-50 (Futaba Corporation) - SH3 133MHz - 64MB RAM + 64MB flash memory - miniSD slot - RS485 I/F - USB - serial port - audio port
OS: SpeecysOS Rev.2.0 (NetBSD base)
Wireless LAN: IEEE802.11g (USB)

Features

Next Generation Media: We are placing the ITR System as one of the media. We hope that ITR will follow in the footsteps of radio, TV, PC and mobile phone to become the fifth major form of household media.
Motion Browser: Unlike a Web browser, ITR expresses various information by motion. ITR is a new style of browser that can communicate information to a person naturally with movements and lines, sounds.
SYGSA Library: The SYGSA library is for expressing human feelings and emotion by using its motion.

ITR System Overview

(Diagram: the ITR robot downloads contents over the Internet from Contents Servers, which cooperate with the ITR Server. The ITR Server handles user management (register user information), an RTML player with mobile phone, a memory device, a contents navigator (tendency, time, region), DRM & accounting, and a communication, mail and notification service system. The user selects contents from a mobile phone, and ITR notifies its status.)

ITR System Architecture

(Diagram: the ITR robot runs SpeecysOS with the ITR engine, a communication module and device drivers; it communicates over HTTP/SOAP + SSL with the ITR Server (web server with AXIS web services, Java, on Linux Fedora Core) and with Contents Servers (Apache web services, Java, on Linux Fedora Core); a mobile phone web browser connects to the server over HTTP + SSL.)

What's RTML?

Robot Transaction Markup Language (RTML) is a protocol and data schema for downloading robot contents via the Internet, based on XML and SOAP. A "Contents Scenario" describes its own properties and the procedures to play "Contents" in RTML format, and it contains link information to other "Contents Scenarios". "Contents" hold sound data and motion data (including servo and LED data).

(Diagram: a Contents Scenario references sound data and motion data and links to another Contents Scenario; the Contents themselves (SYGSA) hold the sound and motion data.)


RTML Format (1/3)

(Outline of the example RTML document; the XML markup did not survive extraction:)
- XML declaration and ContentsScenario root element with namespace and schema location attributes
- Contents scenario properties field: ContentsInfo with id, Title, Genre, Author, CreateDate, LastUpdateDate and Abstract
- Motion definition field: MotionDefinitionList with Motion entries, each holding a MotionRef (type application/motion) and options such as CountOption (enabled, timeout) and WaitOption (enabled, time)

RTML Format (2/3)

- Variables definition field: VariableList with Variable entries (name "correct", type int; name "incorrect", type int)
- Scenario (script) field:
  - Greeting: Play Motion (id "start") with MotionRef audio/wav "hello.wav" and MotionRef application/motion "hello.mtr"
  - Linked to English contents: Link element with namespace and id "english"
  - Question: Play Motion (id "question") with MotionRef audio/wav "question.wav" and MotionRef application/motion "question.mtr"

RTML Format (3/3)

- Repeat until break: For loop containing Play (ref id "request")
- Judge which switch of ITR was pushed: If on the input, then set a value to the variable "correct" and stop the for loop (Break), else repeat from the start of the for section (Continue)
- If the variable "correct" evaluates greater than or equal to a value: Play Motion (id "verygood") with MotionRef audio/wav and MotionRef application/motion "verygood.mtr"
- End of Scenario and ContentsScenario

Development Plan

Test phases:
Phase 0 (current): define the outline of ITR System and RTML; develop the ITR Server (prototype model) → Developer's Release
Phase 1: enhance the ITR Server; develop the Contents Server and tools for contents providers → Major Release
Phase 2: enhance the ITR Server and Contents Server; support DRM & account system → Upgrade Release
Phase 3: voice recognition, image recognition, artificial intelligence

Roadmap

Timeline 2006-2008: Phase 0 → Phase 1 → Phase 2 → Phase 3, with the Developer's Release, Major Release, Upgrade Release and final Release, and the contents library growing from 50 to 100, 200 and 300 contents.

About RSi

RSi (Robot Services initiative) was established in May, 2004. 24 groups join RSi now. RSi defines "Robot Services" as "the IT service which a robot offers through a network or physical service". RSi aims at standardization of procedures and communication protocols for robot services. Our concept is an open specification applicable to various robots. We'll publish RSi Specification V1.0 soon. V1.0 includes Basic Profile, Multimedia Profile, Motion Profile, Motion Pattern Profile, Information Service Profile.

Proof Experiment by RSi

RSi carried out a proof experiment at Kids Plaza Osaka on Mar. 16-19, 2006. Wakamaru (MHI), enon (Fujitsu), ifbot (BDL) and ApriAlpha (Toshiba) demonstrated based on RSi specifications.

http://www.robonica.net/archives/log/eid164.html

Conclusion

ITR - a new conceptual robot solution: a humanoid home entertainment robot; next-generation media alongside TV, radio, PC and mobile phone in the home; the ITR System provides ITR programs (Robot Contents) via the Internet; emotional expression that is tender to a person, with a humanoid shape, the SYGSA Library and the Motion Browser.

Conclusion

RTML = Robot Transaction Markup Language: a protocol and schema for downloading robot contents via the Internet. RTML is described in XML format and based on SOAP.

Again, thank you all for your attention today! We hope our presentation was helpful to you. And, special thanks for the cooperation of Speecys. - http://www.speecys.com - http://www.speecys.com/itr/

For the social safety and development Realtime@net http://www.sec.co.jp E-mail: [email protected] robotics/2006-04-17

OMG Robotics Task Force Robotic Services WG

April 2006 Saint Louis, MO, USA

Mission Statement

• The goal of the Robotics Services WG is : – Establish a clear definition of Functional Services in Robotic Systems – Identify and categorize services commonly used in robotic application and the technologies involved – Define standard interfaces that expose these technologies to robotic application developers – Study other existing related standards and coordinate with them – Coordinate with other groups within the OMG Robotics Task Force to keep specification consistent Roadmap

Item / Status / St Louis (April 2006) / Boston (June 2006) / Anaheim (Sept. 2006) / Washington (Dec. 2006) / SW Coast (March 2007)
Robotic Service WG: ongoing; St Louis: chartering of the WG
Localization Service: ongoing; RFP 1st draft, RFP draft and RFP revision planned over the following meetings
(User Identification) proposed service: discussion, RFP drafting, draft and revision planned over the following meetings
Other services: ??

Schedule this week

• Wednesday 16:00 – 18:00 Topic : Localization Service 1. Localization Service scope definition 2. Identification of Requirements

• Thursday 10:00 – 12:00 Topic : Localization Service 1. Localization Service scope definition 2. Identification of Requirements Election of co-chairs

• Candidates are : –Dr Chi – Lemaire

• Newly elected co-chairs : – Soo-Young Chi (ETRI) – Olivier Lemaire (JARA/AIST)

(The following presentation's slides, most likely the Profile WG mission statement and election of co-chairs, were lost in extraction; they are summarized in the RFI summary report later in this document.)

robotics/2006-04-19

OMG Robotics DTF Infrastructure WG progress report

Rick Warren (RTI) Noriaki Ando (AIST)

Mission Statement

• The purpose of the Infrastructure Working Group of the Robotics Domain Task Force is to standardize fundamental models, common facilities, and middleware to support the development and integration of a broad range of robotics applications. • This working group should collaborate with other groups within OMG.

– Common facilities • Fundamental services general to wide range of robotics applications.

2 Concerns and Prioritization

✔ Deployment & Configuration • Resource management • Event management • Data distribution • Behavior of Control Systems

3

Roadmap

• Outline/framework RFP in Boston (June) • Draft RFP in Anaheim (Sep.) • Review RFP in Washington D.C. (Dec.) • Second review RFP (Mar.) • Issue RFP (Mar.)

4 Today’s meeting (14:00-18:00)

Topic: DC (Deployment & Configuration) • Presentation – Jaesoo Lee (Seoul National University) • Is there a *part* of DC that should go in RTC? • DC RFP discussion. • Unification progress update

• RTC RFP discussion (open)

5

Selection Chairs

• Candidates – Saehwa Kim – Rick Warren – Noriaki Ando

• Newly elected co-chairs

6 KIRSF Activities, by YunKoo Chung

• KIRSF standards:
– 11 specifications were approved in Feb. 2006 in KIRSF.
– 16 new standardization projects for KIRSF standards were proposed.
– 2 KIRSF standards were adopted as KSs.
* KIRSF: Korea Intelligent Robot Standardization Forum
• URC Technology Cooperation Forum workshop held on March 9, 2006
– OMG and OMG Robotics DTF activities were introduced to 200 people from the Korean robot industry.
– Invited speakers: Dr. Hung Pham and Mr. Choy from RTI, and Fred Waskiewicz, Director of Standards of OMG
• URC project news:
– 650 URC-based home robots will be produced and distributed in apartments by KT in October 2006.

robotics/2006-04-20 Open Resource Interface for the Network / Open Robot Interface for the Network

Contact Report

ORiN Forum http://www.orin.jp/

Chair: Makoto Mizukawa Shibaura Institute of Technology

2006.4.26 Robotics DTF, OMG TM, St. Louis 1 robotics/2006-04-21

ORiNORiN Open Resource Interface for the Network /Open Robot Interface for the Network applications

ORiN Provider

FA devices/Robot controllers

2006.4.26 Robotics DTF, OMG TM, St. Louis 2 ORiN: Summary

(Diagram: the ORiN platform sits between applications (App.1, App.2 for robots and FA) and the FA devices/robot controllers: a Device Independent Interface (API) for application makers above, and an Application Independent Interface (device interface) for FA device makers below; devices include robots, machine tools, PLCs and operation panels.)

2006.4.26 Robotics DTF, OMG TM, St. Louis 3

ORiN: Summary

Before ORiN: close dependency on devices, networks and protocols for production management, process management, operation monitoring, trouble shooting, … → one-off, order-made applications, low reliability, poor maintenance.

After the ORiN platform: independence from devices, networks and protocols → standard products, high reliability, good maintenance.

2006.4.26 Robotics DTF, OMG TM, St. Louis 4

Scope

ORiN proposes

•the application program interface, •the provider interface for linking controllers and •the schema definition specification for defining robots using common formats, for realizing unified type applications for production systems containing industrial robots.

2006.4.26 Robotics DTF, OMG TM, St. Louis 5

ORiN System Configuration

Component Structure: Standard API
Standard Controller API
Data Schema
2006.4.26 Robotics DTF, OMG TM, St. Louis 6

† Framework and Interface Standards „ Application Program Interface Framework „ Robot/Device Controller Interface Framework „ Device Profiling Schema Application layer † Using ORiN Service Layer „ Distributed Object Engine RRD

„ Device Profiling using XML Provider † Providing Controller Layer „ Interoperability „ Web Service capability 2006.4.26 Robotics DTF, OMG TM, St. Louis 7

Schedules

† 2006 „ Jul 15, 16: ISO TC184/SC2 † NWIP Draft ƒ Comments → Revision
† 2007 „ Jan.: Voting „ Mar.: Voting Result ƒ Approved → Project Starts

2006.4.26 Robotics DTF, OMG TM, St. Louis 8 Open Resource Interface for the Network / Open Robot Interface for the Network

Japan Robot Association ORiN forum http://www.orin.jp/

2006.4.26 Robotics DTF, OMG TM, St. Louis 9

KeyKey TechnologiesTechnologies forfor OpenOpen RobotRobot ModelModel

•Robot Access Object (RAO): middleware that provides a standard program interface and services to the robot controller, based on the distributed object model •Robot Resource Definition Format (RRD): a data schema that provides a standard format for data from/to the robot controller, based on the eXtensible Markup Language (XML) •Robot Access Protocol (RAP): a standard protocol on the Internet using HTTP and XML to allow data exchange across firewalls
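As a sketch of how an application might use an RAO-style, device-independent interface, consider the following C++ fragment; the class and method names are invented for illustration and do not reproduce the actual ORiN API.

// Hypothetical RAO-style usage sketch (not the real ORiN interfaces).
#include <iostream>
#include <string>

class RobotController {            // stands in for an ORiN provider object
public:
    explicit RobotController(std::string address) : address_(std::move(address)) {}
    double readVariable(const std::string& name) {
        // A real provider would translate this into the vendor-specific protocol.
        std::cout << "read " << name << " from " << address_ << "\n";
        return 0.0;
    }
    void writeVariable(const std::string& name, double value) {
        std::cout << "write " << name << "=" << value << " to " << address_ << "\n";
    }
private:
    std::string address_;
};

int main() {
    RobotController ctrl("net://factory-cell-1/robot1");  // hypothetical address
    double pos = ctrl.readVariable("J1");   // same call regardless of vendor
    ctrl.writeVariable("J1", pos + 5.0);
}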

2006.4.26 Robotics DTF, OMG TM, St. Louis 10 robotics/2006-04-22 and mars/2006-04-07

RTC RFP Submission

Progress update

Background

„ Robotics Technology Component (RTC) RFP (closed Feb 06)

„ Separate proposals submitted by AIST & RTI

„ Currently working towards a single proposal that unifies the 2 concepts

„ Revised submission due in Jun 06

Main Issues

„ RTC specification consists of 3 parts

„ Core component model

„ Execution semantics

„ Introspection part

Part I: Core Component Model

„ RT core component (now LwRTC) is minimal conformance point

„ UML component definition

„ Ports

„ Lifecycle (or Activity)

„ States

„ Simple lifecycle

„ Extended by optional profiles (next slide) PartPart II:II: ExecutionExecution SemanticsSemantics

„ Execution semantics defined on top of LwRTC

„ Basic semantics are extended by different behavioral profiles

„ Stimulus-response (i.e., event-driven)

„ Data-flow (i.e., periodic)

„ Multi-rate

„ Multi-modal

Part III: Introspection

„ Introspection contained within the RTC specification

„ Defines introspective API based on SDO robotics/2006-04-23 and mars/2006-04-15

Summary Report of Robotic Systems RFI responses

Tetsuo KOTOKU Robotics Domain Task Force

RFI response presentations in St. Louis

• “Communication protocol for the URC robot and server” (Hyun-Sik Shim, Samsung Electronics) • “Fujitsu’s robotics research and standardization activities” (Toshihiko Morita, Fujitsu) • “Standardization of device interfaces for home service robot” (Ho-Chul Shin, ETRI) • “Voice interface standardization items for network robot in noisy environments” (Soon-Hyuk Hong, Samsung Electronics) • “Home robot navigation in SAIT” (Seok-Won Bang and Yeon-Ho Kim, Samsung Advanced Institute of Technology) • “ITR – Internet Renaissance ~ The world’s first humanoid robot to be harmonized with the family~” (Hiroyuki Nakamoto, Systems Engineering Consultants)

•1st Batch: 9 presentations in Burlingame (RTI, Systronix, SNU, ETRI * 2, NEC, NTT, ATR, Toshiba)

•2nd Batch: 14 presentations in Tampa (Hitachi, ADA Software, SEC, Mayekawa MFG, ETRI*3, Tsukuba Univ., AIST, Coroware, IHI, PrismTech, THALES, Toshiba)

•3rd Batch: 6 presentations in St. Louis (Samsung*2, Fujitsu, ETRI, SAIT, SEC)

Total: 29 presentations

Chartering Working Groups

3 working groups were chartered in St. Louis • Service WG • Profile WG • Infrastructure WG

Initial roadmap: 5 potential RFPs [robotics/2006-04-08] Robotics Services WG Mission Statement

• The goal of the Robotics Services WG is : – Establish a clear definition of Robotic service – Identify and categorize services commonly used in robotic application and the technologies involved – Define standard interfaces that expose these technologies to robotic application developers – Coordinate with other groups within the OMG Robotics Task Force to keep specification consistent

Robotics Services WG Election of Chairs

• Co-chairs : – Soo-Young Chi (ETRI) – Olivier Lemaire (JARA/AIST) Profile WG Mission Statement Application Programmer's View 1. Define scope and model of API 2. Define typical devices 3. Device hierarchies (like class hierarchies) 4. Define interfaces & Data structures 1. Consider standards such as JAUS 5. Device Profiles 1. Enumeration of available resources 2. Resource configuration and capabilities Physical Resource View 1. Apply relevant standards (IEEE, etc) to robotics 1. Smart sensors IEEE-1451 2. Precision networked clock IEEE-1588 3. Arrange presentations on the above at OMG meetings 1. 1451 in Anaheim? 2. 1588 in Wash DC? (near NIST) 2. I/O point tagging, provides: 1. Enumeration of available resources 2. Storage of configuration and capabilities 1. on the actual device or as close to it as possible

Profile WG Election of Chairs

• Co-chairs : – Bruce Boyes (Systronix) – Seung-Ik Lee (ETRI) Infrastructure WG Mission Statement

• The purpose of the Infrastructure Working Group of the Robotics Domain Task Force is to standardize fundamental models, common facilities, and middleware to support the development and integration of a broad range of robotics applications. • This working group should collaborate with other groups within OMG.

– Common facilities • Fundamental services general to wide range of robotics applications.

Infrastructure WG Selection Chairs

• Co-chairs: – Saehwa Kim (Seoul National Univ.) – Rick Warren (RTI) – Noriaki Ando (AIST) robotics/2006-04-24 Date: Friday, 28th April, 2006 Chair: Tetsuo Kotoku, YunKoo Chung, Hung Pham Robotics-DTF Group URL: http://robotics.omg.org/ Group email: [email protected]

¾ Highlights from this Meeting: Robotics/SDO Joint Plenary (Tue. & Wed.): – 6 RFI response presentations (Samsung Electronics (2), Fujitsu, ETRI, SAIT, SEC) – 2 Special Talks • Chris Gill (Washington U.) [robotics/06-04-09], • Jean-Christophe Baillie (ENSTA/UEI Lab) [robotics/06-04-11]

¾ Joint Meeting with MARS-PTF (Thu.): – Robotics RFI Summary Report

Date: Friday, 28th April, 2006 Chair: Tetsuo Kotoku, YunKoo Chung, Hung Pham Robotics-DTF Group URL: http://robotics.omg.org/ Group email: [email protected]

¾ Working Group activities & Reports – 3 WGs(Service, Profile, Infrastructure) were discussed – 5 Roadmaps were discussed. – Missions and co-chairs of WGs were approved. ¾ Future deliverables (In-Process): – Roadmaps from WGs. ¾Next Meeting (Boston, USA): – Roadmap discussions from WGs – RFP drafts discussion – Contact report