Imperial Journal of Interdisciplinary Research (IJIR) Vol-2, Issue-8, 2016 ISSN: 2454-1362, http://www.onlinejournal.in

Review on Mobile Locomotion Control

Kamal Kant1, Vinay Bhatia2 & Vijender Rajora3
1,2,3Baddi University of Emerging Science and Technology, Baddi, Solan, H.P., India

Abstract: During human-robot interaction, it is often necessary for human users to command the robot's locomotion. Robots that interact with humans in everyday situations need to be able to interpret the nonverbal social cues of their human interaction partners. This paper presents a study of various robot locomotion control techniques. Robot locomotion can be controlled by several methods, each suited to a different work environment according to its working algorithm; most of these locomotion control methods are used in known environments and not in a priori unknown environments. The robot locomotion control methods studied in this paper are vision based, gesture recognition, posture recognition, social behaviour recognition, monocular vision system, hand posture recognition and wireless vision based control. All of these locomotion control methods are chosen according to the robot's work environment.

Keywords: robot locomotion, humanoid robots, monocular vision system, hand gesture recognition, wireless vision

1. Introduction

Robot technology has been implemented in many fields of our life, such as entertainment, security, rescue, rehabilitation, social life, and the military. Most robots are built for particular purposes. Some researchers use tank-model robots for disaster response and navigation in dangerous areas [3], while other researchers build robot partners to support elderly people [12]. Furthermore, some researchers use humanoid robots for dangerous areas and for rescuing humans [19]. Honda produced the "ASIMO" robot, which can serve people in their social life, and the DARPA humanoid robot was developed for military service.

Nevertheless, the humanoid biped robot is suitable for many fields: it can be applied to social life [4], rescue [3], [19], military purposes, or entertainment (soccer robots, dancing robots). Therefore, it is important to advance the development of humanoid robots. Although the cognitive abilities of humanoid robots are important, their basic abilities are also important. Furthermore, research in humanoid robotics is motivated by industrial interests and investments, especially in Japan. Among the first companies to realize humanoids as research prototypes, the most important are Honda, with the P3, and Sony, with the SDR-4X. This highlights that industrial companies are envisioning a large market for humanoid robots, realized not only as research tools for scientific and engineering purposes, but also as market products for service or entertainment applications. Today Toyota is a leading company in the field of humanoid research.

The modern anatomy of humanoid robots consists of two major subsystems: a lower body, including legs, wheels, or tracks, used to move the robot in the environment; and an upper body, comprising arms, hands and a head, used to interact with the environment as well as with human beings. Current research in humanoid robotics is characterized by different combinations of these elements, which depend on the specific research objectives or robot functions. Honda ASIMO (Japan), Waseda University WABIAN (Japan), KAIST HUBO (South Korea) and Kawada Industries HRP-2 (Japan) are examples of full-body, biped humanoid robots developed mainly for research in biped locomotion. Other humanoid platforms, such as the NASA ROBONAUT, are characterized by fully developed arms and hands for dexterous manipulation, but lack mobility capabilities [1]. It is important that mobile robot locomotion is well controlled in the environment where the robot works.

The fundamental features to design a humanoid robot are:
• Autonomous behaviour: in order to move in a complex environment such as a house, the robot should have a certain degree of autonomy even if it is controlled by the user.
• Human-machine interface: the robot should be able to satisfy some needs of people while being controlled by the users by means of a computer, a joystick or a passive mechanical interface.
• Agility: the ability to detect, avoid and overcome obstacles.
• Robustness: fast dynamic balance control.
• Appearance: a nice exterior design.
• Communication: communication abilities.
• Efficiency: great energy efficiency.

The major goal in this field is to develop the capability for robots to autonomously decide how, when, and where to move. Coordinating a large number of robot joints is difficult, and even seemingly simple tasks, such as negotiating stairs, are difficult.


In many areas of robotics research, locomotion remains a major technological obstacle, including for humanoids such as Honda's ASIMO. In this paper various robot locomotion control methods are studied, such as vision-based, wireless-based and posture-based control. It is important that robot locomotion is well controlled: if the control is not accurate, the robot cannot trace its path accurately and cannot reach its destination point. So, for better robot locomotion, its control must be accurate and efficient.

2. Related Work

R. Kurnia et al. made a helper robot that carries out tasks ordered by users through speech [2]. The robot needs a vision system to recognize the objects appearing in the orders. Conventional vision systems cannot recognize objects in complex scenes: they may find many objects and cannot determine which one is the target. The paper proposes a method of using a conversation with the user to solve this problem. The robot asks a question, the user answers it, and the user's answer efficiently reduces the number of candidate objects. The method considers the characteristics of the features used for object recognition, such as how easy they are for humans to specify by word, thus generating a user-friendly and efficient sequence of questions. Experimental results show that it is easy for the robot to detect target objects by asking the questions generated by the method. Robots need vision to carry out their tasks, but conventional vision systems cannot work in complex scenes, so the robot asks a question to the user when it cannot detect the target object. It generates a sequence of utterances that can lead to determining the object efficiently and in a user-friendly manner, deciding what and how to ask the user according to the image processing results and the characteristics of the image attributes. They obtained promising experimental results, although the current system is a small one built to examine whether or not the approach is promising.

A. Malima et al. present a fast algorithm for vision-based hand gesture recognition for robot control [5]. Hand gesture recognition is a challenging problem in its general form. They consider a fixed set of manual commands and a reasonably structured environment, and develop a simple yet effective procedure for gesture recognition, then control robot locomotion on the basis of the recognized command. The approach contains steps for segmenting the hand region, locating the fingers, and finally classifying the gesture, and the algorithm is invariant to translation, rotation, and scale of the hand. They demonstrate the effectiveness of the technique on real imagery. It is a fast and simple algorithm for the hand gesture recognition problem: from the observed images of the hand, the algorithm segments the hand region and then acts on the activity of the fingers involved in the gesture. They use a limited number of gestures to control the robot, and the algorithm can be extended in a number of ways to recognize a broader set of gestures. The segmentation portion of the algorithm is quite simple, and in challenging operating conditions it would need to be improved before this technique could be used; the segmentation problem in a general setting is an open research problem in itself. Reliable performance of hand gesture recognition techniques in a general setting requires dealing with occlusions, temporal tracking for recognizing dynamic gestures, and 3D modelling of the hand, which are still mostly beyond the current state of the art.
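A minimal sketch of this finger-counting style of recognition is given below, assuming a skin-colour threshold in HSV space and a simple radial scan around the hand centroid. The threshold values and the mapping from finger count to robot command are illustrative assumptions, not the exact parameters of [5].

```python
import cv2
import numpy as np

# Illustrative HSV skin-colour range; a real deployment must tune these values.
SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HIGH = np.array([25, 255, 255], dtype=np.uint8)

# Hypothetical mapping from counted fingers to locomotion commands.
COMMANDS = {1: "forward", 2: "backward", 3: "left", 4: "right", 5: "stop"}

def count_fingers(bgr_image):
    """Segment the hand by skin colour and count skin 'runs' on a circle
    around the hand centroid (a simplified radial-scan idea)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return 0
    cx, cy = int(xs.mean()), int(ys.mean())                     # hand centroid
    radius = int(0.7 * max(xs.max() - cx, cy - ys.min(), 1))

    # Sample the mask along a circle and count rising edges crossing it.
    angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    px = np.clip((cx + radius * np.cos(angles)).astype(int), 0, mask.shape[1] - 1)
    py = np.clip((cy + radius * np.sin(angles)).astype(int), 0, mask.shape[0] - 1)
    ring = mask[py, px] > 0
    return int(np.count_nonzero(ring[1:] & ~ring[:-1]))         # fingers (plus wrist)

def gesture_to_command(bgr_image):
    return COMMANDS.get(count_fingers(bgr_image), "stop")
```

In [5] the scan radius is derived from the farthest skin pixel and the wrist run is discounted; the sketch above keeps only the core idea.
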
X. Yin et al. presented a gesture recognition system implemented on a real humanoid service robot [6]. The system applies an RCE neural network to segment hand images. The RCE network is capable of characterizing the distribution region of skin colours in the colour space with numerous skin-colour prototype cells and their influence fields. The recognition of hand postures is based on topological features of the hand that are extracted from the binary image of the segmented hand region. The topological features of human hands are quite similar and stable, so the recognition system has the following properties, meeting all the requirements for human-service-robot interaction: robustness in dynamic and complex backgrounds; adaptability to lighting variations; rotational invariance; real-time performance; and user and device independence. Eight hand postures have been used for gesture-based programming of the service robot HARO-1, and experimental results demonstrated the effectiveness and robustness of the system. This is a new method for accurate recognition of hand postures, which extracts topological features of the hand, such as the segmented hand region, and recognizes hand postures based on the analysis of these features. This research on hand gesture recognition is part of the Hybrid Service Robot System project, in which the authors integrate various technologies, such as real robot control, virtual robot simulation and human-robot interaction, to build a multi-modal and intelligent human-robot interface. They use the human-like service robot HARO-1, which mainly consists of an active stereo vision head on a modular neck, two modular arms with active links, an omnidirectional mobile base, dextrous hands under development, and the computer system. Each modular arm has three serially connected active links with six axes.
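The RCE (Restricted Coulomb Energy) segmentation in [6] stores skin-colour prototype cells, each with an influence field, and labels a pixel as skin if it falls inside any field. The toy sketch below illustrates that idea; the training rule, colour space and radii are simplifying assumptions for illustration, not the network used for HARO-1.

```python
import numpy as np

class RCESkinClassifier:
    """Toy RCE-style classifier: skin prototype cells with influence fields."""

    def __init__(self, r_max=40.0, r_min=5.0):
        self.prototypes = []          # list of (centre_colour, radius)
        self.r_max, self.r_min = r_max, r_min

    def train(self, samples, labels):
        """samples: (N, 3) colour vectors; labels: 1 for skin, 0 for background."""
        samples, labels = np.asarray(samples, float), np.asarray(labels)
        skin, non_skin = samples[labels == 1], samples[labels == 0]
        for c in skin:
            if self._is_skin(c):
                continue              # already covered by an existing cell
            # Shrink the new cell so it excludes the nearest non-skin sample.
            d = np.linalg.norm(non_skin - c, axis=1).min() if len(non_skin) else self.r_max
            self.prototypes.append((c, float(np.clip(0.9 * d, self.r_min, self.r_max))))

    def _is_skin(self, colour):
        return any(np.linalg.norm(colour - c) <= r for c, r in self.prototypes)

    def segment(self, image):
        """Return a binary (H, W) skin mask for an (H, W, 3) colour image."""
        pixels = image.reshape(-1, 3).astype(float)
        mask = np.zeros(len(pixels), dtype=bool)
        for c, r in self.prototypes:
            mask |= np.linalg.norm(pixels - c, axis=1) <= r
        return mask.reshape(image.shape[:2])
```

A full RCE network also adjusts the radii of existing cells when conflicts occur; this sketch only shows how prototype cells and influence fields cover the skin-colour region.
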


J. Morimoto et al. propose a simple design strategy for diverse humanoid locomotion which does not require a precise model of the robot and does not need careful design of the desired gait trajectories [7]. They show that a humanoid robot can walk using simple sinusoidal desired joint trajectories whose phase is adjusted by a coupled oscillator model. They use the centre-of-pressure location and its velocity to detect the phase of the robot dynamics, and this phase information is used to modulate the desired joint trajectories. They do not explicitly use dynamical parameters of the humanoid robot, and they applied the proposed control strategy to the newly developed human-sized humanoid robot CB and to a small humanoid robot, enabling both to generate a diverse range of locomotion patterns. This indicates the generalization performance of the walking controller across platforms. By only changing the amplitudes of the sinusoidal patterns, variable walking patterns could easily be generated.
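The core of the approach in [7] is a controller oscillator whose phase is pulled toward the phase of the measured robot dynamics (estimated here from the centre of pressure), while the desired joint angles are simple sinusoids of that phase. The sketch below is a minimal illustration of this coupling; the gains, frequency, joint set and amplitudes are invented for the example and are not the values used on the CB robot.

```python
import numpy as np

# Illustrative controller parameters (not from [7]).
OMEGA = 2 * np.pi * 0.8        # nominal gait frequency [rad/s]
K_PHASE = 4.0                  # coupling gain toward the measured phase
AMPLITUDES = {"hip": 0.25, "knee": 0.40, "ankle": 0.15}   # joint amplitudes [rad]
DT = 0.005                     # control period [s]

def robot_phase_from_cop(cop_y, cop_y_dot, omega=OMEGA):
    """Estimate the phase of the lateral robot dynamics from the centre-of-pressure
    position and velocity (atan2 of the normalized oscillation state)."""
    return np.arctan2(-cop_y_dot / omega, cop_y)

def step_controller(phi_c, cop_y, cop_y_dot):
    """Advance the controller oscillator one step and return desired joint angles."""
    phi_r = robot_phase_from_cop(cop_y, cop_y_dot)
    # Phase dynamics: nominal frequency plus coupling toward the robot phase.
    phi_c += DT * (OMEGA + K_PHASE * np.sin(phi_r - phi_c))
    desired = {
        "left_hip":    AMPLITUDES["hip"]   * np.sin(phi_c),
        "right_hip":   AMPLITUDES["hip"]   * np.sin(phi_c + np.pi),
        "left_knee":   AMPLITUDES["knee"]  * max(0.0, np.sin(phi_c)),
        "right_knee":  AMPLITUDES["knee"]  * max(0.0, np.sin(phi_c + np.pi)),
        "left_ankle":  AMPLITUDES["ankle"] * np.sin(phi_c),
        "right_ankle": AMPLITUDES["ankle"] * np.sin(phi_c + np.pi),
    }
    return phi_c, desired
```

Scaling the entries of AMPLITUDES is the analogue of the amplitude changes that produced the different walking patterns reported in [7].
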
L. Hu et al. proposed a simple walking-pattern generation method for biped walking, turning and kicking which is continuous in stride length and pace time [8]. The straight-walking trajectory is planned in Cartesian coordinate space using a sine function with a slow ending and a fast central section, which makes it easy for the robot to reduce landing impact and damp body vibration. All key poses in the locomotion, selected according to human movement, are connected by third-order polynomial functions. Besides the advantage of continuity over the defined time interval, the connection function has standard coefficients with a normalized time scale; by using this kind of function, the robot can change its joint angular position and velocity continuously across the boundary conditions. The robot itself is a full-size humanoid built from self-contained modular components to perform cooperative work in general humanoid environments and robotic soccer in particular. The proposed method has been basically proven by locomotion experiments: a stable walking pattern can be achieved simply by tuning a few factors such as gait length, timing and double-support ratio, and the method provides a set of standard parameters for further optimization and learning with feedback information. By installing a force sensor on the sole, future work will focus more on landing-impact reduction, which helps to accelerate walking.
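Connecting key poses by third-order polynomials with a normalized time scale, as in [8], amounts to cubic Hermite interpolation: each segment matches position and velocity at both ends, so joint angle and angular velocity stay continuous across segment boundaries. The sketch below shows that construction under assumed boundary values; it is not the authors' exact parameterization.

```python
import numpy as np

def cubic_segment(q0, q1, v0, v1, T):
    """Return q(t) on [0, T]: a third-order polynomial with
    q(0)=q0, q(T)=q1, dq(0)=v0, dq(T)=v1 (normalized-time Hermite form)."""
    def q(t):
        s = np.clip(t / T, 0.0, 1.0)                 # normalized time in [0, 1]
        h00 = 2 * s**3 - 3 * s**2 + 1                # Hermite basis functions
        h10 = s**3 - 2 * s**2 + s
        h01 = -2 * s**3 + 3 * s**2
        h11 = s**3 - s**2
        return h00 * q0 + h10 * T * v0 + h01 * q1 + h11 * T * v1
    return q

# Example: connect three assumed knee key poses (rad) with matched velocities,
# so position and velocity are continuous at the interior key pose.
key_angles = [0.0, 0.6, 0.1]
key_vels   = [0.0, 0.8, 0.0]
durations  = [0.4, 0.4]                              # segment durations [s]

segments = [cubic_segment(key_angles[i], key_angles[i + 1],
                          key_vels[i], key_vels[i + 1], durations[i])
            for i in range(2)]

t1 = np.linspace(0, durations[0], 50)
t2 = np.linspace(0, durations[1], 50)
trajectory = np.concatenate([segments[0](t1), segments[1](t2)])
```

The key poses and velocities here are placeholders; in [8] they are selected from human movement data.
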


H. Yoneda et al. introduce vertical ladder climbing of a humanoid robot with the help of posture control, without any external sensors [9]. The humanoid robot does not have any special structure for fixing its body to the ladder; it maintains its body on the ladder with its grippers, as a human does. A problem of this locomotion is that the position of a free gripper of the climbing robot is not controllable, because the yawing of the robot body around the axis connecting a supporting gripper and a foot on the ladder is not fixed. To solve this problem, the moment around the axis of yawing caused by gravity is used to control the yaw motion of the body, so that various gaits, such as the pace gait and the trot gait, can be realized in the ladder-climbing manoeuvre. The ladder-climbing algorithm with recovery motion is experimentally verified using the "Multi-Locomotion Robot (MLR)", which was developed to achieve various types of locomotion such as biped walking, quadruped walking and brachiation. They realize several types of vertical ladder climbing with the MLR: a static gait, a pace gait with continuous velocity, and a trot gait with acceleration. The stability of vertical ladder climbing of the MLR in the static gait and pace gait is maintained with posture control that considers the moment around the axis of yawing. Even if the axis is inclined, the reference COG trajectory and acceleration are calculated to determine a motion that maintains a stable posture on the ladder. The control flow, with error recognition from the output voltage and a recovery motion, worked well, and the MLR realized continuous ladder climbing in the trot gait.

G. Zhang et al. present the design of the locomotion control system for the LOCH robot [10]; gait planning and control algorithms for uneven terrain are also considered. The LOCH robot is an adult-sized biped humanoid robot. It adopts a distributed control structure based on a CAN bus and uses a Linux operating system as the software platform. The architecture of both the hardware and the software system is introduced, and the emphasis is then put onto biped planning and control. On the basis of the inverted pendulum model, an on-line planner is designed; according to user inputs and changes in floor flatness, it generates walking gaits adaptively. To handle uneven floors, an imaginary foot approach is proposed to convert the problem into flat-floor planning. Stability control and compliant landing control are also investigated.

M. Manigandan et al. implemented wireless vision-based mobile robot control through hand gesture recognition based on perceptual colour spaces such as HSI, HSV/HSB and HSL [11]. In the field of human-computer interaction, vision-based hand gesture recognition is an important problem, since hand motions and gestures can potentially be used to interact with computers in more natural ways. The robot control is purely based on orientation histograms, a simple and fast algorithm with which the system recognizes static hand gestures. The wireless mobile robot system using hand gestures is a new, innovative user interface that resolves the complications of using numerous remote controls for various applications. Based on one unified set of hand gestures, the system interprets the user's hand gestures into predefined commands to control the remote robot. The experimental results are very encouraging, as the system produces real-time responses and highly accurate recognition of various gestures under different lighting conditions.
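Orientation-histogram recognition of static gestures, as used in [11], reduces an image to a histogram of local gradient orientations and compares it against stored command templates. Below is a minimal sketch of that idea; the bin count, the nearest-neighbour matching rule and the command names are assumptions for illustration rather than the authors' exact configuration.

```python
import cv2
import numpy as np

N_BINS = 36   # orientation histogram resolution (assumed)

def orientation_histogram(gray):
    """Histogram of gradient orientations, weighted by gradient magnitude."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = np.sqrt(gx ** 2 + gy ** 2)
    ang = np.arctan2(gy, gx)                                  # [-pi, pi]
    hist, _ = np.histogram(ang, bins=N_BINS, range=(-np.pi, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)                         # normalize for robustness

# Hypothetical templates: one representative grayscale image per command gesture.
templates = {}                                                # e.g. {"forward": hist, ...}

def register_gesture(name, gray_image):
    templates[name] = orientation_histogram(gray_image)

def recognize(gray_image):
    """Return the command whose stored histogram is closest to the input."""
    h = orientation_histogram(gray_image)
    return min(templates, key=lambda k: np.linalg.norm(templates[k] - h))
```

In the wireless setup of [11], the recognized command would then be transmitted to the remote robot's drive unit rather than executed on the vision host itself.
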
M. N. A. Wahab et al. present target distance estimation using a monocular vision system for a mobile robot [13]. A mobile robot with vision can be useful for many applications and purposes; however, the vision system needs to be robust, effective and fast to achieve its goal, and normally a stereo vision system is needed to estimate the depth of an object. In this paper, a monocular vision system is introduced to the mobile robot to give it the capability of calculating the distance or depth approximately.

Y. Chuang et al. present hand posture recognition and tracking based on Bag-of-Words for human-robot interaction [14]. Hand posture is a natural and effective means of interaction between human and robot. A monocular camera is used as the input device, and an improved Bag-of-Words (BOW) method is proposed to detect and recognize hand postures based on a new descriptor, ARPD (Appearance and Relative Position Descriptor), and a spectral embedding clustering algorithm. The standard BOW algorithm is improved by designing the ARPD descriptor, which includes spatial information, and by employing a spectral embedding algorithm to enhance the similarity between descriptors before the clustering process. The improved BOW is used to detect and recognize hand postures within a sliding-window framework, and several approaches are proposed to speed up the recognition process to meet real-time requirements. To track hand motion rapidly and accurately, they designed a new framework based on the improved BOW and the CAMSHIFT algorithm: CAMSHIFT tracks the hand motion, and a histogram-based strategy is used to re-initialize the tracking process. The experimental results showed that this method can track hand motion accurately and effectively.

A. Gaschler et al. present social behaviour recognition using body posture and head pose for human-robot interaction and robot control [15]. Robots that interact with humans in everyday situations need to be able to interpret the nonverbal social cues of their human interaction partners. They use body posture and head pose as social signals to initiate and terminate interaction when ordering drinks at a bar. They first record and analyse 108 interactions of humans interacting with a human bartender; then, based on these findings, they train a Hidden Markov Model (HMM) using automatic body posture and head pose estimation. Using this model, the bartender robot of the project can recognize typical social behaviours of human customers. The robot achieved a recognition rate of 82.9 % for all implemented social behaviours, and in particular a recognition rate of 91.2 % for bartender attention requests, which allows the robot to interact with multiple humans in a robust and socially appropriate way.
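As a hedged illustration of how an HMM can score a sequence of posture/head-pose observations, the sketch below evaluates a discretized observation sequence under per-behaviour HMMs with the forward algorithm and picks the most likely behaviour. The state and observation alphabets and all probabilities are invented for the example; they are not the model trained on the 108 bartender interactions in [15].

```python
import numpy as np

class DiscreteHMM:
    """Minimal discrete-observation HMM with forward-algorithm scoring."""

    def __init__(self, start, trans, emit):
        self.start = np.asarray(start)   # (S,)   initial state probabilities
        self.trans = np.asarray(trans)   # (S, S) state transition matrix
        self.emit = np.asarray(emit)     # (S, O) emission probabilities

    def log_likelihood(self, obs):
        alpha = self.start * self.emit[:, obs[0]]
        log_p = 0.0
        for o in obs[1:]:
            alpha = (alpha @ self.trans) * self.emit[:, o]
            scale = alpha.sum()
            log_p += np.log(scale + 1e-300)        # scale to avoid underflow
            alpha /= (scale + 1e-300)
        return log_p + np.log(alpha.sum() + 1e-300)

# Observations: 0 = facing away, 1 = body towards bar, 2 = looking at bartender.
behaviours = {
    "not_interested": DiscreteHMM([0.9, 0.1],
                                  [[0.95, 0.05], [0.30, 0.70]],
                                  [[0.70, 0.25, 0.05], [0.20, 0.50, 0.30]]),
    "attention_request": DiscreteHMM([0.3, 0.7],
                                     [[0.60, 0.40], [0.05, 0.95]],
                                     [[0.30, 0.40, 0.30], [0.05, 0.35, 0.60]]),
}

def classify(observation_sequence):
    return max(behaviours, key=lambda b: behaviours[b].log_likelihood(observation_sequence))

print(classify([0, 1, 2, 2, 2]))   # likely "attention_request"
```

In [15] the observations are continuous posture and head-pose estimates rather than a three-symbol alphabet, but the scoring principle is the same.
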


G. G. Muscolo et al. propose a conceptual design of a novel humanoid robot with vision and locomotion bio-inspired by human beings [16]. The characteristics of a humanoid robot are adaptability to the human living environment, affinity, and embodiment with multiple degrees of freedom. Developing a humanoid robot that can fully operate in the real environment is still an open challenge for robotics research. The authors present the state of the art in robotic vision and locomotion and propose conceptual designs of control systems for a novel bio-inspired humanoid robot; the principal features for developing such a robot are discussed in depth, along with some preliminary tests and data. This first step of the research focuses on the interface between the humanoid robot and a Brain-Computer Interface (BCI) system. Starting from the analysis of the state of the art related to vision and locomotion in humanoid robotics, the authors propose binocular vision and a bipedal control system as basic inputs for the conceptual design of the novel humanoid robot; future steps in the research will be oriented toward defining a symbiotic interaction between the humanoid robot and BCI systems. The conceptual design of the new humanoid robot is endowed with cameras, and it will have the internal stability control necessary to move in a complex environment. A first analysis focused on these two aspects is shown, with a description of the internal stability control and of the smooth-pursuit techniques necessary to implement the capability to follow a moving target. The simulation techniques for the integration and implementation of the system are under development; the next work will show the first experiments on the real robot, focusing on the aspects analysed here, and the integration of the robot with BCI interfaces will also be studied, integrating the current walking motion control with external inputs.

S. Piperakis et al. present a complete formulation of the challenging task of stable humanoid omnidirectional walking based on the Cart-and-Table model for approximating the robot dynamics [17]. For the control task, they propose two novel approaches: preview control augmented with the inverse system for negotiating strong disturbances and uneven terrain, and linear model-predictive control approximated by an orthonormal basis for computational efficiency, coupled with constraints for improved stability. For the generation of smooth feet trajectories, they present a new approach based on rigid-body interpolation, enhanced by adaptive step correction. Finally, they present a sensor fusion approach for sensor-based state estimation and an effective solution to the sensors' noise, delay and bias issues, as well as to errors induced by the simplified dynamics and actuation imperfections.
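The Cart-and-Table model used in [17] relates the zero-moment point (ZMP) to the centre of mass (CoM) of a cart moving on a massless table of height z_c: p = x - (z_c / g) * x_ddot. The sketch below only illustrates this relation by checking a toy CoM plan against an assumed support region; it is not the preview or model-predictive controller of [17], and all numerical values are assumptions.

```python
import numpy as np

G = 9.81      # gravity [m/s^2]
Z_C = 0.30    # assumed constant CoM ("cart") height [m]
DT = 0.01     # sampling time [s]

def zmp_from_com(com):
    """Cart-and-Table model: p = x - (z_c / g) * x_ddot, with the acceleration
    taken as the second finite difference of the planned CoM trajectory."""
    x = np.asarray(com, dtype=float)
    x_ddot = np.gradient(np.gradient(x, DT), DT)
    return x - (Z_C / G) * x_ddot

# Toy lateral CoM plan: smoothly shift the CoM sideways over one step.
t = np.arange(0.0, 1.0, DT)
com_y = 0.05 * (1 - np.cos(np.pi * t)) / 2            # 0 m -> 0.05 m shift

zmp_y = zmp_from_com(com_y)

# The plan is acceptable here if the ZMP stays inside the assumed lateral
# support bounds of the stance foot.
SUPPORT_Y = (-0.02, 0.07)
stable = np.all((zmp_y >= SUPPORT_Y[0]) & (zmp_y <= SUPPORT_Y[1]))
print("ZMP range: %.3f .. %.3f m, inside support: %s"
      % (zmp_y.min(), zmp_y.max(), stable))
```

Preview control, as used in [17], works in the opposite direction: it generates the CoM trajectory so that the resulting ZMP tracks a reference ZMP placed inside the planned footsteps.
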
A. A. Saputra et al. propose a control system for the 3-D locomotion of a humanoid biped robot based on a biological approach [18]. The muscular system in the human body and the neural oscillator for generating locomotion signals are adapted. They extend the neural locomotion system by modelling a multiple-neuron system, where motoric neurons represent the muscular system and sensor neurons represent the sensor system inside the human body. The output signals of coupled neurons, representing the joint angle level, are controlled by gain neurons that represent the energy burst for driving the joint in each motor, while the direction and step length of the robot locomotion can be adjusted by command neurons. In order to form the locomotion pattern, multi-objective evolutionary computation is applied to optimize the synapse weights between the motoric neurons. Their locomotion model consists of four types of neurons: 1) motoric neurons; 2) sensor neurons; 3) command neurons; and 4) gain neurons. The multi-objective evolutionary algorithm was used effectively to construct the locomotion pattern by manipulating the synapse weights between the motoric neurons; based on its result, the model acquired an optimal solution giving the fastest walking speed of the robot, according to its capability, with low inertial oscillation. In this locomotion model, the control of walking speed and walking direction is solved by determining the parameters of the command neuron signal transmitted, as in our brain, to the gain neuron; the output of the coupled neuron oscillator is controlled by the gain neuron, depending on the signal from the command neuron. In the real-robot experiment, various walking directions and walking speeds were realized with the proposed system. They also presented a new stability model that supports the locomotion system. An inertial sensor and ground-touch sensors were installed to acquire the values of the sensor neurons, and the synapse weights between the sensor neurons and the motoric neurons are determined dynamically using a recurrent neural network (RNN): the RNN generates a dynamic synapse weight that can change depending on the environmental conditions, providing the stabilization required to support locomotion. The proposed model is shown to support locomotion effectively, based on a Poincaré phase diagram, and its effectiveness is demonstrated in an Open Dynamics Engine computer simulation and in a real robot application with 12 degrees of freedom in the legs and four degrees of freedom in the hands. However, the approach has a high computational cost.

X. Gao et al. note that during human-robot interaction it is often necessary for human users to command the robot's locomotion [20]. They design an intuitive interaction mechanism for users to command a NAO robot's locomotion with predefined postures. Based on images taken by the robot's on-board monocular RGB camera, the robot can localize the human user's head, torso and arms, and recognize the posture displayed by the user with the k-NN algorithm. They use NAO's forehead camera to recognize several predefined postures, which are used to control the robot's locomotion; the recognition method is realized through body-part localization (head, torso and arms) and k-NN based classification. They use these postures to control the robot's locomotion during an HRI experiment with multiple feedbacks, and obtain good recognition accuracy. The interaction is triggered by face detection. Once the robot detects a human face in front of it, it greets the human user and explains how to use the predefined postures to control its locomotion. Then the robot asks the user to show a command posture, for which it takes a picture and recognizes the posture. After recognition, the robot repeats the recognized command posture and asks the user whether the recognition is correct or not. The user replies with a judge posture, for which the robot again takes a picture and recognizes the posture. If the judge posture is "yes", in other words the robot's recognition of the command posture is correct, the robot moves according to the command (move forward/backward/left/right 0.1 m). If the judge posture is "no", in other words the recognition is incorrect, the robot apologizes and asks the user to show the command posture again. After a successful movement, the robot asks the user whether it should continue moving, and the user again replies with a judge posture: if "yes", the robot asks for a command posture and repeats the above procedure; if "no", the robot thanks the user and the interaction is finished.
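A hedged sketch of the k-NN posture classification step in [20] is given below: each detected body configuration is reduced to a small feature vector (here, assumed arm and torso angles), and the label is taken by majority vote among the k nearest labelled examples. The feature definition, the value of k, the training data and the posture labels are illustrative assumptions, not the features used on NAO.

```python
import numpy as np
from collections import Counter

K = 3   # number of neighbours (assumed)

# Hypothetical training set: feature vectors derived from body-part localization,
# e.g. (left_arm_angle, right_arm_angle, torso_lean) in radians, with posture labels.
train_features = np.array([
    [1.5, 0.0, 0.0], [1.4, 0.1, 0.0],    # "left_arm_up"   -> move left
    [0.0, 1.5, 0.0], [0.1, 1.4, 0.0],    # "right_arm_up"  -> move right
    [1.5, 1.5, 0.0], [1.4, 1.5, 0.1],    # "both_arms_up"  -> move forward
    [0.0, 0.0, 0.0], [0.1, 0.0, 0.0],    # "arms_down"     -> stop
])
train_labels = ["left_arm_up", "left_arm_up", "right_arm_up", "right_arm_up",
                "both_arms_up", "both_arms_up", "arms_down", "arms_down"]

def knn_classify(feature, k=K):
    """Return the majority label among the k nearest training samples."""
    dists = np.linalg.norm(train_features - feature, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Example: a user raising the right arm.
print(knn_classify(np.array([0.05, 1.45, 0.02])))   # -> "right_arm_up"
```

The recognized label would then be mapped to one of the predefined locomotion commands (forward, backward, left, right) in the judge-posture dialogue loop described above.
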


3. Conclusion

This paper presents a review of the control of mobile robot locomotion. There are various methods for mobile robot locomotion control; most of them are used in known environments and not in a priori unknown environments. The robot locomotion control methods studied in this paper are vision based, gesture recognition, posture recognition, social behaviour recognition, monocular vision system, hand posture recognition and wireless vision based control. The gesture and posture recognition control methods support only a limited number of commands, but they can be used in a priori unknown environments. The social behaviour recognition method can be used in any type of environment, but it needs a large number of instructions, so it is very complex to use in a priori unknown environments.
Vision based control has good accuracy but is bulkier, because a large amount of data about the environment in which it is used must be stored in the control unit. The wireless vision system has the advantage that the data is stored in a control unit placed separately from the mobile robot, but this requires permanent wireless connectivity between the mobile robot and the control unit. So, for good mobile locomotion control, these control methods need to be combined: this would make mobile robot locomotion accurate, help the robot reach its destination point accurately, and allow mobile robot locomotion to be used in unknown environments. In the future, artificially intelligent robot locomotion is needed to enable robots to work in any environment or condition.

4. References

[1] R. O. Ambrose, "Robonaut: NASA's space humanoid," IEEE Intelligent Systems and their Applications, pp. 57-63, 2000.

[2] R. Kurnia, Md. A. Hossain, A. Nakamura and Y. Kuno, "Object Recognition through Human-Robot Interaction by Speech," Proceedings of the 2004 IEEE International Workshop on Robot and Human Interactive Communication, pp. 619-624, 2004.

[3] Z. Yang et al., "A combined navigation strategy by a steering wheel and a mouse for a tank rescue robot," Proc. IEEE Int. Conf. Robot. Biomim., Shenyang, China, pp. 239-244, 2004.

[4] Y. Sakagami, "The intelligent ASIMO: System overview and integration," Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 2478-2483, 2005.

[5] A. Malima, E. Özgür and M. Çetin, "A Fast Algorithm for Vision-Based Hand Gesture Recognition for Robot Control," IEEE, 2006.

[6] X. Yin and X. Zhu, "Hand Posture Recognition in Gesture-Based Human-Robot Interaction," IEEE, 2006.

[7] J. Morimoto, G. Endo, S. Hyon and G. Cheng, "A Simple Approach to Diverse Humanoid Locomotion," IEEE, pp. 596-602, 2007.

[8] L. Hu, C. Zhou, B. Wu, T. Yang and P. K. Yue, "Locomotion Planning and Implementation of Humanoid Robot Robo-Erectus Senior (RESr-1)," IEEE, pp. 526-531, 2007.

[9] H. Yoneda, K. Sekiyama, Y. Hasegawa and T. Fukuda, "Vertical Ladder Climbing Motion with Posture Control for Multi-Locomotion Robot," IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3579-3584, 2008.

[10] G. Zhang, M. Xie, H. Yang, J. Li and X. Wu, "Locomotion Control System Design for the LOCH Humanoid Robot," Mediterranean Conference on Control & Automation, Congress Palace Hotel, Marrakech, Morocco, pp. 974-979, 2010.

[11] M. Manigandan and I. M. Jackin, "Wireless Vision based Mobile Robot Control using Hand Gesture Recognition through Perceptual Color Space," International Conference on Advances in Computer Engineering, pp. 95-99, 2010.

[12] A. Yorita and N. Kubota, "Cognitive development in partner robots for information support to elderly people," IEEE Trans. Auton. Mental Develop., pp. 64-73, 2011.

[13] M. N. A. Wahab, N. Sivadev and K. Sundaraj, "Target Distance Estimation Using Monocular Vision System for Mobile Robot," IEEE Conference on Open Systems, pp. 11-15, 2011.



[14] Y. Chuang, L. Chen, G. Zhao and G. Chen, "Hand Posture Recognition and Tracking Based on Bag-of-Words for Human Robot Interaction," IEEE International Conference on Robotics and Automation, pp. 538-543, 2011.

[15] A. Gaschler, S. Jentzsch, M. Giuliani, K. Huth, J. Ruiter and A. Knoll, “Social Behaviour Recognition Using Body Posture and Head Pose for Human-Robot Interaction,” IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2128-2133, 2012.

[16] G. G. Muscolo, C. T. Recchiuto and R. Molfino, “Vision and Locomotion Control Systems on a bioinspired Humanoid Robot,” IEEE Mediterranean Electrotechnical Conference, pp. 380-385, 2014.

[17] S. Piperakis, E. Orfanoudakis and M. G. Lagoudakis, “Predictive Control for Dynamic Locomotion of Real Humanoid Robots,” IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4036-4043, 2014.

[18] A. A. Saputra and J. Botzheim, "Biologically Inspired Control System for 3-D Locomotion of a Humanoid Biped Robot," IEEE Transactions, pp. 1-14, 2015.

[19] A. Wagoner et al., "Humanoid robots rescuing humans and extinguishing fires for cooperative fire security system using HARMS," Proc. 6th Int. Conf. Autom. Robot. Appl. (ICARA), Queenstown, New Zealand, pp. 411-415, 2015.

[20] X. Gao, M. Zheng and M. Q.-H. Meng, “Humanoid Robot Locomotion Control by Posture Recognition for Human-Robot Interaction,” Proceedings of the 2015 IEEE Conference on Robotics and Biomimetics, pp. 1572-1577, 2015.
