Ratul Aggarwal *, Pranjal Katara**
Abstract - The Shadow bot is a basic humanoid with 19 degrees of freedom. At first sight, it appears to be just like any other humanoid robot, but the real difference emerges when we look inside its working mechanism. The Shadow bot works on the principle of a wireless human-robot interface. It takes the user's body motion as input and moves just like him/her. It is more like giving sight to the robot, and it will mimic the user's motion. Unlike other humanoid robots, the controller in this bot's architecture is the user himself. The operator can communicate directly with the computer by relying solely on body movements, and thus control the actions of the remote robot in real time with coordination and harmonization. The design combines new human motion capture methods, intelligent agent control, network transmission, and multi-sensor fusion technology. The name 'Shadow bot' is used because, just like our shadow, it moves in precisely the same fashion as we do. This paper briefly describes all the segments of the humanoid robot built around Kinect technology.
Key words – Shadow bot, humanoid, Kinect, wireless human-robot interface.
I. INTRODUCTION
The Shadow bot is a basic humanoid (as shown in figure 1) with 19 degrees of freedom, 39 cm in height, 1.65 kg in weight, and made of lightweight aluminium. At first sight, it appears to be just like any other humanoid robot, but the real difference emerges when we look inside its working mechanism. The Shadow bot works on the principle of a wireless human-robot interface. It takes the user's body motion as input and moves just like him/her. It is more like giving sight to the robot, and it will mimic the user's motion. Unlike other humanoid robots, the controller in this bot's architecture is the user himself. The operator can communicate directly with the computer by relying solely on body movements, and thus control the actions of the remote robot in real time with coordination and harmonization.
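To make the path from captured motion to actuation concrete, the sketch below shows one possible way a single joint angle could be computed from three tracked skeleton points (as a Kinect provides) and linearly mapped to a servo pulse width. The joint coordinates, the pulse-width range, and the idea of driving a servo directly from this angle are illustrative assumptions for explanation only, not the Shadow bot's published implementation.

import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by points a-b-c, each an (x, y, z) tuple."""
    ba = tuple(ai - bi for ai, bi in zip(a, b))
    bc = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(u * v for u, v in zip(ba, bc))
    norm = math.sqrt(sum(u * u for u in ba)) * math.sqrt(sum(v * v for v in bc))
    # Clamp to avoid domain errors from floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def angle_to_pulse(angle_deg, min_us=600, max_us=2400):
    """Linearly map a 0-180 degree joint angle to a servo pulse width in microseconds.
    The 600-2400 us range is an assumed servo calibration, not a measured value."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return int(min_us + (max_us - min_us) * angle_deg / 180.0)

# Example: right elbow angle from three hypothetical tracked skeleton joints (metres).
shoulder, elbow, wrist = (0.20, 1.40, 2.0), (0.45, 1.15, 2.0), (0.50, 0.90, 2.0)
elbow_angle = joint_angle(shoulder, elbow, wrist)
print(f"Elbow angle: {elbow_angle:.1f} deg -> pulse {angle_to_pulse(elbow_angle)} us")

In a full system, one such mapping would be computed per tracked joint and the resulting commands streamed wirelessly to the robot's 19 servos each frame; the details of that transmission are outside this sketch.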