In this paper, we describe the implementation of an interactive robotic system that combines virtual environments and human-robot interaction for the treatment of Autism Spectrum Disorders (ASDs). The system interacts with children with ASDs by teaching body language, such as hand and arm motions, facial expressions, and speech, to encourage them to engage in social interaction with other humans and to improve their motor skills. A Kinect sensor allows the therapist or child to directly control the humanoid robot Zeno, enabling dynamic interaction. The motions of Zeno and the child are recorded simultaneously by a motion capture system to assess the interaction. Specifically, we compare the child's arm and torso motions, which should closely follow those of the robot. This behavior can be used for clinical treatment and diagnosis during robot-assisted therapy. Therapists can take advantage of this interactivity to guide the robot into poses that may help children with ASDs develop both their motor skills and their social interaction skills. To compare the motion characteristics of the robot and the subjects, we use metrics such as cross-correlation and the signal 2-norm. Results show that the child's motion closely follows the robot's motion and that these metrics are reasonable indicators of the similarity between human and robot motions.
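The two similarity metrics named above can be illustrated with a minimal sketch. The function names, signal shapes, and normalization choices below are illustrative assumptions, not the paper's actual pipeline: it compares two 1-D joint trajectories (e.g. an elbow angle over time) using the peak of the normalized cross-correlation and a relative 2-norm error.

```python
import numpy as np

def similarity_metrics(robot, child):
    """Compare two 1-D motion trajectories sampled at the same rate.

    Returns (peak normalized cross-correlation, relative 2-norm error).
    Hypothetical helper for illustration; not the paper's implementation.
    """
    robot = np.asarray(robot, dtype=float)
    child = np.asarray(child, dtype=float)
    # Zero-mean copies so the cross-correlation peak is bounded by 1
    r = robot - robot.mean()
    c = child - child.mean()
    xcorr = np.correlate(r, c, mode="full") / (np.linalg.norm(r) * np.linalg.norm(c))
    peak = float(xcorr.max())
    # Signal 2-norm of the tracking error, relative to the robot trajectory
    rel_err = float(np.linalg.norm(robot - child) / np.linalg.norm(robot))
    return peak, rel_err

# Example: the child's motion is a slightly delayed, noisy copy of the robot's
t = np.linspace(0.0, 2.0 * np.pi, 200)
robot_motion = np.sin(t)
child_motion = np.sin(t - 0.2) + 0.05 * np.random.default_rng(0).normal(size=t.size)
peak, rel_err = similarity_metrics(robot_motion, child_motion)
```

A high cross-correlation peak (near 1) indicates similar motion shape even under a small time lag, while the relative 2-norm error penalizes any pointwise deviation, including that lag; using both gives complementary views of how closely the child imitates the robot.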