The Leap Motion controller is a hand-tracking device launched by Leap Motion Inc. in 2013. Human-computer interaction can be realized by recognizing hand gestures performed above the device. Compared with traditional human interface devices, the Leap Motion controller can control robots more precisely, so it is suited to scenarios with high precision requirements, such as remote surgery (still under development). This thesis describes development with the Leap Motion controller, with the aim of connecting it to a PC, performing gesture recognition, and using the recognized gestures to control a simulated KUKA LBR iiwa 14 R820 robotic arm and a Pioneer p3dx mobile robot in Vrep. Gesture-recognition programs were written in Python following the algorithm structures of SVM, k-NN, and BP neural networks. The thesis also briefly introduces the Leap Motion controller, Vrep, and the three machine learning algorithms, outlines their backgrounds and the relevant literature, and explains the theory behind each algorithm. The work carried out, the results obtained, and the accompanying discussion are also described, followed by a summary of the project and an outlook on related future work.
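As a rough illustration of the classifier comparison described above, the sketch below trains the three families of models named in the thesis (SVM, k-NN, and a BP-style neural network) on synthetic feature vectors standing in for Leap Motion hand features. The feature dimensionality, class count, and scikit-learn models are assumptions for illustration only, not the thesis's actual data or implementation.

```python
# Hypothetical sketch: comparing SVM, k-NN, and a BP (backpropagation) neural
# network on synthetic "gesture feature" vectors. Real features would come
# from the Leap Motion SDK (e.g. fingertip positions and palm orientation).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Simulated dataset: 600 samples, 15 features, 4 gesture classes (placeholders).
X, y = make_classification(n_samples=600, n_features=15, n_informative=10,
                           n_classes=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "SVM": SVC(kernel="rbf"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    # An MLP trained with backpropagation, standing in for the BP network.
    "BP network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                                random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name} test accuracy: {model.score(X_test, y_test):.2f}")
```

In practice the predicted class label would then be mapped to a robot command (e.g. a joint target for the arm or a velocity for the mobile robot) sent to Vrep over its remote API.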