Implementation of data glove using 9-axis IMUs for gestures recognition and biomimetic robot hand control


Proceedings of the First National Conference on Dynamics and Control, Da Nang, 19-20 July 2019, pp. 152-159, DOI 10.15625/vap.2019000272

Le Hoang Anh, Nguyen Minh Quang, Nguyen Duy Long, Nguyen Doan Tan Anh, and Do Dang Khoa
School of Mechanical Engineering, Hanoi University of Science and Technology
School of Electrical Engineering, Hanoi University of Science and Technology
1 Dai Co Viet Road, Hanoi, Viet Nam
Email: khoa.dodang@hust.edu.vn

Abstract

Human-computer interaction has been the center of much recent scientific research due to its many potential applications. Among such interactions, hand movements have always been regarded as one of the most desirable research directions. Although the field has been in development for over three decades, research in capturing hand gestures and movements, along with its applications, is still in need of refinement. This research utilizes a model of a modular data glove with 9-axis inertial measurement units (IMUs), proposed in [1], to capture hand motions and subsequently employ the collected data in two applications: imitating human palm movement in a virtual reality (VR) environment, and applying AI techniques to interpret sign language into alphabetic letters. The successful results demonstrate great potential for using the glove as a fast, accurate, lightweight and cost-effective tool for hand motion capture.

Keywords: IMU, Hand gestures, Sign language recognition, Virtual reality simulation.

1. Introduction

A glove-based system, also known as a wired glove, is a wearable device that commonly serves as an input device for various human-computer interactions. A range of applications have been devised and implemented for such systems in previous research, including 3D tracking, robot manipulation and control, simulation, and low-level hand gesture recognition.

Using such a glove, the desired task is to record the positions, shapes and possibly movements of all the fingers as well as the palm of the wearer's hand. Different methods have been implemented for this task, including light-based sensors, computer vision and inertial measurement units (IMUs).

This research concerns two targets, both based on IMUs: hand simulation in the Robot Operating System (ROS) framework and complex hand gesture recognition. Although the concept has been around for many decades, the development of data gloves is still in its infancy. Therefore, this paper studies some useful applications of such devices.

The paper consists of five parts: a brief literature review of the research topic, the glove design and testing, the sign language application, and the biomimetic hand control.

2. Literature Review

2.1. Characteristics

Glove-based systems usually include sensors of various types on the fingers and/or the palm of the hand to register movements created by the user. The collected data are then transmitted via different methods (wired, or wireless such as Bluetooth, Wi-Fi, etc.) to a processing unit and ultimately to a computer system, where they are handled and used for different purposes according to the system's designed functions. Modern gloves can also provide haptic feedback, giving the user a "touching sensation", which allows the device to be used as an output device as well.

Because of the diversity in research and development of glove-based systems, classifying data gloves is difficult. However, data gloves can be categorized according to their applications, as in [2]. Classical applications for such systems include: Design and Manufacturing; Information Visualization; Robotics; Arts and Entertainment; Sign Language Understanding. More recent applications include: Medical and Health Care Research; Improvement of Computers' Compatibility and Portability. The research in this paper falls into Information Visualization and Sign Language Understanding.

2.2. Previous Research

The concept of data gloves and similar systems has been developed since the 1970s, with the Sayre Glove [3] considered the first of its type. It used flexible tubes with a light source at one end and a photocell at the other to detect finger bending. The glove was successfully used for multidimensional control, particularly to emulate a set of sliders. However, it was not used as a gesturing device [4]. Still, it is widely regarded as one of the hallmarks of the beginning of gesture recognition in modern computer science.

Later developments up to 2008 have been summarized by prior research [2], with the preeminent products being: Digital Entry Data Glove (AT&T Bell Telephone); VPL DataGlove (MIT, VPL Inc.); Power Glove (Mattel Intellivision); P5 Glove (Essential Reality Inc.); CyberGlove (Stanford University, Virtual Technologies); TUB-Sensor Glove (Technical University of Berlin); Humanglove (Humanware Srl). They have been applied to a wide range of applications, including Sign Language Understanding, Virtual Reality, Entertainment, Robot Control and Teleoperation, Medicine and Rehabilitation, and 3D Modelling.

Regarding virtual reality and gesture recognition applications, much previous research has been conducted on the subject.
In 1999, J. Weissmann et al. [11] successfully used a data glove to capture hand gestures. The glove provided 18 measurement values for the angles of different finger joints, and the recognition task was performed via a neural network. However, in that research, only specific hand gestures, such as "fist", "index finger" and "victory sign", were defined.

In 2004, Jeonghun Ku et al. [12] designed and validated a functional magnetic resonance imaging (fMRI) compatible data glove with a built-in vibratory stimulus device for tactile feedback during VR experiments. That research made a significant improvement by providing haptic feedback to the users, and questionnaire scores subsequently assessed the users' experience with the feedback from the glove, which was reported to be "comfortable".

In 2009, another haptic data glove for virtual reality applications was proposed in [13]. The device was innovative in using active fluids, such as magnetorheological (MR) fluids, as actuators, resulting in a smaller, lighter, more compact data glove that is easier to use and control. The research showed that the MR glove improved task completion times in grasping virtual objects and could convey stiffness information to the user.

In more recent research, specific applications in fields such as medicine and rehabilitation have also been developed.

In 2019, Gonzalez et al. [14] presented a wearable hand rehabilitation system for stroke patients based on a digital glove and keyboard games. This was achieved via hand gesture recognition along with hand model animation. The hand animation model, combined with keyboard games, enables the stroke patient under test to see their finger movements and exercise progress. However, the disadvantage of this device is that it only employs bending sensors, which makes it difficult to recognize more complex hand gestures; movements of the hands and arms are virtually undetectable for this model.

Also in 2019, Maitre et al. [15] used a data glove to recognize daily activities with the aim of aiding people suffering from Alzheimer's disease. The prototype allows monitoring the evolution of hand dysfunction in basic daily activities. The research resulted in a cost-effective and efficient model that is also reproducible by other researchers.

3. Hardware Design

As mentioned above, this research employs a previous design from [1]. This section introduces some of its features.

3.1. Design

The hardware in this work uses off-the-shelf IMUs, model GY-BNO055, which yield the accurate absolute orientation of the motion sensor. Moreover, owing to its modularity, each section can operate independently and be substituted in case of damage.

Fig. 1 Finished design of the data glove

The data glove deploys 13 9-axis IMUs on the user's palm and forearm. On each finger except the thumb, there are 2 IMUs situated on the middle phalange (MP) and proximal phalange (PP) bones to determine the motion of the metacarpophalangeal (MCP) joints and proximal interphalangeal (PIP) joints. The 12th IMU serves as the base sensor, providing the orientation and position of the hand. The last IMU is mounted on the forearm to estimate the pitch, roll and yaw angles of the wrist joint by obtaining the quaternion relative to the 12th sensor.

Data collected from the IMUs are transferred directly to the Hand Posture Processing Unit (HPPU), located on the back of the hand, which handles filtering and sending the data to a computer for further processing. Fig. 2 is a basic flowchart of the data collecting procedure.

Fig. 2 Data collecting flowchart

The relative angles are sent in quaternion format.
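The relative rotation between two adjacent IMUs can be computed from their absolute orientation quaternions. A minimal NumPy sketch follows; the single-angle extraction shown here is an illustration of the principle, not the authors' exact filtering code:

```python
import numpy as np

def quat_conj(q):
    """Conjugate (= inverse, for unit quaternions) of [w, x, y, z]."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    """Hamilton product a ⊗ b of two quaternions [w, x, y, z]."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def relative_angle(q_proximal, q_distal):
    """Rotation angle (rad) between two IMU orientations."""
    q_rel = quat_mul(quat_conj(q_proximal), q_distal)
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return 2.0 * np.arccos(w)

# Example: distal IMU rotated 90° about x relative to the proximal one
q1 = np.array([1.0, 0.0, 0.0, 0.0])                           # identity
q2 = np.array([np.cos(np.pi/4), np.sin(np.pi/4), 0.0, 0.0])   # 90° about x
print(np.degrees(relative_angle(q1, q2)))                     # ≈ 90.0
```

The same relative-quaternion operation, applied to the forearm and base (12th) IMUs, yields the wrist pitch/roll/yaw described above.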
3.2. Verification

The accuracy of the system is verified through two experiments: static and dynamic measurement.

a. Static

This test measures the accuracy of the angle returned between two IMUs when they are stationary. The angle between the two joints is set at 120°, 90°, 60°, 30° and 0°, and the measurement is repeated more than 1000 times, recorded by software and averaged.

Table 1. Data of measurement in the static scenario

  Reference angle    120°     90°     60°     30°    0°
  Determined angle   119.15°  88.85°  59.41°  29.54° 1.22°
  RMSE               0.99°    1.35°   0.74°   0.74°  1.15°

b. Dynamic

This test measures the accuracy of the angle returned between two IMUs when they are in motion. In the dynamic verification, a servo motor swings a platform connected to a protractor backwards and forwards. By controlling the motor via Pulse Width Modulation (PWM), the platform can be driven to a specific angle. Two cases, 0°-30° and 0°-60°, are shown in Fig. 3.

Fig. 3 Dynamic verification

These results indicate that the hardware used in this project meets the measurement accuracy demanded by the applications in sections 4 and 5 below.

4. Simulation in a Virtual Environment

This application of the project utilizes the Robot Operating System (ROS), one of the most popular modern open-source frameworks for robot development. According to [5], ROS makes it easier for users to control their robots even with little knowledge of mechanical engineering, electronics or embedded programming.

A tool developed for ROS, the MoveIt Motion Planning Framework, is a powerful robot control and simulation package. MoveIt provides C++ and Python interfaces as well as a GUI to control a robot in both forward and inverse kinematics [6].

To simulate a virtual hand, the entire arm is first designed and modeled in 3D software. Fig. 4 illustrates the basic steps of visualizing a robot hand in MoveIt.

Fig. 4 Design process of the virtual hand

5. Hand Signs Recognition

5.1. Overview

This application of the project is implemented using Machine Learning. The term "Machine Learning" was defined by Arthur Samuel, as quoted in [7]: "Machine learning (ML) is the scientific study of algorithms and statistical models that computer systems use to effectively perform a specific task without using explicit instructions, relying on patterns and inference instead."
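As a side note on the verification procedure of Section 3.2: the determined angle and RMSE reported in Table 1 are simple aggregate statistics over the repeated readings. A sketch of that computation with synthetic readings (real values come from the recording software):

```python
import numpy as np

def static_stats(readings, reference_deg):
    """Mean determined angle and RMSE against a reference angle (degrees)."""
    readings = np.asarray(readings, dtype=float)
    determined = readings.mean()                              # averaged reading
    rmse = np.sqrt(np.mean((readings - reference_deg) ** 2))  # root-mean-square error
    return determined, rmse

# Synthetic example: 1000 noisy readings around a 90° reference
rng = np.random.default_rng(1)
readings = 90.0 + rng.normal(0.0, 1.0, size=1000)
determined, rmse = static_stats(readings, 90.0)
print(round(determined, 2), round(rmse, 2))
```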
The task to be performed in this project is recognizing the different hand signs made by the glove wearer. In the language of Machine Learning, this is a supervised classification problem. "Supervised" here means that the possible results are already known (which hand sign corresponds to which character), and the computer performs the classification on real-time data other than what it has been trained on.

Many algorithms can complete the task described above, including Neural Networks, Decision Trees, K-nearest Neighbors, Naive Bayes, Support Vector Machines and Extreme Learning Machines. In this paper, the Support Vector Machine (SVM) is chosen for the following advantages:
- Effective in high-dimensional spaces.
- Effective in cases where the number of dimensions is greater than the number of samples.
- Memory-efficient, since the decision function uses only a subset of the training points (the support vectors).
- Versatile: different kernel functions can be specified for the decision function. Common kernels are provided, but it is also possible to specify custom kernels.

Nevertheless, the algorithm also has some drawbacks:
- If the number of features is much greater than the number of samples, avoiding over-fitting through the choice of kernel function and regularization term is crucial.
- SVMs do not directly provide probability estimates; these are calculated using an expensive five-fold cross-validation. Therefore, if confidence scores are required but do not have to be probabilities, it is advisable to set the probability parameter of the implementation to false.

The objective of the Support Vector Machine algorithm is to find a hyperplane in an N-dimensional space that distinctly classifies the data points, such that the distance from the hyperplane to the points closest to it (the margin) is maximized.

5.2. Mathematical Background

A hyperplane is formally defined as

  f(x) = ωᵀx + b                                                  (1)

where ω is the weight vector and b is the bias. The dimension of ω is equal to that of x; in our case, the dimension of x equals the number of reference points provided by the data glove. The optimal hyperplane can be represented in an infinite number of ways by scaling ω and b. By convention, among all possible representations of the hyperplane, the one chosen is

  |ωᵀx + b| = 1                                                   (2)

where x symbolizes the training examples closest to the hyperplane; these examples are called the support vectors, and this representation is known as the canonical hyperplane.

For the canonical hyperplane, the numerator is equal to one and the distance to the support vectors is

  distance = |ωᵀx + b| / ‖ω‖₂ = 1 / ‖ω‖₂                          (3)

The margin is the closest distance from the hyperplane to any point of a class:

  margin = minᵢ |ωᵀxᵢ + b| / ‖ω‖₂                                 (4)

As mentioned above, the SVM maximizes this margin. For the canonical hyperplane, the optimization problem is

  (ω, b) = argmax 1/‖ω‖₂   subject to: yᵢ(ωᵀxᵢ + b) ≥ 1, ∀i       (5)

5.3. Implementation

The basic SVM supports only binary classification, but several implementations support multiclass classification. This project employs the scikit-learn class named C-Support Vector Classification (SVC), which is based on libsvm [8]. Multiclass support is handled according to a one-vs-one scheme.

Following B. E. Boser et al. [9], SVC solves the following primal problem:

  min over ω, b, ξ of  (1/2)ωᵀω + C Σᵢ₌₁ˡ ξᵢ                      (6)
  subject to: yᵢ(ωᵀφ(xᵢ) + b) ≥ 1 − ξᵢ,  ξᵢ ≥ 0,  i = 1, …, l

where xᵢ ∈ Rⁿ, i = 1, …, l are the training vectors and y ∈ Rˡ, yᵢ ∈ {1, −1}, is the indicator vector. φ(xᵢ) maps xᵢ into a higher-dimensional space (one to infinitely many additional dimensions), which allows the binary classification of the data. C > 0 is the regularization parameter and ξᵢ is the error (slack) coefficient.

Normally, the dual of the optimization problem above is preferred, due to the possibly high dimensionality of the data set:

  min over α of  (1/2)αᵀQα − eᵀα                                  (7)
  subject to: yᵀα = 0,  0 ≤ αᵢ ≤ C,  i = 1, …, l

where e is the column vector of all ones and Q is an l-by-l positive semi-definite matrix with

  Qᵢⱼ = yᵢyⱼK(xᵢ, xⱼ),  where K(xᵢ, xⱼ) = φ(xᵢ)ᵀφ(xⱼ)

is the kernel function. After the dual problem is solved, the optimal weight vector satisfies

  ω = Σᵢ₌₁ˡ yᵢαᵢφ(xᵢ)                                             (8)

and the decision function is

  sgn(ωᵀφ(x) + b) = sgn(Σᵢ₌₁ˡ yᵢαᵢK(xᵢ, x) + b)                   (9)

The data acquired from the IMU measurements are processed as shown in Fig. 5.

Fig. 5 Block diagram for the recognition task

Two separate processes are presented in the figure. To train a prediction model for the glove, a large amount of data is first required. It is collected directly from the data glove and exported to a .csv file. This file is then used by the trainer code to produce a prediction model for the recognition task with the help of the "joblib" tool [10]. The model file has the extension ".sav".

When the data glove is used to predict hand gestures, data from the IMUs are fed directly into the predictor code, which loads the aforementioned .sav model using the same joblib tool. Finally, a prediction of which gesture the glove wearer is making is displayed on a graphical interface developed for this project, which is demonstrated in the next section.

5.4. Results

The current training accuracy attained is 98%, which for a machine learning model is a possible sign of overfitting the data set. However, after a process of re-evaluation, it was determined that this high accuracy is due to the small number of labels (under 40) acquired for the project so far. If more gestures are added to the prediction process, the training accuracy is projected to drop slightly.

Fig. 6 Hand signs recognition results. All letters and numbers are according to the Vietnamese sign language alphabet: (a) letter A, (b) number 6, (c) letter B, (d) number 4.

One can see from the results that small changes in the yaw angles are detected and the different gestures are successfully recognized, despite the difficult angles.
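The train/predict pipeline of Section 5.3 can be sketched with scikit-learn and joblib. The feature layout (one flattened vector of joint angles per sample) and the two gesture labels below are illustrative stand-ins for the real glove data:

```python
import numpy as np
from sklearn.svm import SVC
import joblib

# Illustrative training data: each row stands in for one glove sample
# (flattened joint angles); labels are gesture names.
rng = np.random.default_rng(0)
X_a = rng.normal(loc=0.0, scale=0.1, size=(50, 22))   # samples of gesture "A"
X_b = rng.normal(loc=1.0, scale=0.1, size=(50, 22))   # samples of gesture "B"
X = np.vstack([X_a, X_b])
y = np.array(["A"] * 50 + ["B"] * 50)

# C-Support Vector Classification (libsvm backend); multiclass problems
# are handled one-vs-one. probability=False avoids the costly five-fold
# cross-validation mentioned in Section 5.2.
clf = SVC(kernel="rbf", C=1.0, probability=False)
clf.fit(X, y)

# Trainer side: persist the model with joblib (the paper's ".sav" file).
joblib.dump(clf, "gesture_model.sav")

# Predictor side: load the model and classify a new sample.
model = joblib.load("gesture_model.sav")
sample = rng.normal(loc=1.0, scale=0.1, size=(1, 22))
print(model.predict(sample))
```

In the actual system, the training matrix is read from the exported .csv file and the prediction result is routed to the graphical interface instead of printed.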
6. Biomimetic Robot Hand Control

6.1. Forward Kinematics of Hand Posture

Regarding human characteristics, the hand has approximately 20 degrees of freedom (DoF). In the case of the index finger, there are 2 DoF at the metacarpophalangeal (MCP) joint, 1 DoF at the proximal interphalangeal (PIP) joint, and 1 DoF at the distal interphalangeal (DIP) joint. With this structure, each finger can be modeled as a 4-DoF kinematic chain where the palm is the base frame and the distal phalange is the end-effector frame.

Given the quaternions measured by two adjacent IMUs, the joint angles are acquired, and the position and orientation of each phalange can be computed by forward kinematics. Fig. 7 shows the frame attachment and the kinematic chain of the index finger as an example. The palm is assigned Frame 1; the proximal, middle and distal phalanges are Frames 2 to 4, respectively. l₁, l₂, l₃ denote the lengths of the proximal, middle and distal phalanges, respectively; β and θ₁ denote the abduction/adduction (ABD) and flexion/extension angles of the MCP joint, while θ₂ and θ₃ represent the flexion angles of the PIP and DIP joints.

Fig. 7 Kinematic model of a finger

Table 2. General Denavit-Hartenberg parameters of a finger

  Link ID   α      a    θ    d
  1         0      0    β    0
  2         π/2    0    θ₁   0
  3         0      l₁   θ₂   0
  4         0      l₂   θ₃   0
  5         0      l₃   0    0

6.2. Robot Hand Control Mechanism

In this section, the mechanism for operating a robotic hand is introduced. The joint angles obtained by the cyberglove are the input, while the output is the rotation angles of the servo motors (Fig. 8).

Fig. 8 Flowchart of controlling the biomimetic robot hand: relative angles from the cyberglove → input rotation of joints → control servos by determined values

The use of cables that transfer motion from the motors to the joint rotations requires mapping calculations. This is particularly difficult because the finger joints are controlled dependently on each other. Fig. 9 illustrates that 3 servo motors control 3 DoF: the flexion movement of the PIP & DIP joints, the flexion of the MCP joint, and the abduction (ABD) motion of the MCP joint.

Table 3. Specification of the driving cables of a finger

  Motor   Type    Moment      Purpose
  DC1     MG95    9.4 kg·cm   Pulls the forward string: the flexion movement of the PIP & DIP/MCP joints
  DC2     MG90s   1.8 kg·cm   Pulls the backward string: folding-angle positioning of the MCP joint, horizontal ABD motion (left)
  DC3     MG90s   1.8 kg·cm   Pulls the rear string: folding-angle positioning of the MCP joint, horizontal ABD motion (right)

Fig. 9 Driving cables of the index finger

The movement of the PIP, DIP and MCP joints does not depend on the ABD motion of the MCP joint. Therefore, the rotation angle of DC1 is determined by the bending angles of the PIP & DIP/MCP joints, while the rotation angles of DC2 and DC3 are calculated from the MCP flexion and the ABD angle.

Due to the low mechanical precision, the cables stretch during operation; at the same time, there is no need to control the fingers very accurately, so geometric mapping methods do not bring noticeably different results. Servo rotation angles and joint rotation angles are therefore mapped by linear functions. The zero point is when the finger is extended, i.e., the MCP and PIP & DIP angles are 0°. When the finger is fully bent, the maximum MCP angle is π/2 and the PIP & DIP angle is π/2.

The rotation of DC1 is determined by:

  S₁ = S₁^0 + (S₁^Mmcp − S₁^0)·θ_mcp/(π/2) + (S₁^Mpipdip − S₁^0)·θ_pipdip/(π/2)    (10)

where:
- S₁^0: servo angle at the zero point, where the MCP and PIP & DIP angles are 0;
- S₁^Mmcp: servo angle at MCP = π/2, PIP & DIP = 0;
- S₁^Mpipdip: servo angle at MCP = 0, PIP & DIP = π/2;
- θ_mcp: input value of the MCP joint angle from the glove (rad), 0 ≤ θ_mcp ≤ π/2;
- θ_pipdip: input value of the PIP & DIP joint angle from the glove (rad), 0 ≤ θ_pipdip ≤ π/2.

Thus, when assembling and adjusting for the first time, we need to specify the parameters S₁^0, S₁^Mmcp and S₁^Mpipdip in the control program. During operation, these parameters can also be adjusted to obtain the best control quality.

Regarding the mapping of the DC2 and DC3 motions to joint angles, when assembling and adjusting the robot hand for the first time, the control parameters are determined by the following steps.

Step 1: Set the finger to MCP = 0, ABD = −π/2, letting DC2 and DC3 stretch the cables completely. We can determine the two rotation values of DC2 and DC3 at this position, denoted S₂^0 and S₃^0, correspondingly.

Step 2: Set the finger to MCP = 0, ABD = π/2. We can determine the two rotation values of DC2 and DC3 at this position, S₂^Mabd and S₃^Mabd. Normally, due to the similarities of the DC2 and DC3 motors and pulley parameters, we would have S₂^Mabd − S₂^0 = S₃^Mabd − S₃^0.

Step 3: Fix the finger at MCP = π/2, ABD = 0; at this position, all values of ABD are identical. We can determine the rotation values of DC2 and DC3, S₂^Mmcp and S₃^Mmcp, respectively. Normally, due to the similarities of the DC2 and DC3 motors and pulley parameters, we would have S₂^Mmcp − S₂^0 = S₃^Mmcp − S₃^0.

Thus, we have determined all the necessary parameters to map the MCP and ABD angles to the rotation angles of DC2 and DC3:
- θ_mcp: input value of the MCP joint angle from the glove (rad), 0 ≤ θ_mcp ≤ π/2;
- θ_abd: input value of the ABD joint angle from the glove (rad), −π/2 ≤ θ_abd ≤ π/2;
- L_S = S₂^Mmcp − S₂^0 = S₃^Mmcp − S₃^0 (rad);
- d_S = S₂^Mabd − S₂^0 = S₃^Mabd − S₃^0 (rad).

The rotation angles of DC2 and DC3 are then determined by the linear mapping:

  S₂ = S₂^0 + L_S·θ_mcp/(π/2) + d_S·(θ_abd + π/2)/π
  S₃ = S₃^0 + L_S·θ_mcp/(π/2) + d_S·(π/2 − θ_abd)/π                               (11)

Fig. 10 shows the biomimetic robot hand reproducing a dexterous movement captured by the cyberglove.

Fig. 10 Robot hand control result

7. Conclusion

In continuation of the many previous research efforts aiming to apply data gloves in the fields of simulation, control and gesture recognition, and with the employment of a novel design equipped with IMUs, this research has achieved initial results in the usages mentioned above. Using the proposed cyberglove hardware design, letters in the alphabet of Vietnamese sign language have been successfully recognized, together with a
visualization of the posture of the hand displayed on a custom user interface designed for this project. Furthermore, a 17-degree-of-freedom biomimetic robot hand has also been controlled with great accuracy and stability using the same data glove.

References

[1] Nguyễn Minh Quang, Lê Hoàng Anh, Nguyễn Duy Long, Nguyễn Đoàn Tấn Anh, and Đỗ Đăng Khoa, "Design and Analysis of a Sensorized Glove with 9-axis IMU Sensors for Hand Manipulation Capture," manuscript submitted for publication, Vietnam International Conference and Exhibition on Control and Automation (VCCA), 2019.
[2] L. Dipietro, A. M. Sabatini, and P. Dario, "A Survey of Glove-Based Systems and Their Applications," IEEE Transactions on Systems, Man, and Cybernetics, Part C, 2008.
[3] T. DeFanti and D. J. Sandin, US NEA R60-34-163 Final Project Report.
[4] D. J. Sturman and D. Zeltzer, "A survey of glove-based input," IEEE Comput. Graph. Appl., vol. 14, no. 1, pp. 30-39, Jan. 1994.
[5] V. Mazzari, "ROS – Robot Operating System," [Online] robot-operating-system-2/.
[6] "MoveIt! Motion Planning Framework FAQ," [Online].
[7] J. R. Koza, F. H. Bennett, D. Andre, and M. A. Keane, "Automated Design of Both the Topology and Sizing of Analog Electrical Circuits Using Genetic Programming," Artificial Intelligence in Design '96, Springer, Dordrecht, 1996, pp. 151-170, doi:10.1007/978-94-009-0279-4_9. (The quoted definition is commonly attributed to Arthur Samuel but is not found verbatim in this publication.)
[8] Chih-Chung Chang and Chih-Jen Lin, "LIBSVM: A Library for Support Vector Machines," March 2013.
[9] B. E. Boser, I. Guyon, and V. Vapnik, "A training algorithm for optimal margin classifiers," in Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp. 144-152, ACM Press, 1992.
[10] [Online].
[11] J. Weissmann and R. Salomon, "Gesture recognition for virtual reality applications using data gloves and neural networks," in IJCNN'99 International Joint Conference on Neural Networks, Proceedings (Cat. No. 99CH36339), Washington, DC, USA, 1999, pp. 2043-2046, vol. 3.
[12] Jeonghun Ku, Richard Mraz, Nicole Baker, Konstantine K. Zakzanis, Jang Han Lee, In Y. Kim, Sun I. Kim, and Simon J. Graham, "A Data Glove with Tactile Feedback for fMRI of Virtual Reality Experiments," CyberPsychology & Behavior, 2003, pp. 497-508.
[13] J. Blake and H. B. Gurocak, "Haptic Glove With MR Brakes for Virtual Reality," IEEE/ASME Transactions on Mechatronics, vol. 14, no. 5, pp. 606-615, Oct. 2009.
[14] E. Gonzalez, "A Wearable Sensor Based Hand Movement Rehabilitation and Feedback System," OSR Journal of Student Research, vol. 5, article 62, 2019.
[15] J. Maitre, C. Rendu, K. Bouchard, B. Bouchard, and S. Gaboury, "Basic Daily Activity Recognition with a Data Glove," Procedia Computer Science, vol. 151, 2019, pp. 108-115.