WO2018170713A1 - Gesture recognition-based robot car control method and device - Google Patents

Gesture recognition-based robot car control method and device Download PDF

Info

Publication number
WO2018170713A1
Authority
WO
WIPO (PCT)
Prior art keywords
palm
gesture
determining
preset time
starting position
Prior art date
Application number
PCT/CN2017/077425
Other languages
French (fr)
Chinese (zh)
Inventor
梅杰
Original Assignee
深圳市欸阿技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市欸阿技术有限公司 filed Critical 深圳市欸阿技术有限公司
Priority to PCT/CN2017/077425 priority Critical patent/WO2018170713A1/en
Publication of WO2018170713A1 publication Critical patent/WO2018170713A1/en


Definitions

  • The invention relates to the field of image information processing and human-computer interaction control, and in particular to a method and device for controlling a robot car based on gesture recognition.
  • Image recognition of gestures can be roughly divided into static gesture recognition and dynamic gesture recognition.
  • Static gesture recognition learns a mapping from the image feature space to the hand-shape space. Specifically, a set of gestures is first predefined and a unique feature description is extracted for each gesture; the image feature space is then mapped directly to the gesture space to obtain an estimate of the gesture. In other words, image features (points, lines, angles, texture regions, and so on) are used to identify the gesture or gesture state.
  • This method requires learning and training based on data templates representing various gestures.
  • Commonly extracted gesture features include skin color features, Haar features, HOG features, shape features, and the like.
  • Dynamic gesture recognition is mainly based on motion detection algorithms: a predefined dynamic gesture is detected from the feature description of the motion region.
  • Dynamic gesture detection research is relatively mature, but once interference factors such as illumination and background are considered, the recognition rate drops sharply.
  • The main object of the present invention is to provide a car control method based on gesture recognition, which aims to solve the reduced recognition rate caused by interference from ambient light, background, and similar factors in existing dynamic gesture detection.
  • the present invention provides a car control method based on gesture recognition, the control method comprising the following steps:
  • The car's motion is controlled according to the gesture control command.
  • the determining the starting position of the palm includes:
  • the center point of the region where the palm is located within the preset time is identified as the starting position of the palm.
  • the determining the end position of the palm includes:
  • the center point of the region where the palm is located in the second preset time is identified as the end position of the palm.
  • Whether the palm is shaking is judged by determining the change in the palm's displacement from the gray value of the palm.
  • Identifying the motion trajectory of the palm from the starting position and the ending position of the palm specifically includes:
  • a coordinate system is established based on the starting position of the palm, and the motion trajectory of the palm is recognized by determining the coordinates of the starting position and the ending position.
  • The present invention further provides a gesture recognition-based car control device, which includes:
  • a judging module configured to identify a motion trajectory of the palm based on the starting position and the ending position of the palm, and determine a gesture signal of the palm according to the motion trajectory of the palm;
  • a command generating module configured to retrieve a gesture control command corresponding to the gesture signal according to the gesture signal;
  • a control module configured to control the movement of the car according to the gesture control command.
  • The determining module is specifically configured to determine the position of the palm from the palm's joint point, and to determine whether the palm's jitter range stays within a preset value for a preset time; when it does, the gesture is recognized as valid, and the center point of the region where the palm is located within the preset time is identified as the starting position of the palm.
  • The determining module is configured to delay for a first preset time after the starting position of the palm is recognized, and then to determine whether the first jitter range of the palm stays within a first preset value during a second preset time; when it does, the center point of the region where the palm is located during the second preset time is identified as the end position of the palm.
  • Whether the palm is shaking is judged by determining the change in the palm's displacement from the gray value of the palm.
  • the determining module is specifically configured to establish a coordinate system based on a starting position of the palm, and identify a motion trajectory of the palm by determining coordinates of the starting position and the ending position.
  • The invention recognizes the start and end positions of the palm, determines the palm's gesture signal from them, retrieves the gesture control command corresponding to that signal, and finally controls the movement of the car according to the command. Palm gestures are thus recognized simply and accurately to control the car's movement, solving the reduced recognition rate caused by interference from ambient light, background, and similar factors in existing dynamic gesture detection.
  • By judging the jitter range based on the gray value, continuous gesture recognition can be realized within a certain distance range.
  • FIG. 1 is a schematic flow chart of a first embodiment of a method for controlling a car based on gesture recognition according to the present invention
  • FIG. 2 is a schematic diagram of functional modules of a first embodiment of a vehicle control device based on gesture recognition according to the present invention.
  • FIG. 1 is a flow chart showing a method of controlling a car based on gesture recognition according to a first embodiment of the present invention.
  • the gesture recognition based car control method of the embodiment of the present invention includes the following steps:
  • Step S10: determine the starting position and the ending position of the palm, identify the motion trajectory of the palm based on these positions, and determine the gesture signal of the palm according to the motion trajectory.
  • The present invention recognizes gestures with an image sensor such as a Kinect. Unlike conventional methods that analyze the palm image itself for gesture recognition, the present invention only needs to recognize the start and end positions of the palm; the gesture direction signal can be determined from these two positions.
  • Specifically, the starting point of the palm is recognized from the palm image, a plane rectangular coordinate system is established with that starting point as the origin, and the position of the end point within this coordinate system is determined. Combined with the origin, this yields the current motion trajectory of the palm, from which the basic gesture is derived: the palm moving left, right, up, or down, or clenching into a fist.
  • Step S20 Acquire a gesture control command corresponding to the gesture signal based on the gesture signal.
  • The car controller retrieves the gesture control command corresponding to the gesture signal from memory. For example, five kinds of gesture signals are acquired: the palm moving left, right, up, or down, or clenching into a fist. These correspond to five car motions: forward, backward, left turn, right turn, and stop. The mapping is easily obtained with a lookup table in memory.
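The lookup-table step above can be sketched as follows. This is an illustrative sketch only: the exact pairing of gestures to motions is an assumption, since the text lists the two sets of five without fixing the correspondence, and the signal names are hypothetical.

```python
# Hypothetical gesture-signal -> command lookup table; the pairing is an
# assumption, not taken from the patent.
GESTURE_TO_COMMAND = {
    "palm_up": "forward",
    "palm_down": "backward",
    "palm_left": "turn_left",
    "palm_right": "turn_right",
    "fist": "stop",
}

def retrieve_command(gesture_signal):
    """Look up the control command for a gesture signal; stop when unknown."""
    return GESTURE_TO_COMMAND.get(gesture_signal, "stop")
```

Defaulting to `stop` for unrecognized signals is a safety choice of this sketch, not something the patent specifies.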
  • Step S30: the car's motion is controlled according to the gesture control command.
  • After the car controller obtains the gesture control command based on the gesture signal, it controls the car to perform the corresponding action according to that command.
  • Specifically, the two wheels of the car are each driven by their own motor; by controlling the direction of rotation of the two motors, the car can be made to perform the corresponding action.
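A minimal sketch of that two-motor differential drive, assuming a +1/-1/0 rotation-direction convention and spin-in-place turning; both conventions are assumptions, as the patent only says the two motor directions produce the corresponding actions.

```python
# Hypothetical differential-drive mapping: one motor per wheel.
def motor_directions(command):
    """Return (left_motor, right_motor): +1 forward, -1 reverse, 0 stopped."""
    table = {
        "forward": (1, 1),
        "backward": (-1, -1),
        "turn_left": (-1, 1),   # left wheel back, right wheel forward
        "turn_right": (1, -1),  # right wheel back, left wheel forward
        "stop": (0, 0),
    }
    return table[command]
```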
  • The gesture command thus controls the movement of the car: palm gestures are recognized simply and accurately, solving the reduced recognition rate caused by interference from ambient lighting, background, and similar factors in existing dynamic gesture detection.
  • By judging the jitter range based on the gray value, the present invention can realize continuous gesture recognition within 2 to 20 meters.
  • the second embodiment of the gesture recognition-based car control method is based on the first embodiment of the above-described gesture recognition-based car control method of the present invention.
  • Determining the starting position of the palm specifically includes: determining the palm position from the palm joint point, and determining whether the palm's jitter range stays within a preset value for a preset time; when it does, the gesture is recognized as valid, and the center point of the palm's jitter range is identified as the starting position of the palm.
  • Determining the position of the palm in this embodiment specifically means: identifying the joint point of the palm (which can be done with a ready-made library function) and determining the palm position (a single point) from that joint point.
  • Before the starting position is recognized, a validity check, i.e. de-jitter processing, is performed: it is determined whether the palm's jitter range stays within a preset value for a preset time; if it does, the gesture is recognized as valid.
  • For example, it is judged whether the jitter range stays within a preset value of 5 cm during a preset time of 0.5 seconds; if so, the gesture is recognized as valid, and the center point of the region where the palm is located within the preset time is identified as the palm's starting position.
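The de-jitter check above can be sketched as follows, assuming palm positions are sampled as (x, y) points in centimeters over the preset time; the sample format and per-axis range check are assumptions of this sketch.

```python
# Sketch of the start-position de-jitter check: jitter over the preset time
# (e.g. 0.5 s) must stay within the preset value (e.g. 5 cm).
def detect_start_position(samples, preset_value_cm=5.0):
    """samples: list of (x_cm, y_cm) palm positions over the preset time.
    Returns the centre point of the region, or None if jitter is too large."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    if max(xs) - min(xs) <= preset_value_cm and max(ys) - min(ys) <= preset_value_cm:
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    return None  # palm moved more than the preset value: not a valid gesture
```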
  • The third embodiment of the gesture recognition-based car control method is based on the first embodiment of the above-described gesture recognition-based car control method of the present invention.
  • Determining the end position of the palm includes: after the initial position of the palm is recognized, delaying for a first preset time; after the first preset time has elapsed, determining whether the first jitter range of the palm stays within a first preset value during a second preset time; when it does, the center point of the region where the palm is located during the second preset time is identified as the end position of the palm.
  • The third embodiment differs from the second in that after the initial position of the palm is recognized, a first preset time must elapse, during which no recognition processing is performed. After the first preset time has elapsed, it is determined whether the first jitter range of the palm stays within the first preset value during a second preset time; if so, it is recognized as a valid gesture. For example, after the initial position of the palm is recognized, a first preset time of 0.5 seconds is allowed to elapse; after those 0.5 seconds, it is judged whether the palm's first jitter range stays within the first preset value of 5 cm during the second preset time of 0.5 seconds. If so, the center point of the region where the palm is located during that second preset time is identified as the end position of the palm.
  • The purpose of adding the delay of the first preset time is that, once the start position of the palm is recognized, the subsequent gesture is still in continuous motion, so whether the gesture has ended cannot be determined immediately. A suitable delay, such as 0.5 seconds, is selected according to the gesture speed of different users; after the delay, the palm's jitter can be judged so that the end position is determined accurately.
  • The fourth embodiment of the gesture recognition-based car control method is based on the second or third embodiment of the above-described gesture recognition-based car control method of the present invention.
  • Whether the palm is shaking is judged by determining the change in the palm's displacement from the gray value of the palm.
  • Depth information of objects can be acquired with the Kinect infrared sensor. Once the user's preset motion allows the region where the palm is located to be recognized, the gray value parameter of the image in that region, i.e. the depth information, can be further recognized; this gives the distance between the palm and the Kinect sensor. From the depth information of the palm position, the approximate distance to the Kinect sensor can be placed into intervals, such as 2-3 meters, 3-4 meters, ..., 19-20 meters; for each distance interval, the on-screen image displacement corresponding to a preset real palm movement, such as 5 cm, can be converted.
  • The preset gray value parameter, i.e. the depth information parameter, allows the depth image of the gesture to be separated from the depth image of the user via the depth information of the region where the user's palm is located. Combining the depth information with the palm's jitter range on screen, the real jitter range of the palm can be identified. For example, after the palm position (a single point) is determined from the palm joint, the palm is judged from its depth information to be 10 meters from the lens. If the palm's on-screen displacement stays within 0.1 cm over 0.5 s, the actual palm jitter can be judged to be within 5 cm; since the actual jitter is within the preset value, it is recognized as a valid gesture.
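The depth-based conversion can be sketched as follows. A linear 1/depth scaling anchored at the worked example (0.1 cm on screen corresponds to 5 cm of real motion at 10 m) is an assumption of this sketch; the patent only describes converting thresholds per distance interval.

```python
# Sketch of the gray-value (depth) conversion: the same 5 cm of real palm
# motion appears as a smaller on-screen displacement at larger depths.
def screen_threshold_cm(depth_m, ref_depth_m=10.0, ref_screen_cm=0.1):
    """On-screen displacement corresponding to 5 cm of real palm motion,
    assuming displacement scales as 1/depth (pinhole-style assumption)."""
    return ref_screen_cm * ref_depth_m / depth_m

def is_valid_jitter(screen_disp_cm, depth_m):
    """True if the on-screen jitter implies real palm motion within 5 cm."""
    return screen_disp_cm <= screen_threshold_cm(depth_m)
```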
  • the fifth embodiment of the gesture recognition based car control method is based on the first embodiment of the above-described gesture recognition based car control method according to the present invention.
  • Recognizing the palm's motion trajectory based on the starting position and the ending position of the palm specifically includes: establishing a coordinate system based on the starting position of the palm, and identifying the motion trajectory by determining the coordinates of the starting and ending positions.
  • Specifically, a coordinate system is established with the positive y-axis pointing downward and the positive x-axis pointing to the right, and the differences between the abscissa and ordinate of the end point and those of the starting point are calculated (end point minus starting point). The palm's trajectory is recognized from these differences: if the ordinate difference is positive and its absolute value is greater than the absolute value of the abscissa difference, the motion is recognized as the palm moving up;
  • if the ordinate difference is negative and its absolute value is greater than the absolute value of the abscissa difference, it is recognized as the palm moving down;
  • if the abscissa difference is positive and its absolute value is greater than the absolute value of the ordinate difference, it is recognized as the palm moving to the right;
  • if the abscissa difference is negative and its absolute value is greater than the absolute value of the ordinate difference, it is recognized as the palm moving to the left.
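The four rules above transcribe directly into a classifier. This sketch follows the text as written, including its convention that a positive ordinate difference (with y stated as positive downward) is recognized as the palm moving up.

```python
# Direct transcription of the coordinate-difference rules in the text.
def classify_trajectory(start, end):
    """start, end: (x, y) coordinates. Returns the recognized palm motion."""
    dx = end[0] - start[0]  # abscissa difference (end minus start)
    dy = end[1] - start[1]  # ordinate difference (end minus start)
    if abs(dy) > abs(dx):
        return "up" if dy > 0 else "down"
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "ambiguous"  # equal magnitudes are not covered by the rules
```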
  • The palm's motion trajectory is recognized by determining the coordinates of the starting and ending positions. For more accurate identification, after the starting position of the palm is determined, a first preset time of 0.5 s must elapse before the end position is recognized.
  • If the gesture is not yet finished when the first preset time ends, the palm's jitter range will not be within the first preset value, so the end position cannot yet be detected; it is detected only during the second preset time after the first. If, however, the palm's jitter range is already within the first preset value the moment the first preset time ends, the end position is recognized immediately, which identifies a clenching action: whether the palm gesture is a fist is determined by whether the end position is recognized immediately at the end of the first preset time.
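That fist test reduces to a single check. In this sketch, `jitter_ok_at` is a hypothetical helper reporting whether the palm jitter, measured over the second preset window starting at time t, stays within the first preset value.

```python
# Minimal sketch of the clench/fist decision described above.
def gesture_kind(jitter_ok_at, first_delay_end):
    """'fist' if the end position is recognizable the instant the first
    preset time ends (the palm never moved), otherwise 'directional'."""
    return "fist" if jitter_ok_at(first_delay_end) else "directional"
```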
  • The invention also provides a car control device based on gesture recognition.
  • FIG. 2 is a schematic diagram of functional modules of a first embodiment of a vehicle control device based on gesture recognition according to the present invention.
  • the gesture recognition based car control device includes:
  • determining module 1, configured to determine the starting position and the ending position of the palm;
  • judging module 2, configured to identify the motion trajectory of the palm based on the starting position and the ending position of the palm, and to determine the gesture signal of the palm according to the motion trajectory;
  • the command generating module 3 is configured to retrieve a gesture control command corresponding to the gesture signal according to the gesture signal;
  • control module 4, configured to control the movement of the car according to the gesture control command.
  • The present invention recognizes gestures with an image sensor such as a Kinect. Unlike conventional methods that analyze the palm image itself for gesture recognition, the present invention only needs to recognize the start and end positions of the palm; the gesture direction signal can be determined from these two positions.
  • Specifically, the starting point of the palm is recognized from the palm image, a plane rectangular coordinate system is established with that starting point as the origin, and the position of the end point within this coordinate system is determined. Combined with the origin, this yields the current motion trajectory of the palm, from which the basic gesture is derived: the palm moving left, right, up, or down, or clenching into a fist.
  • The car controller retrieves the gesture control command corresponding to the gesture signal from memory. For example, five kinds of gesture signals are acquired: the palm moving left, right, up, or down, or clenching into a fist. These correspond to five car motions: forward, backward, left turn, right turn, and stop. The mapping is easily obtained with a lookup table in memory.
  • After the car controller obtains the gesture control command based on the gesture signal, it controls the car to perform the corresponding action according to that command.
  • Specifically, the two wheels of the car are each driven by their own motor; by controlling the direction of rotation of the two motors, the car can be made to perform the corresponding action.
  • The gesture command thus controls the movement of the car: palm gestures are recognized simply and accurately, solving the reduced recognition rate caused by interference from ambient lighting, background, and similar factors in existing dynamic gesture detection.
  • By judging the jitter range based on the gray value, the present invention can realize continuous gesture recognition within 2 to 20 meters.
  • the second embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention.
  • The determining module is specifically configured to determine the position of the palm from the palm joint point, and to determine whether the palm's jitter range stays within a preset value for a preset time; when it does, the gesture is recognized as valid, and the center point of the region where the palm is located within the preset time is identified as the starting position of the palm.
  • The determining module 1 identifies the joint point of the palm (which can be done with a ready-made library function) and determines the palm position (a single point) from that joint point. Before the starting position of the palm is recognized, a validity check, i.e. de-jitter processing, is performed: it is determined whether the palm's jitter range stays within a preset value for a preset time; if it does, the gesture is recognized as valid.
  • For example, it is judged whether the jitter range stays within a preset value of 5 cm during a preset time of 0.5 seconds; if so, the gesture is recognized as valid, and the center point of the region where the palm is located within the preset time is identified as the palm's starting position.
  • the third embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention.
  • The determining module is specifically configured to delay for a first preset time after the initial position of the palm is recognized, and then to determine whether the first jitter range of the palm stays within the first preset value during a second preset time; when it does, the center point of the region where the palm is located during the second preset time is identified as the end position of the palm.
  • The third embodiment differs from the second in that after the initial position of the palm is recognized, a first preset time must elapse, during which no recognition processing is performed. After the first preset time has elapsed, it is determined whether the first jitter range of the palm stays within the first preset value during a second preset time; if so, it is recognized as a valid gesture. For example, after the initial position of the palm is recognized, a first preset time of 0.5 seconds is allowed to elapse; after those 0.5 seconds, it is judged whether the palm's first jitter range stays within the first preset value of 5 cm during the second preset time of 0.5 seconds. If so, the center point of the region where the palm is located during that second preset time is identified as the end position of the palm.
  • The purpose of adding the delay of the first preset time is that, once the start position of the palm is recognized, the subsequent gesture is still in continuous motion, so whether the gesture has ended cannot be determined immediately. A suitable delay, such as 0.5 seconds, is selected according to the gesture speed of different users; after the delay, the palm's jitter can be judged so that the end position is determined accurately.
  • the fourth embodiment of the gesture recognition-based car control device is based on the second or third embodiment of the above-described gesture recognition-based car control device of the present invention.
  • Whether the palm is shaking is judged by determining the change in the palm's displacement from the gray value of the palm.
  • Depth information of objects can be acquired with the Kinect infrared sensor. Once the user's preset motion allows the region where the palm is located to be recognized, the gray value parameter of the image in that region, i.e. the depth information, can be further recognized; this gives the distance between the palm and the Kinect sensor. From the depth information of the palm position, the approximate distance to the Kinect sensor can be placed into intervals, such as 2-3 meters, 3-4 meters, ..., 19-20 meters; for each distance interval, the on-screen image displacement corresponding to a preset real palm movement, such as 5 cm, can be converted.
  • The preset gray value parameter, i.e. the depth information parameter, allows the depth image of the gesture to be separated from the depth image of the user via the depth information of the region where the user's palm is located. Combining the depth information with the palm's jitter range on screen, the real jitter range of the palm can be identified. For example, after the palm position (a single point) is determined from the palm joint, the palm is judged from its depth information to be 10 meters from the lens. If the palm's on-screen displacement stays within 0.1 cm over 0.5 s, the actual palm jitter can be judged to be within 5 cm; since the actual jitter is within the preset value, it is recognized as a valid gesture.
  • the fifth embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention.
  • Recognizing the palm's motion trajectory based on the starting position and the ending position of the palm specifically includes: establishing a coordinate system based on the starting position of the palm, and identifying the motion trajectory by determining the coordinates of the starting and ending positions.
  • Specifically, a coordinate system is established with the positive y-axis pointing downward and the positive x-axis pointing to the right, and the differences between the abscissa and ordinate of the end point and those of the starting point are calculated (end point minus starting point). The palm's trajectory is recognized from these differences: if the ordinate difference is positive and its absolute value is greater than the absolute value of the abscissa difference, the motion is recognized as the palm moving up;
  • if the ordinate difference is negative and its absolute value is greater than the absolute value of the abscissa difference, it is recognized as the palm moving down;
  • if the abscissa difference is positive and its absolute value is greater than the absolute value of the ordinate difference, it is recognized as the palm moving to the right;
  • if the abscissa difference is negative and its absolute value is greater than the absolute value of the ordinate difference, it is recognized as the palm moving to the left.
  • The palm's motion trajectory is recognized by determining the coordinates of the starting and ending positions. For more accurate identification, after the starting position of the palm is determined, a first preset time of 0.5 s must elapse before the end position is recognized.
  • If the gesture is not yet finished when the first preset time ends, the palm's jitter range will not be within the first preset value, so the end position cannot yet be detected; it is detected only during the second preset time after the first. If, however, the palm's jitter range is already within the first preset value the moment the first preset time ends, the end position is recognized immediately, which identifies a clenching action: whether the palm gesture is a fist is determined by whether the end position is recognized immediately at the end of the first preset time.
  • Each module of the device can be specifically arranged as follows:
  • The car control system detects image information of the user's palm through the Kinect sensor and can further detect the image gray value information at the palm's position, from which the starting and ending positions of the palm are recognized. The determining module 1 is therefore placed in the Kinect sensor. After the Kinect sensor recognizes the starting and ending positions of the palm, it sends the position information to a terminal device such as a PC; the Kinect sensor can be connected to the terminal device by wire, and the terminal device judges the palm gesture from the starting and ending positions.
  • In this case the judging module 2 is placed in the terminal device; alternatively, the Kinect sensor can analyze the starting and ending positions of the palm, determine the gesture signal directly, and send it to the terminal device, in which case the judging module 2 is also placed in the Kinect sensor.
  • The terminal device retrieves the gesture control command corresponding to the gesture signal, so the command generating module 3 is placed in the terminal device. Finally, the terminal device sends the gesture control command to the controller on the car by wireless transmission, such as Bluetooth, and the car controller controls the car's operation according to the command; the control module 4 is provided on the car.
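The module placement above amounts to a simple pipeline: module 1 on the Kinect side, modules 2 and 3 on the wired PC terminal, and module 4 on the car controller reached over Bluetooth. The class and function names in this sketch are illustrative assumptions, not from the patent.

```python
# Hypothetical wiring of the four modules across the devices.
class CarController:
    """Module 4: executes the motion command received over the wireless link."""
    def __init__(self):
        self.last_command = None

    def execute(self, command):
        self.last_command = command  # would set the two motor directions here

def run_pipeline(detect_positions, judge, to_command, car):
    start, end = detect_positions()   # module 1 (Kinect sensor, wired to PC)
    signal = judge(start, end)        # module 2 (terminal device)
    command = to_command(signal)      # module 3 (lookup table in memory)
    car.execute(command)              # module 4 (car controller, via Bluetooth)
    return command
```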

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

A gesture recognition-based robot car control method, comprising: determining a starting position and an ending position of a palm, recognizing the motion trajectory of the palm on the basis of the starting position and the ending position of the palm, and determining a gesture signal of the palm according to the motion trajectory of the palm (S10); invoking a gesture control command corresponding to the gesture signal on the basis of the gesture signal (S20); and controlling the motion of a robot car according to the gesture control command (S30). According to the method, a gesture of the palm is recognized simply and accurately to control the motion of a robot car, and the problem of low recognition rate easily caused by interference of factors such as illumination and background of the environment during existing dynamic gesture detection is resolved.

Description

Gesture recognition-based robot car control method and device
Technical Field
The present invention relates to the field of image information processing and control in human-computer interaction, and in particular to a gesture recognition-based robot car control method and device.
Background Art
Image-based gesture recognition techniques can be roughly divided into static gesture recognition and dynamic gesture recognition. Static gesture recognition learns a mapping from the image feature space to the hand shape space. Specifically, a set of gestures is first predefined and a unique feature description is extracted for each gesture; the image feature space is then mapped directly to the gesture space to obtain an estimate of the gesture, i.e., gestures or gesture states are identified from image features (points, lines, angles, texture regions, etc.). This method requires learning and training on data templates representing the various gestures. Commonly extracted gesture features include skin color features, Haar features, HOG features, and shape features. Dynamic gesture recognition is mainly based on motion detection algorithms, detecting predefined dynamic gestures from feature descriptors of the motion region. Dynamic gesture detection research is now relatively mature, but interference factors such as illumination and background greatly reduce the recognition rate.
The above content is provided only to aid understanding of the technical solutions of the present invention and does not constitute an admission that it is prior art.
Summary of the Invention
The main object of the present invention is to provide a gesture recognition-based robot car control method, aiming to solve the problem of reduced recognition rate caused by interference from environmental factors such as illumination and background in existing dynamic gesture detection.
To achieve the above object, the present invention provides a gesture recognition-based robot car control method comprising the following steps:
determining a starting position and an ending position of the palm, identifying the motion trajectory of the palm based on the starting position and the ending position, and determining a gesture signal of the palm according to the motion trajectory;
retrieving a gesture control command corresponding to the gesture signal based on the gesture signal; and
controlling the motion of the robot car according to the gesture control command.
Preferably, determining the starting position of the palm specifically comprises:
determining the palm position according to the palm joint points, and determining within a preset time whether the jitter range of the palm is within a preset value; and
when the jitter range of the palm is within the preset value, recognizing the gesture as valid and identifying the center point of the region occupied by the palm during the preset time as the starting position of the palm.
Preferably, determining the ending position of the palm specifically comprises:
after the starting position of the palm is identified, delaying for a first preset time;
after the first preset time has elapsed, determining within a second preset time whether a first jitter range of the palm is within a first preset value; and
when the first jitter range of the palm is within the first preset value, identifying the center point of the region occupied by the palm during the second preset time as the ending position of the palm.
Preferably, the jitter of the palm is determined by judging the change in palm displacement according to the gray value of the palm.
Preferably, identifying the motion trajectory of the palm based on the starting position and the ending position specifically comprises:
establishing a coordinate system based on the starting position of the palm, and identifying the motion trajectory of the palm by judging the coordinates of the starting position and the ending position.
To achieve the above object, the present invention further provides a gesture recognition-based robot car control device, comprising:
a determining module for determining a starting position and an ending position of the palm;
a judging module for identifying the motion trajectory of the palm based on the starting position and the ending position, and determining a gesture signal of the palm according to the motion trajectory;
a command generation module for retrieving a gesture control command corresponding to the gesture signal according to the gesture signal; and
a control module for controlling the motion of the robot car according to the gesture control command.
Preferably, the determining module is specifically configured to determine the palm position according to the palm joint points and to determine, within a preset time, whether the jitter range of the palm is within a preset value; when the jitter range of the palm is within the preset value, the gesture is recognized as valid and the center point of the region occupied by the palm during the preset time is identified as the starting position of the palm.
Preferably, the determining module is specifically configured to delay for a first preset time after the starting position of the palm is identified; after the first preset time has elapsed, to determine within a second preset time whether a first jitter range of the palm is within a first preset value; and, when the first jitter range of the palm is within the first preset value, to identify the center point of the region occupied by the palm during the second preset time as the ending position of the palm.
Preferably, the jitter of the palm is determined by judging the change in palm displacement according to the gray value of the palm.
Preferably, the judging module is specifically configured to establish a coordinate system based on the starting position of the palm and to identify the motion trajectory of the palm by judging the coordinates of the starting position and the ending position.
By identifying the starting position and the ending position of the palm, determining the gesture signal of the palm from these positions, retrieving the gesture control command corresponding to the gesture signal based on the gesture signal, and finally controlling the motion of the robot car according to the gesture command, the present invention recognizes palm gestures in a simple and accurate manner to control the motion of the car, solving the problem of reduced recognition rate caused by interference from environmental factors such as illumination and background in existing dynamic gesture detection. By judging the jitter range based on gray values, the invention enables continuous gesture recognition within a certain distance range.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a first embodiment of the gesture recognition-based robot car control method of the present invention;
FIG. 2 is a schematic diagram of the functional modules of a first embodiment of the gesture recognition-based robot car control device of the present invention.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
The gesture recognition-based robot car control method and device according to embodiments of the present invention are described below with reference to the accompanying drawings.
First, the gesture recognition-based robot car control method proposed by an embodiment of the present invention is described. FIG. 1 is a schematic flowchart of a gesture recognition-based robot car control method according to a first embodiment of the present invention. As shown in FIG. 1, the method of this embodiment comprises the following steps:
Step S10: determining a starting position and an ending position of the palm, identifying the motion trajectory of the palm based on the starting position and the ending position, and determining a gesture signal of the palm according to the motion trajectory.
When the solution adopted by the present invention recognizes a gesture through an image sensor such as a Kinect, unlike conventional methods that analyze full palm images for gesture recognition, the invention only needs to identify the starting position and the ending position of the palm, from which the gesture direction signal of the palm can be determined.
Specifically, during gesture recognition, the starting point of the palm, i.e., the starting position, is first identified from the palm image. A planar rectangular coordinate system is established with the starting point as the origin. The ending position is determined by analyzing where the ending point falls in the coordinate system; combined with the origin, the current motion trajectory of the palm can be determined, from which the basic gesture of the palm is derived, including palm movement to the left, right, up, or down, or a fist-clenching action.
Step S20: retrieving a gesture control command corresponding to the gesture signal based on the gesture signal.
After the gesture is recognized by the Kinect sensor, the car controller retrieves the gesture control command corresponding to the gesture signal from memory. For example, five gesture signals — palm moving left, right, up, or down, and a fist clench — correspond respectively to the five car actions of forward, backward, turn left, turn right, and stop; the mapping can simply be obtained through a lookup table in memory.
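The memory lookup described above can be sketched as a simple table. The five gesture-to-action pairs follow the text; the dictionary, function name, and command strings are illustrative assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of the gesture-to-command lookup table described above:
# each recognized gesture signal maps to one car control command.
GESTURE_COMMANDS = {
    "left": "forward",
    "right": "backward",
    "up": "turn_left",
    "down": "turn_right",
    "fist": "stop",
}

def command_for(gesture):
    """Retrieve the control command for a gesture signal, or None if unknown."""
    return GESTURE_COMMANDS.get(gesture)
```

An unrecognized gesture simply yields no command, so the car keeps its current state rather than acting on noise.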
Step S30: controlling the motion of the robot car according to the gesture control command.
After the car controller obtains the gesture control command based on the gesture signal, it controls the car to perform the corresponding action. Each of the car's two wheels is driven by its own motor; by controlling the directions of rotation of the two motors, the car can be driven to perform the corresponding actions.
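The two-motor drive described above can be illustrated as a differential-drive mapping: each command becomes a pair of motor directions (1 = forward, -1 = reverse, 0 = stopped). The specific direction pairs below are assumptions for illustration; the patent does not specify them.

```python
# Assumed differential-drive mapping: (left motor, right motor) directions.
MOTOR_DIRECTIONS = {
    "forward": (1, 1),
    "backward": (-1, -1),
    "turn_left": (-1, 1),   # left wheel reverses, right wheel advances
    "turn_right": (1, -1),  # mirror of turn_left
    "stop": (0, 0),
}

def motor_command(action):
    """Return the (left, right) motor directions for a control command."""
    return MOTOR_DIRECTIONS[action]
```

Driving the wheels in opposite directions pivots the car in place, which is why two independently controlled motors suffice for all five actions.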
In this embodiment, by identifying the starting position and the ending position of the palm, determining the palm's gesture signal from those positions, retrieving the gesture control command corresponding to the gesture signal, and finally controlling the car's motion according to the gesture command, palm gestures are recognized in a simple and accurate manner to control the motion of the car, solving the problem of reduced recognition rate caused by interference from environmental factors such as illumination and background in existing dynamic gesture detection. By judging the jitter range based on gray values, the invention can achieve continuous gesture recognition within 2 to 20 meters.
Further, a second embodiment of the gesture recognition-based robot car control method is based on the above first embodiment. In this embodiment, determining the starting position of the palm specifically comprises: determining the palm position according to the palm joint points and determining, within a preset time, whether the jitter range of the palm is within a preset value; when the jitter range of the palm is within the preset value, the gesture is recognized as valid, and the center point of the palm's jitter range is identified as the starting position of the palm.
In this embodiment, identifying the position of the palm specifically involves identifying the joint points of the palm (which can be done using off-the-shelf library functions) and determining the position of the palm center (a single point) from those joint points. Before the starting position is identified, it is also necessary to check whether the gesture is valid, i.e., to perform de-jitter processing, by determining within a preset time whether the jitter range of the palm is within a preset value; if the jitter stays within the preset value during the preset time, the gesture is recognized as valid. For example, if within a preset time of 0.5 seconds the jitter does not exceed a preset value of 5 centimeters, the gesture is recognized as valid, and the center point of the region occupied by the palm during the preset time is identified as the starting position of the palm.
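The de-jitter check above (jitter within 5 cm over the preset window, start position taken as the center of the occupied region) can be sketched as follows. The list-of-samples representation and function name are assumptions; the text only specifies the thresholds and the center-point rule.

```python
def detect_start(samples, max_jitter_cm=5.0):
    """Given palm-center samples (x, y) in cm collected over the preset
    time window, return the starting position (center of the occupied
    region) if jitter stays within max_jitter_cm, else None."""
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    if max(xs) - min(xs) > max_jitter_cm or max(ys) - min(ys) > max_jitter_cm:
        return None  # too much movement: not a valid stationary start
    # center point of the region the palm occupied during the window
    return ((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0)
```

A hand held roughly still passes the check and yields its region's center; a moving hand is rejected until it settles.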
Further, a third embodiment of the gesture recognition-based robot car control method is based on the above first embodiment. In this embodiment, determining the ending position of the palm specifically comprises: after the starting position of the palm is identified, delaying for a first preset time; after the first preset time has elapsed, determining within a second preset time whether the first jitter range of the palm is within the first preset value; and, when the first jitter range of the palm is within the first preset value, identifying the center point of the region occupied by the palm during the second preset time as the ending position of the palm.
The third embodiment differs from the second in that, after the starting position of the palm is identified, a delay of a first preset time is required, during which no recognition processing is performed. After the first preset time has elapsed, it is determined within a second preset time whether the first jitter range of the palm is within the first preset value; if so, the gesture is recognized as valid. For example, after the starting position is identified, recognition is delayed for a first preset time of 0.5 seconds; once that time has elapsed, it is judged within a second preset time of 0.5 seconds whether the first jitter range of the palm exceeds a first preset value of 5 centimeters; if it does not, the center point of the region occupied by the palm during the second preset time of 0.5 seconds is identified as the ending position. The purpose of the first-preset-time delay is that, after the starting position is identified, the subsequent palm gesture is still in continuous motion, so the system cannot immediately judge whether the gesture has ended. An appropriate delay, such as 0.5 seconds, is chosen according to the gesture speed of different users; judging the palm jitter only after that delay makes the determination of the ending position accurate.
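The delay-then-check sequence above can be sketched as a simple timeline: ignore all samples during the first preset window, then apply the same jitter test to the second window. The timestamped-sample format and function name are assumptions for illustration.

```python
def detect_end(samples_by_time, start_t, first_delay=0.5, window=0.5,
               max_jitter_cm=5.0):
    """samples_by_time: list of (t, x, y) palm-center samples in cm.
    Skip everything during the first preset time after start_t, then
    test jitter over the second preset window; return the ending
    position (center of the occupied region) or None if still moving."""
    w0 = start_t + first_delay
    window_pts = [(x, y) for (t, x, y) in samples_by_time
                  if w0 <= t < w0 + window]
    if not window_pts:
        return None
    xs = [p[0] for p in window_pts]
    ys = [p[1] for p in window_pts]
    if max(xs) - min(xs) > max_jitter_cm or max(ys) - min(ys) > max_jitter_cm:
        return None  # palm still moving: gesture has not ended yet
    return ((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0)
```

Samples before 0.5 s are discarded, so mid-gesture motion cannot be mistaken for the end position.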
Further, a fourth embodiment of the gesture recognition-based robot car control method is based on the above second or third embodiment. In this embodiment, the jitter of the palm is determined by judging the change in palm displacement according to the gray value of the palm.
Since depth information of objects can be acquired by the Kinect infrared sensor, once the user's preset action has identified the region where the palm is located, the gray value parameter of the image of that region, i.e., the depth information representing the distance between the palm and the Kinect sensor, can further be recognized. From this depth information, the approximate distance between the user's palm and the Kinect sensor can be determined, for example as distance intervals such as 2-3 meters, 3-4 meters, ..., 19-20 meters. From this distance, the preset on-screen image displacement corresponding to a preset actual palm displacement, such as 5 centimeters, can be computed. Once the preset gray value parameter, i.e., the depth information parameter, of the region where the user's palm is located has been obtained, the depth image of the gesture can be separated from the user's depth image using that depth information when recognizing the gesture; combining the depth information with the range over which the palm center jitters on screen then yields the actual jitter range of the palm. For example, after the palm center (a single point) is determined from the palm joints, the depth information may indicate that the palm is 10 meters from the lens; if within 0.5 s the palm's on-screen jitter is within 0.1 cm, it can be concluded that the actual palm jitter is within 5 cm, which is within the preset value, so the gesture is recognized as valid.
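The conversion in the example above (0.1 cm of on-screen jitter at a 10 m depth corresponding to roughly 5 cm of actual motion) implies a scale factor that grows linearly with depth. The sketch below assumes that simple proportional model and calibrates it from the example's own numbers; the constant and function names are illustrative.

```python
# Assumed proportional model: actual displacement scales linearly with depth.
# Calibrated from the text's example: 0.1 cm on screen at 10 m ~ 5 cm actual.
SCALE_PER_METER = 5.0 / (0.1 * 10.0)  # actual cm per screen cm, per meter depth

def screen_to_actual_cm(screen_cm, depth_m):
    """Estimate actual palm displacement from on-screen displacement."""
    return screen_cm * depth_m * SCALE_PER_METER

def is_valid_gesture(screen_jitter_cm, depth_m, max_jitter_cm=5.0):
    """Apply the de-jitter threshold to the depth-corrected displacement."""
    return screen_to_actual_cm(screen_jitter_cm, depth_m) <= max_jitter_cm
```

Because the threshold is applied to the depth-corrected value, the same 5 cm tolerance works whether the user stands 2 m or 20 m from the sensor.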
Further, a fifth embodiment of the gesture recognition-based robot car control method is based on the above first embodiment. In this embodiment, identifying the motion trajectory of the palm based on the starting position and the ending position specifically comprises: establishing a coordinate system based on the starting position of the palm, and identifying the motion trajectory of the palm by judging the coordinates of the starting position and the ending position.
The starting position of the palm is determined and taken as the origin of a coordinate system whose positive y-axis points down and whose positive x-axis points right. The differences between the horizontal and vertical coordinates of the ending point and the starting point (ending point minus starting point) are computed, and the motion trajectory of the palm is identified from these differences: if the vertical difference is positive and its absolute value is greater than that of the horizontal difference, the palm is recognized as moving up; if the vertical difference is negative and its absolute value is greater than that of the horizontal difference, the palm is recognized as moving down; if the horizontal difference is positive and its absolute value is greater than that of the vertical difference, the palm is recognized as moving right; and if the horizontal difference is negative and its absolute value is greater than that of the vertical difference, the palm is recognized as moving left.
It should be noted that when the coordinate system is established based on the starting position of the palm and the motion trajectory is identified from the coordinates of the starting and ending positions, for more accurate recognition the ending position is identified only after a delay of a first preset time, such as 0.5 s, from the determination of the starting position. For trajectories such as the palm moving up, down, right, or left, the palm is in continuous motion after the starting position is identified, so at the end of the first preset time the gesture has not yet finished and the palm's jitter range cannot yet be detected within the first preset value, i.e., the ending position cannot yet be detected; instead, when the jitter range of the palm is detected within the first preset value during the second preset time following the first, the center point of the region occupied by the palm during the second preset time is identified as the ending position, and the end of the second preset time marks the end of the gesture. For the fist-clenching action, by contrast, the palm does not move at all, so at the end of the first preset time the palm's jitter range is immediately found to be within the first preset value and the ending position is recognized at once. The fist-clenching action is thus identified by whether the ending position of the palm is recognized immediately at the end of the first preset time.
The present invention further provides a gesture recognition-based robot car control device.
Referring to FIG. 2, FIG. 2 is a schematic diagram of the functional modules of a first embodiment of the gesture recognition-based robot car control device of the present invention.
In this embodiment, the gesture recognition-based robot car control device comprises:
a determining module 1 for determining a starting position and an ending position of the palm;
a judging module 2 for identifying the motion trajectory of the palm based on the starting position and the ending position, and determining a gesture signal of the palm according to the motion trajectory;
a command generation module 3 for retrieving a gesture control command corresponding to the gesture signal according to the gesture signal; and
a control module 4 for controlling the motion of the robot car according to the gesture control command.
When the solution adopted by the present invention recognizes a gesture through an image sensor such as a Kinect, unlike conventional methods that analyze full palm images for gesture recognition, the invention only needs to identify the starting position and the ending position of the palm, from which the gesture direction signal of the palm can be determined.
Specifically, during gesture recognition, the starting point of the palm, i.e., the starting position, is first identified from the palm image. A planar rectangular coordinate system is established with the starting point as the origin. The ending position is determined by analyzing where the ending point falls in the coordinate system; combined with the origin, the current motion trajectory of the palm can be determined, from which the basic gesture of the palm is derived, including palm movement to the left, right, up, or down, or a fist-clenching action.
After the gesture is recognized by the Kinect sensor, the car controller retrieves the gesture control command corresponding to the gesture signal from memory. For example, five gesture signals — palm moving left, right, up, or down, and a fist clench — correspond respectively to the five car actions of forward, backward, turn left, turn right, and stop; the mapping can simply be obtained through a lookup table in memory.
After the car controller obtains the gesture control command based on the gesture signal, it controls the car to perform the corresponding action. Each of the car's two wheels is driven by its own motor; by controlling the directions of rotation of the two motors, the car can be driven to perform the corresponding actions.
In this embodiment, by identifying the starting position and the ending position of the palm, determining the palm's gesture signal from those positions, retrieving the gesture control command corresponding to the gesture signal, and finally controlling the car's motion according to the gesture command, palm gestures are recognized in a simple and accurate manner to control the motion of the car, solving the problem of reduced recognition rate caused by interference from environmental factors such as illumination and background in existing dynamic gesture detection. By judging the jitter range based on gray values, the invention can achieve continuous gesture recognition within 2 to 20 meters.
进一步的,基于手势识别的小车控制装置的第二实施例,基于上述本发明基于手势识别的小车控制装置第一实施例,在本实施例中,所述确定模块,具体用于根据手掌关节点确定手心位置,并在预设时间内判断所述手心的抖动范围是否在预设值内;当所述手心的抖动范围在所述预设值内时,识别为有效手势,则识别所述预设时间内手心所在区域的中心点为手心的起始位置。Further, the second embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention. In the embodiment, the determining module is specifically configured to be based on the palm joint point. Determining a position of the palm, and determining whether the range of the palm of the hand is within a preset value within a preset time; and identifying the effective gesture when the range of the jitter of the palm is within the preset value, identifying the pre- Set the center point of the area where the palm is located to the starting position of the palm.
本实施例中所述确定模块1识别手掌的关节点可以基于现成的库函数识别,根据手掌的关节点可确定手心(一个点)的位置,在识别手心的起始位置之前,还需要进行是否是有效的手势识别处理,即去抖动处理,具体是在预设时间内通过判断手心的抖动范围是否在预设值内,如果预设时间内手心的抖动范围在预设值内,则识别为有效手势。如具体的在预设时间0.5秒内判断抖动范围是否不超过预设值5厘米的抖动,如果是,则识别为有效手势,并识别所述预设时间内手心所在区域的中心点为手心的起始位置。In the embodiment, the determining module 1 recognizes that the joint point of the palm can be identified based on the ready-made library function, and the position of the palm (one point) can be determined according to the joint point of the palm. Before the starting position of the palm is recognized, whether It is effective gesture recognition processing, that is, de-jitter processing, specifically determining whether the jitter range of the palm is within a preset value within a preset time. If the jitter range of the palm is within a preset value within a preset time, it is recognized as Effective gestures. For example, it is determined whether the jitter range does not exceed the preset value of 5 cm in a predetermined time of 0.5 seconds, and if so, it is recognized as a valid gesture, and the center point of the region where the palm is located within the preset time is identified as a palm. starting point.
Further, in a third embodiment of the gesture-recognition-based car control device, based on the first embodiment of the device described above, the determining module is specifically configured to: after the starting position of the palm center is recognized, delay for a first preset time; once the first preset time has elapsed, judge within a second preset time whether a first jitter range of the palm center is within a first preset value; and when the first jitter range of the palm center is within the first preset value, recognize the center point of the region in which the palm center lies during the second preset time as the ending position of the palm center.
The third embodiment differs from the second in that, after the starting position of the palm center is recognized, a first preset time must elapse during which no recognition processing is performed. Only after the first preset time has elapsed is it judged, within a second preset time, whether the first jitter range of the palm center is within the first preset value; if it is, the gesture is recognized as valid. For example, after the starting position is recognized, processing is delayed by a first preset time of 0.5 seconds; once that timer expires, it is judged within a second preset time of 0.5 seconds whether the first jitter range exceeds a first preset value of 5 cm, and if it does not, the center point of the region in which the palm center lies during the second preset time is recognized as the ending position of the palm center. The purpose of the added delay is that, once the starting position has been recognized, the palm is still moving through the gesture, so a test for the end of the gesture cannot follow immediately. Choosing a delay suited to the gesture speed of different users, such as 0.5 seconds, and only then testing for palm-center jitter makes the judgment of the ending position accurate.
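The delay-then-test procedure above can be sketched as a function over timestamped palm-center samples. This is a minimal sketch under assumed conventions: timestamps are seconds since the starting position was recognized, positions are in centimeters, and the same centroid-based jitter test as before is reused; none of these details are prescribed by the patent.

```python
import math

def detect_end_position(samples, delay_s=0.5, window_s=0.5, max_jitter_cm=5.0):
    """samples: list of (t, x, y) tuples, t in seconds since the start
    position was recognized. The first delay_s is skipped (no processing),
    then the de-jitter test is applied to the following window_s; if the
    palm is still, the window's centroid is the ending position."""
    window = [(x, y) for t, x, y in samples if delay_s <= t < delay_s + window_s]
    if not window:
        return None
    cx = sum(x for x, _ in window) / len(window)
    cy = sum(y for _, y in window) / len(window)
    if all(math.hypot(x - cx, y - cy) <= max_jitter_cm for x, y in window):
        return (cx, cy)   # center point of the second window = end position
    return None           # hand still moving: keep sampling
```

A hand that settles at one point during the second window yields that point; a hand still sweeping through the window yields `None`.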
Further, in a fourth embodiment of the gesture-recognition-based car control device, based on the second or third embodiment of the device described above, the jitter of the palm center is judged by determining the change in the displacement of the palm center from the gray values of the palm.
Since the Kinect infrared sensor can acquire depth information of an object, when the user's preset action reveals the region in which the palm lies, the gray-value parameter of the image of that region can further be recognized; this is its depth information, i.e. the distance between the palm and the Kinect sensor. From the depth information, the approximate distance between the user's palm and the Kinect sensor can be determined, for example as distance intervals such as 2-3 m, 3-4 m, ..., 19-20 m. From this distance, the preset on-screen image displacement that corresponds to a preset physical palm movement, such as 5 cm, can be derived. Once the preset gray-value parameter (the depth information parameter) of the region containing the user's palm has been acquired, the depth image of the gesture can be separated from the user's depth image when recognizing the palm gesture, and by combining the depth information with the on-screen jitter range of the palm center, the physical jitter range of the palm center over time can be determined. For example, after the palm center (a single point) has been determined from the palm joints, the depth information may show that the palm is 10 meters from the lens; if within 0.5 s the palm's on-screen jitter is within 0.1 cm, the actual palm-center jitter can be judged to be within 5 cm, which is within the preset value, so the gesture is recognized as valid.
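The depth-based scaling in the example above can be sketched as follows. The patent does not give the conversion formula; this sketch assumes the real-world displacement is proportional to the on-screen displacement times the depth, and the constant `ref_scale` is chosen purely to reproduce the worked example (0.1 cm on screen at 10 m corresponds to about 5 cm of real movement). All names and constants are illustrative.

```python
def screen_to_real_jitter(screen_jitter_cm, depth_m, ref_depth_m=10.0, ref_scale=50.0):
    """Convert an on-screen palm displacement to an approximate real-world
    displacement using the Kinect depth reading. The scale factor is assumed
    proportional to depth; ref_scale=50 reproduces the patent's example."""
    return screen_jitter_cm * ref_scale * (depth_m / ref_depth_m)

def is_valid_gesture(screen_jitter_cm, depth_m, max_real_jitter_cm=5.0):
    """De-jitter test in real-world units: valid when the estimated actual
    palm-center movement stays within the preset value (5 cm)."""
    return screen_to_real_jitter(screen_jitter_cm, depth_m) <= max_real_jitter_cm
```

At the same on-screen jitter, a palm closer to the lens yields a smaller estimated real displacement, so the nearer hand is more readily accepted as stationary.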
Further, in a fifth embodiment of the gesture-recognition-based car control device, based on the first embodiment of the device described above, identifying the motion trajectory of the palm based on the starting and ending positions of the palm center specifically comprises:
establishing a coordinate system based on the starting position of the palm center, and identifying the motion trajectory of the palm by examining the coordinates of the starting position and the ending position.
After the starting position of the palm center has been determined, a coordinate system is established with the starting position as the origin, the positive y-axis pointing downward and the positive x-axis pointing rightward. The differences between the horizontal and vertical coordinates of the end point and the start point are computed (end point minus start point), and the motion trajectory of the palm is identified from these differences: if the ordinate difference is positive and its absolute value is greater than that of the abscissa difference, the palm is recognized as moving upward; if the ordinate difference is negative and its absolute value is greater than that of the abscissa difference, the palm is recognized as moving downward; if the abscissa difference is positive and its absolute value is greater than that of the ordinate difference, the palm is recognized as moving rightward; and if the abscissa difference is negative and its absolute value is greater than that of the ordinate difference, the palm is recognized as moving leftward.
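The four-way classification above maps directly to a small function. A minimal sketch; the sign convention (y-axis positive downward, yet a dominant positive ordinate difference read as "palm up") is taken verbatim from the text, and the behavior when the two differences are equal in magnitude, which the text does not specify, is left as an assumed ambiguous case.

```python
def classify_gesture(start, end):
    """Classify the palm trajectory from the start->end displacement,
    using the mapping stated in the description (end point minus start
    point). Returns 'up', 'down', 'right', 'left', or None when the
    abscissa and ordinate differences are equal in magnitude."""
    dx = end[0] - start[0]   # abscissa difference
    dy = end[1] - start[1]   # ordinate difference
    if abs(dy) > abs(dx):
        return "up" if dy > 0 else "down"
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return None              # |dx| == |dy|: no dominant direction
```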
It should be noted that when a coordinate system is established at the starting position of the palm center and the trajectory is identified from the coordinates of the starting and ending positions, the ending position is recognized only after a first preset time, such as 0.5 s, has elapsed since the starting position was determined, so that recognition is more accurate. For the trajectories above (palm moving up, down, right, or left), the hand is in continuous motion once the starting position has been recognized; the gesture has not yet finished when the first preset time expires, so the palm-center jitter cannot yet be found to lie within the first preset value, i.e. the ending position cannot yet be detected. Instead, when the palm-center jitter is found to be within the first preset value during a second preset time after the first preset time, the center point of the region in which the palm center lies during that second preset time is recognized as the ending position, and the end of the second preset time is the end time of the gesture. For a fist-clenching action, by contrast, the palm does not move at all, so as soon as the first preset time expires the palm-center jitter is found to be within the first preset value and the ending position is recognized immediately. The fist-clenching action is therefore identified by whether the ending position is recognized immediately when the first preset time expires.
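The timing rule for telling a fist from a swipe can be sketched by scanning successive windows after the delay. This is a free interpretation under assumptions: timestamped samples in seconds and centimeters, non-overlapping windows, and the same centroid jitter test as before; the patent specifies only the principle, not these details.

```python
import math

def detect_gesture_kind(samples, delay_s=0.5, window_s=0.5, max_jitter_cm=5.0):
    """Distinguish a fist from a swipe by *when* the de-jitter test first
    succeeds after the start position: a fist's palm never moves, so the
    first window right after the delay already passes; a swipe is still in
    motion then and only passes in a later window. Returns 'fist', 'swipe',
    or None if the palm never settles within the sampled interval."""
    t = delay_s
    first_window = True
    t_max = max(s[0] for s in samples)
    while t + window_s <= t_max + 1e-9:
        window = [(x, y) for ts, x, y in samples if t <= ts < t + window_s]
        if window:
            cx = sum(x for x, _ in window) / len(window)
            cy = sum(y for _, y in window) / len(window)
            if all(math.hypot(x - cx, y - cy) <= max_jitter_cm for x, y in window):
                return "fist" if first_window else "swipe"
        first_window = False
        t += window_s
    return None
```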
In the embodiments of the gesture-recognition-based car control device above, combined with a specific car control system, the modules of the device may be arranged as follows:
The car control system detects image information of the user's palm through the Kinect sensor, can further detect the image gray-value information at the position of the palm center, and then recognizes the starting and ending positions of the palm center; the determining module 1 is therefore arranged in the Kinect sensor. After the Kinect sensor has recognized the starting and ending positions of the palm center, it sends the position information to a terminal device such as a PC; the Kinect sensor may be connected to the terminal device by wire. The terminal device judges the palm's gesture signal from the starting and ending positions, so the judging module 2 is arranged in the terminal device; alternatively, the Kinect sensor may itself analyze the starting and ending positions of the palm center, determine the gesture signal, and send it to the terminal device, in which case the judging module 2 is also arranged in the Kinect sensor. The terminal device retrieves the gesture control command corresponding to the gesture signal, so the command generation module 3 is arranged in the terminal device. Finally, the terminal device sends the gesture control command to the controller on the car by wireless transmission, such as Bluetooth, and the car's controller controls the motion of the car according to the gesture control command; the control module 4 is therefore arranged on the car.
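The command-generation step in the pipeline above amounts to a lookup from gesture signal to control command. The patent does not specify which gesture maps to which car command, so the table and names below are purely illustrative; only the module split (recognition upstream, a command lookup on the terminal, execution on the car) follows the description.

```python
# Hypothetical gesture-signal -> control-command table; the actual mapping
# is not given in the patent text.
GESTURE_COMMANDS = {
    "up": "FORWARD",
    "down": "BACKWARD",
    "left": "TURN_LEFT",
    "right": "TURN_RIGHT",
    "fist": "STOP",
}

def make_command(gesture_signal):
    """Command-generation module: retrieve the control command that
    corresponds to a gesture signal; unknown signals yield no command."""
    return GESTURE_COMMANDS.get(gesture_signal)
```

In the described system the returned command string would then be sent to the car's controller over a wireless link such as Bluetooth.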
The above are only preferred embodiments of the present invention and do not thereby limit the scope of the patent; any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

  1. A gesture-recognition-based car control method, characterized in that the control method comprises the following steps:
    determining a starting position and an ending position of the palm center, identifying a motion trajectory of the palm center based on its starting and ending positions, and judging a gesture signal of the palm according to the motion trajectory of the palm center;
    retrieving, based on the gesture signal, a gesture control command corresponding to the gesture signal;
    controlling the motion of the car according to the gesture control command.
  2. The gesture-recognition-based car control method according to claim 1, characterized in that determining the starting position of the palm center specifically comprises:
    determining the position of the palm center from the palm joint points, and judging within a preset time whether the jitter range of the palm center is within a preset value;
    when the jitter range of the palm center is within the preset value, recognizing the gesture as valid, and recognizing the center point of the region in which the palm center lies during the preset time as the starting position of the palm center.
  3. The gesture-recognition-based car control method according to claim 1, characterized in that determining the ending position of the palm center specifically comprises:
    after the starting position of the palm center is recognized, delaying for a first preset time;
    once the first preset time has elapsed, judging within a second preset time whether a first jitter range of the palm center is within a first preset value;
    when the first jitter range of the palm center is within the first preset value, recognizing the center point of the region in which the palm center lies during the second preset time as the ending position of the palm center.
  4. The gesture-recognition-based car control method according to claim 2 or 3, characterized in that
    the jitter of the palm center is judged by determining the change in the displacement of the palm center from the gray values of the palm.
  5. The gesture-recognition-based car control method according to claim 1, characterized in that identifying the motion trajectory of the palm center based on its starting and ending positions specifically comprises:
    establishing a coordinate system based on the starting position of the palm center, and identifying the motion trajectory of the palm center by examining the coordinates of the starting position and the ending position.
  6. A gesture-recognition-based car control device, characterized in that the gesture-recognition-based car control device comprises:
    a determining module, configured to determine a starting position and an ending position of the palm center;
    a judging module, configured to identify a motion trajectory of the palm center based on its starting and ending positions, and to judge a gesture signal of the palm according to the motion trajectory of the palm center;
    a command generation module, configured to retrieve a gesture control command corresponding to the gesture signal according to the gesture signal;
    a control module, configured to control the motion of the car according to the gesture control command.
  7. The gesture-recognition-based car control device according to claim 6, characterized in that
    the determining module is specifically configured to determine the position of the palm center from the palm joint points, and to judge within a preset time whether the jitter range of the palm center is within a preset value; when the jitter range of the palm center is within the preset value, the gesture is recognized as valid and the center point of the region in which the palm center lies during the preset time is recognized as the starting position of the palm center.
  8. The gesture-recognition-based car control device according to claim 6, characterized in that
    the determining module is specifically configured to delay for a first preset time after the starting position of the palm center is recognized; once the first preset time has elapsed, to judge within a second preset time whether a first jitter range of the palm center is within a first preset value; and when the first jitter range of the palm center is within the first preset value, to recognize the center point of the region in which the palm center lies during the second preset time as the ending position of the palm center.
  9. The gesture-recognition-based car control device according to claim 7 or 8, characterized in that the jitter of the palm center is judged by determining the change in the displacement of the palm center from the gray values of the palm.
  10. The gesture-recognition-based car control device according to claim 6, characterized in that
    the judging module is specifically configured to establish a coordinate system based on the starting position of the palm center, and to identify the motion trajectory of the palm by examining the coordinates of the starting position and the ending position.

Priority Applications (1)

Application Number: PCT/CN2017/077425 (WO2018170713A1); Priority/Filing Date: 2017-03-21; Title: Gesture recognition-based robot car control method and device


Publications (1)

Publication Number: WO2018170713A1

Family

ID=63584080


Country Status (1)

WO: WO2018170713A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193626A (en) * 2010-03-15 2011-09-21 欧姆龙株式会社 Gesture recognition apparatus, method for controlling gesture recognition apparatus, and control program
CN103390168A (en) * 2013-07-18 2013-11-13 重庆邮电大学 Intelligent wheelchair dynamic gesture recognition method based on Kinect depth information
JP2014085963A (en) * 2012-10-25 2014-05-12 Nec Personal Computers Ltd Information processing device, information processing method, and program
CN103869974A (en) * 2012-12-18 2014-06-18 现代自动车株式会社 System and method for effective section detecting of hand gesture
CN103941866A (en) * 2014-04-08 2014-07-23 河海大学常州校区 Three-dimensional gesture recognizing method based on Kinect depth image


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112926454A (en) * 2021-02-26 2021-06-08 重庆长安汽车股份有限公司 Dynamic gesture recognition method
CN112926454B (en) * 2021-02-26 2023-01-06 重庆长安汽车股份有限公司 Dynamic gesture recognition method


Legal Events

Code 121: EP - the EPO has been informed by WIPO that EP was designated in this application (ref document number: 17901760; country of ref document: EP; kind code: A1)

Code NENP: Non-entry into the national phase (ref country code: DE)

Code 122: EP - PCT application non-entry in European phase (ref document number: 17901760; country of ref document: EP; kind code: A1)