CN112115853A - Gesture recognition method and device, computer storage medium and electronic equipment - Google Patents

Gesture recognition method and device, computer storage medium and electronic equipment

Info

Publication number
CN112115853A
CN112115853A (application CN202010978670.6A)
Authority
CN
China
Prior art keywords
gesture
track
action
coordinate
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010978670.6A
Other languages
Chinese (zh)
Inventor
宁瑞芳
孙景峰
李�权
陈永辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Lingkong Electronic Technology Co Ltd
Original Assignee
Xian Lingkong Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Lingkong Electronic Technology Co Ltd
Priority to CN202010978670.6A
Publication of CN112115853A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

A gesture recognition method, a gesture recognition device, a computer storage medium and an electronic device belong to the field of image recognition. The method comprises the following steps: segmenting a hand region from a video frame containing a hand image by threshold segmentation, and detecting the gesture action; acquiring the centroid coordinates of the hand region; when the detected gesture matches a preset start action, recording the centroid coordinates of the hand region until the detected gesture matches a preset end action; and taking the recorded centroid coordinates of the hand region as valid trajectory coordinates, performing trajectory-type recognition on them, and providing the recognition result to a preset finger-reading device for a recognition response. The method effectively handles gesture recognition errors caused by trajectory variation when the same gesture is performed by different users, greatly improves the detection and recognition rate of dynamic gestures, offers general applicability and real-time performance, and can be used in daily life, industry, control and smart-home applications.

Description

Gesture recognition method and device, computer storage medium and electronic equipment
Technical Field
The invention belongs to the field of image recognition, and particularly relates to a gesture recognition method and device, a computer storage medium and electronic equipment.
Background
Current research on gesture recognition is mainly based on computer vision, and vision-based gesture recognition comprises two parts: static gesture recognition and dynamic gesture recognition. Static gesture recognition focuses on the hand shape: different meanings are expressed by different recognized hand shapes, and a static gesture corresponds to a single point in space without a motion trajectory. Dynamic gesture recognition deals with a series of gesture actions and mainly studies the continuous hand-shape and trajectory changes in a gesture video sequence. Dynamic gestures are variable during motion, and complex real-world environments further increase the difficulty of recognition. The variability lies in both the state and the trajectory of the hand: fingers may be extended, clenched into a fist, or bent, and even the same person may produce different trajectories when performing the same gesture. In a dynamic gesture recognition system, illumination changes, complex backgrounds and skin-color-like interference affect the recognition result. A dynamic gesture trajectory is usually composed of a series of consecutive points, so determining the starting point of a valid trajectory is critical: it decides whether the trajectory can be extracted completely, and an incorrect starting point lowers the trajectory recognition rate.
Disclosure of Invention
The present invention aims to solve the above problems and provides a gesture recognition method, a gesture recognition apparatus, a computer storage medium and an electronic device that incorporate the geometric characteristics of the trajectory.
In a first aspect, the present invention provides a gesture recognition method, comprising: segmenting a hand region from a video frame containing a hand image by threshold segmentation, and detecting the gesture action; acquiring the centroid coordinates of the hand region; when the detected gesture matches a preset start action, recording the centroid coordinates of the hand region until the detected gesture matches a preset end action; and taking the recorded centroid coordinates of the hand region as valid trajectory coordinates, performing trajectory-type recognition on them, and providing the recognition result to a preset finger-reading device for a recognition response.
Preferably, the trajectory types include a straight-line class and a curve class; the straight-line class includes straight-line trajectories and polyline trajectories; the curve class includes circular trajectories and S-shaped trajectories; and the circular trajectories include single-circle and multi-circle trajectories.
Preferably, the trajectory-type recognition process includes: if the sum of the distances between all adjacent points in the valid trajectory coordinates is approximately equal to the distance from the start point to the end point, the trajectory is a straight line; if the sum of the distances between adjacent points from the start point to a certain fixed point is approximately equal to the distance from the start point to that fixed point, while the sum of the distances between adjacent points after that fixed point is greater than the distance from the fixed point to the end point, the trajectory is a polyline; and if the sum of the distances between all adjacent points is greater than the distance from the start point to the end point, the trajectory belongs to the curve class.
Preferably, the trajectory-type recognition process includes: if the distances from all points in the valid trajectory coordinates to the start point first increase and then decrease after reaching a maximum, the trajectory is a circular trajectory; if these distances first increase, then decrease, and then increase again, the trajectory is an S-shaped trajectory.
Preferably, the trajectory-type recognition process includes: if the distances from all points in the valid trajectory coordinates to the start point repeatedly increase and then decrease after reaching a maximum, so that several maxima and minima appear, the trajectory is a multi-circle trajectory; the number of maxima is the same as the number of minima, and the number of circles is the same as the number of maxima.
Preferably, the recognition result includes an invalid gesture.
Preferably, the hand image has a specific color. The specific color is used when detecting the hand in the video frames, which avoids interference from other, non-operator hands in the video and makes the operator's hand easy to detect in every frame, so that the gesture can be segmented and its features extracted in the subsequent steps.
In a second aspect, the present invention provides a gesture recognition apparatus, comprising: an acquisition module, configured to segment a hand region from a video frame containing a hand image by threshold segmentation, detect the gesture action and acquire the centroid coordinates of the hand region; a judging module, configured to detect the gesture action and, when the detected gesture matches a preset start action, record the centroid coordinates of the hand region until the detected gesture matches a preset end action; and a recognition module, configured to take the recorded centroid coordinates of the hand region as valid trajectory coordinates, perform trajectory-type recognition on them, and provide the recognition result to a preset finger-reading device for a recognition response.
In a third aspect, the present invention provides a computer storage medium having a computer program stored thereon; the computer program when executed by a processor implements the gesture recognition method of the first aspect described above.
In a fourth aspect, the present invention provides a gesture recognition electronic device, including a processor and a memory; the memory is used for storing executable instructions of the processor; the processor is configured to perform the gesture recognition method of the first aspect described above via execution of the executable instructions.
One or more technical solutions in the embodiments of the present invention have at least one or more of the following advantages and positive effects: a hand region is segmented from a video frame containing a hand image by threshold segmentation and the gesture action is detected; the centroid coordinates of the hand region are acquired; when the detected gesture matches a preset start action, the centroid coordinates of the hand region are recorded until the detected gesture matches a preset end action; and the recorded centroid coordinates of the hand region are taken as valid trajectory coordinates, trajectory-type recognition is performed on them, and the recognition result is provided to a preset finger-reading device for a recognition response. The method effectively handles gesture recognition errors caused by trajectory variation when the same gesture is performed by different users, greatly improves the detection and recognition rate of dynamic gestures, offers general applicability and real-time performance, and can be applied in daily life, industry, control and smart-home scenarios.
Drawings
FIG. 1 is a general block diagram of a gesture recognition method according to the present invention;
FIG. 2 is a detailed flow chart of the gesture recognition method of the present invention;
FIG. 3 is a diagram of a start gesture in accordance with an embodiment of the present invention;
FIG. 4 is a diagram of an end gesture according to an embodiment of the present invention;
FIG. 5 is a schematic view of a hand region according to an embodiment of the present invention;
FIG. 6 is an S-shaped trajectory diagram according to an embodiment of the present invention;
FIG. 7 is a circular trajectory diagram according to an embodiment of the present invention;
FIG. 8 is a double-circle trajectory diagram according to an embodiment of the present invention;
FIG. 9 is a polyline trajectory diagram according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of the distance variation trend in a circular trajectory according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of the distance variation trend in an S-shaped trajectory according to an embodiment of the present invention.
Detailed Description
The gesture recognition method, the gesture recognition apparatus, the computer storage medium, and the electronic device according to the present invention are described in detail with reference to the accompanying drawings and embodiments.
The example embodiments may be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, apparatus, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a", "an", "the" and "said" are used in this specification to indicate the presence of one or more elements/components/parts; the terms "comprising" and "having" are open-ended and mean that there may be additional elements/components/parts other than those listed.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
The general idea of the technical solution provided by the invention is as follows: as shown in FIG. 2, after a camera frame image is obtained, the frame image is pre-processed (image enhancement and similar operations) so that the hand in the frame appears more clearly and the hand region can be extracted from the hand features in the next step; the hand region is then extracted from the enhanced image, and once it has been extracted, the centroid coordinates of the hand region shown in FIG. 5 are obtained. At the same time, the method checks whether a gesture-start sign is present; once the start sign has been detected, subsequent frames are checked for the end sign, and the detected hand centroid coordinates are stored until the gesture-end sign is detected. Finally, the stored valid centroid coordinates of the hand are processed as valid trajectory point coordinates and trajectory classification and recognition are performed on them.
Example one
The present disclosure first provides a gesture recognition method which, as shown in FIG. 1, comprises the following steps:
Step S110: a hand region is segmented from a video frame containing a hand image by threshold segmentation, and the gesture action is detected; the centroid coordinates of the hand region are acquired. The hand image has a specific color: to avoid interference from other, non-operator hands in the video, the operator performing the gesture wears a glove of a specific color (red in this embodiment), so that the operator's hand is easily detected in every frame of the video and the gesture can be segmented and its features extracted in the subsequent steps. Step S120: when the detected gesture matches the preset start action, the centroid coordinates of the hand region are recorded until the detected gesture matches the preset end action. When acquiring the trajectory of each gesture, the start action shown in FIG. 3 and the end action shown in FIG. 4 are added to avoid interference from unintended hand motions during execution, so that the valid part of the gesture is extracted accurately and completely from the whole motion;
Step S130: the recorded centroid coordinates of the hand region are taken as valid trajectory coordinates, trajectory-type recognition is performed on them, and the recognition result, which may also be an invalid gesture, is provided to a preset finger-reading device for a recognition response.
The specific implementation of step S110 in this embodiment is as follows:
step S111: extracting gestures in the video frame;
the method comprises the steps of obtaining a video frame containing a hand image through a camera, preprocessing and enhancing the image, then carrying out color space conversion, converting the image into a color space with obvious red characteristic expression, and then carrying out gesture segmentation. And segmenting a hand gesture area in the picture with obvious red characteristic representation by using a threshold segmentation method, performing morphological operation on the segmented gesture area so as to eliminate interference of surrounding noise points and fill foreground image holes, then performing contour search and extraction on the picture, and finding out the area with the largest foreground in the picture, namely the foreground of the gesture.
Step S112: recognizing the start gesture;
Based on the acquired gesture foreground, the outermost contour of the foreground is found by contour search, the convex hull of the contour is computed, and the number of acute angles of the convex hull is counted to determine whether the current gesture is the start gesture; if it is, the centroid of the current foreground gesture is computed and used as the first point of the gesture trajectory;
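A minimal Python sketch of this start-gesture check, counting acute angles among the convex-hull vertices and computing the centroid from image moments. The 60-degree angle threshold and the expectation of five sharp angles for the start gesture are assumptions for illustration, since the patent does not give concrete values.

import cv2
import numpy as np

def count_sharp_hull_angles(contour, max_angle_deg=60.0):
    """Count convex-hull vertices whose interior angle is acute (illustrative threshold)."""
    hull = cv2.convexHull(contour)                 # hull vertices, shape (k, 1, 2)
    pts = hull.reshape(-1, 2).astype(float)
    k = len(pts)
    if k < 3:
        return 0
    sharp = 0
    for i in range(k):
        prev_pt, cur, nxt = pts[i - 1], pts[i], pts[(i + 1) % k]
        v1, v2 = prev_pt - cur, nxt - cur
        denom = np.linalg.norm(v1) * np.linalg.norm(v2)
        if denom == 0:
            continue
        cos_a = np.clip(np.dot(v1, v2) / denom, -1.0, 1.0)
        if np.degrees(np.arccos(cos_a)) < max_angle_deg:
            sharp += 1
    return sharp

def hand_centroid(contour):
    """Centroid of the hand region computed from image moments."""
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def is_start_gesture(contour, expected_sharp_angles=5):
    # The patent decides start/end gestures by the number of acute hull angles;
    # expecting 5 sharp angles (an open hand with spread fingers) is an assumption.
    return count_sharp_hull_angles(contour) == expected_sharp_angles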
the specific implementation of step S120 in this embodiment is as follows:
After the first trajectory point has been stored, the above operations continue and each newly computed trajectory point is stored, while the method of step S112 is used to check whether the end gesture has appeared. If the end gesture is detected, the computation and storage of trajectory points stop and the flow proceeds to trajectory classification; if the end gesture is not recognized, the detection and recognition of steps S111 and S112 continue on the next frame;
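A minimal Python sketch of this recording loop, assuming the per-frame start/end detection and centroid computation of steps S111 and S112 are already available and are supplied as (is_start, is_end, centroid) tuples; the function name collect_valid_track is illustrative.

def collect_valid_track(frame_results):
    """frame_results: iterable of (is_start, is_end, centroid) tuples, one per video frame.
    Returns the list of centroids recorded from the start gesture up to (but not
    including) the end gesture, or None if no complete gesture was observed."""
    recording = False
    track = []
    for is_start, is_end, centroid in frame_results:
        if not recording:
            if is_start and centroid is not None:
                recording = True
                track.append(centroid)     # first point of the gesture trajectory
        else:
            if is_end:
                return track               # end gesture: stop recording trajectory points
            if centroid is not None:
                track.append(centroid)
    return None                            # end gesture was never detected

The returned list of centroids is the valid trajectory that is passed to the classification of step S130.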
the specific implementation of step S130 in this embodiment is as follows: first, whether the trajectory is a straight line class or a curved line class is distinguished: if the sum of the two adjacent points is approximately equal to the distance between the starting point and the end point, the points are in a straight line type, and the straight line type algorithm identification flow is entered; the effective trajectory coordinates of the two adjacent points are determined by the following steps, wherein a certain point after the starting point in the effective trajectory coordinates satisfies the characteristics of straight lines, that is, the sum of all the adjacent two points is approximately equal to the distance from the starting point to the end point, and the sum of the adjacent two points after the certain point is greater than the distance from the fixed point to the end point, and the effective trajectory coordinates have the two characteristics of a polyline trajectory as shown in fig. 9. If the sum ratio of the two adjacent points is greater than the distance between the starting point and the end point, the curve class is determined, and the algorithm identification process of the curve class is entered. For the straight line type, the direction of the straight line is judged according to the coordinates of the starting point and the coordinates of the end point, and whether the straight line is vertical, horizontal, lower right to upper left or lower left to upper right is judged according to the offset of the x direction and the y direction. For curve type, calculating the distance between each point and the starting point, judging the increasing trend of the distance by using the calculated distance, if the distance is increased to the maximum and then decreased, the curve type is a circle type, if only one minimum value point is a single-circle track as shown in FIG. 7, and if two minimum value points are provided, the curve type is a double-circle track as shown in FIG. 8; if the distances between all points and the starting point are in accordance with the trend of increasing, decreasing and increasing, then the S-shaped locus is shown in FIG. 6.
For verification, consider the following example. If point A has coordinates $(x_1, y_1)$ and point B has coordinates $(x_2, y_2)$, the distance between A and B is
$D_{AB} = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$
If there are $n$ points with coordinates $(x_1, y_1), (x_2, y_2), (x_3, y_3), \ldots, (x_m, y_m), \ldots, (x_n, y_n)$, the distances between adjacent points are $D_{12}, D_{23}, D_{34}, \ldots, D_{(n-1)n}$, the distance between the first point $(x_1, y_1)$ and the last point $(x_n, y_n)$ is $D_{1n}$, and $D_{1m}$ and $D_{mn}$ are defined analogously. From the basic characteristics of a straight line, the following formula holds:
$D_{12} + D_{23} + D_{34} + \ldots + D_{(n-1)n} \approx D_{1n}$
From the basic characteristics of a polyline, the following formulas hold:
$D_{12} + D_{23} + D_{34} + \ldots + D_{(m-1)m} \approx D_{1m}$
$D_{m(m+1)} + D_{(m+1)(m+2)} + \ldots + D_{(n-1)n} \geq D_{mn}$
From the basic characteristics of a curve, the following formula holds:
$D_{12} + D_{23} + D_{34} + \ldots + D_{(n-1)n} \geq D_{1n}$
As shown in FIG. 10, the distances from the start point in a circular trajectory follow the trend:
$D_{12} < D_{13} < D_{14} < D_{15} < D_{16} > D_{17} > D_{18} > D_{19} > D_{1,10}$
As shown in FIG. 11, the distances from the start point in an S-shaped trajectory follow the trend:
$D_{12} < D_{13} < D_{14} > D_{15} > D_{16} > D_{17} > D_{18} < D_{19} < D_{1,10} < D_{1,11} < D_{1,12}$
By analogy, other more complex trajectories can be recognized and classified from their own characteristic distance relations. For multi-circle trajectories, the number of maxima equals the number of minima and the number of circles equals the number of maxima. A minimal classification sketch based on these relations follows.
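The sketch below (Python) classifies a valid centroid trajectory using the distance relations above; the tolerance factor, the minimum point count and the simple local-extremum counting (with no smoothing of the distance sequence, which a real implementation would likely need for noisy centroid tracks) are assumptions for illustration.

import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def path_length(points):
    # Sum of the distances between all adjacent points.
    return sum(dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def classify_track(points, tol=1.15):
    """Classify a valid centroid trajectory as line / polyline / circle(s) / s-shape."""
    if len(points) < 5:
        return "invalid"
    start, end = points[0], points[-1]

    # Straight line: total path length roughly equals the start-to-end distance.
    if path_length(points) <= tol * max(dist(start, end), 1e-6):
        return "line"

    # Polyline: some corner point m splits the track into two nearly straight legs.
    for m in range(2, len(points) - 2):
        leg1_straight = path_length(points[:m + 1]) <= tol * max(dist(start, points[m]), 1e-6)
        leg2_straight = path_length(points[m:]) <= tol * max(dist(points[m], end), 1e-6)
        if leg1_straight and leg2_straight:
            return "polyline"

    # Curve class: examine the trend of the distance from the start point to every later point.
    d = [dist(start, p) for p in points[1:]]
    maxima = sum(1 for i in range(1, len(d) - 1) if d[i - 1] < d[i] > d[i + 1])
    minima = sum(1 for i in range(1, len(d) - 1) if d[i - 1] > d[i] < d[i + 1])

    # S-shape: increase, decrease, then increase again (the distance is still growing at the end).
    if d[-1] > d[-2] and maxima == minima >= 1:
        return "s-shape"
    # Circle(s): the distance rises to a maximum and falls back; one maximum per loop.
    if maxima == 1:
        return "single circle"
    if maxima > 1:
        return "multi-circle (%d loops)" % maxima
    return "unknown"

For example, centroids sampled along a single loop are reported as a single circle, while centroids following an S-curve are reported as an S-shape.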
Example two
Based on the same inventive concept as the gesture recognition method of the first embodiment, this embodiment discloses a gesture recognition apparatus. The gesture recognition apparatus of the embodiment of the disclosure comprises an acquisition module, a judging module and a recognition module.
In the embodiment of the disclosure, the acquisition module is configured to segment a hand region from a video frame containing a hand image by threshold segmentation, detect the gesture action and acquire the centroid coordinates of the hand region.
In the embodiment of the disclosure, the judging module is configured to detect the gesture action and, when the detected gesture matches the preset start action, record the centroid coordinates of the hand region until the detected gesture matches the preset end action.
The recognition module is configured to take the recorded centroid coordinates of the hand region as valid trajectory coordinates, perform trajectory-type recognition on them, and provide the recognition result to the preset finger-reading device for a recognition response.
The specific examples of the foregoing embodiment also apply to the gesture recognition apparatus of this embodiment. Through the detailed description of the gesture recognition method above, a person skilled in the art can clearly understand how the gesture recognition apparatus of this embodiment is implemented, so for brevity it is not described again here.
Example three
Based on the same inventive concept as the gesture recognition method of the first embodiment, this embodiment further discloses a computer storage medium on which a computer program is stored. In the embodiment of the disclosure, when the computer program is executed by a processor, it implements the gesture recognition method of the first embodiment. The execution process comprises: after a camera frame image is obtained, pre-processing the frame image (image enhancement and similar operations) so that the hand appears more clearly and the hand region can be extracted from the hand features; extracting the hand region from the enhanced image and obtaining its centroid coordinates; at the same time, checking whether a gesture-start sign is present and, once the start sign has been detected, checking subsequent frames for the end sign while storing the detected hand centroid coordinates until the gesture-end sign is detected; and finally, processing the stored valid centroid coordinates as valid trajectory point coordinates and performing trajectory classification and recognition on them.
Example four
Based on the same inventive concept as the gesture recognition method of the first embodiment, this embodiment also discloses a gesture recognition electronic device comprising a processor and a memory, the memory storing executable instructions of the processor. The electronic device takes the form of a general-purpose computing device; its components may include, but are not limited to, at least one processor, at least one memory and a bus connecting the various system components. In a specific implementation, the processor is configured to execute the gesture recognition method of the first embodiment by executing the executable instructions.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, a system or a computer program product. The embodiments of the present disclosure can therefore be implemented in software, or in software combined with the necessary hardware. Accordingly, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions that enable a computing device (a personal computer, a server, a terminal device, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A gesture recognition method, comprising:
segmenting a hand region from a video frame containing a hand image by a threshold segmentation method, and detecting the gesture action; acquiring the centroid coordinates of the hand region;
when the detected gesture action matches a preset start action, recording the centroid coordinates of the hand region until the detected gesture action matches a preset end action; and
taking the recorded centroid coordinates of the hand region as valid trajectory coordinates, performing trajectory-type recognition on the valid trajectory coordinates, and providing the recognition result to a preset finger-reading device for a recognition response.
2. The gesture recognition method according to claim 1, wherein the trajectory types comprise a straight-line class and a curve class; the straight-line class comprises straight-line trajectories and polyline trajectories; the curve class comprises circular trajectories and S-shaped trajectories; and the circular trajectories comprise single-circle trajectories and multi-circle trajectories.
3. The gesture recognition method according to claim 1, wherein the trajectory-type recognition process comprises: if the sum of the distances between all adjacent points in the valid trajectory coordinates is approximately equal to the distance from the start point to the end point, the trajectory is a straight line; if the sum of the distances between adjacent points from the start point to a fixed point is approximately equal to the distance from the start point to the fixed point, and the sum of the distances between adjacent points after the fixed point is greater than the distance from the fixed point to the end point, the trajectory is a polyline; and if the sum of the distances between all adjacent points is greater than the distance from the start point to the end point, the trajectory belongs to the curve class.
4. The gesture recognition method according to claim 3, wherein the trajectory-type recognition process comprises: if the distances from all points in the valid trajectory coordinates to the start point first increase and then decrease after reaching a maximum, the trajectory is a circular trajectory; and if the distances first increase, then decrease, and then increase again, the trajectory is an S-shaped trajectory.
5. The gesture recognition method according to claim 4, wherein the trajectory-type recognition process comprises: if the distances from all points in the valid trajectory coordinates to the start point repeatedly increase and then decrease after reaching a maximum, producing several maxima and minima, the trajectory is a multi-circle trajectory; the number of maxima is the same as the number of minima; and the number of circles is the same as the number of maxima.
6. The gesture recognition method according to claim 1, wherein: the recognition result includes an invalid gesture.
7. The gesture recognition method according to claim 1, wherein: the hand image is provided with a specific color.
8. A gesture recognition apparatus, comprising:
an acquisition module, configured to segment a hand region from a video frame containing a hand image by a threshold segmentation method, detect the gesture action and acquire the centroid coordinates of the hand region;
a judging module, configured to detect the gesture action and, when the detected gesture action matches a preset start action, record the centroid coordinates of the hand region until the detected gesture action matches a preset end action; and
a recognition module, configured to take the recorded centroid coordinates of the hand region as valid trajectory coordinates, perform trajectory-type recognition on the valid trajectory coordinates, and provide the recognition result to a preset finger-reading device for a recognition response.
9. A computer storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the gesture recognition method according to any one of claims 1-7.
10. A gesture recognition electronic device, characterized in that: comprises a processor and a memory; the memory is used for storing executable instructions of the processor; the processor is configured to perform the gesture recognition method of any one of claims 1-7 via execution of the executable instructions.
CN202010978670.6A 2020-09-17 2020-09-17 Gesture recognition method and device, computer storage medium and electronic equipment Pending CN112115853A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010978670.6A CN112115853A (en) 2020-09-17 2020-09-17 Gesture recognition method and device, computer storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010978670.6A CN112115853A (en) 2020-09-17 2020-09-17 Gesture recognition method and device, computer storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN112115853A true CN112115853A (en) 2020-12-22

Family

ID=73803212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010978670.6A Pending CN112115853A (en) 2020-09-17 2020-09-17 Gesture recognition method and device, computer storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112115853A (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101614552A (en) * 2009-07-24 2009-12-30 深圳市凯立德计算机系统技术有限公司 Gesture command input judging method of navigation system and device
US20110221974A1 (en) * 2010-03-11 2011-09-15 Deutsche Telekom Ag System and method for hand gesture recognition for remote control of an internet protocol tv
CN102226880A (en) * 2011-06-03 2011-10-26 北京新岸线网络技术有限公司 Somatosensory operation method and system based on virtual reality
CN102495692A (en) * 2011-12-08 2012-06-13 青岛海信移动通信技术股份有限公司 Method and electronic product for recognizing gesture operations of user
CN103513759A (en) * 2012-06-21 2014-01-15 富士通株式会社 Method and device for recognizing gesture tracks
CN103207674A (en) * 2013-03-21 2013-07-17 苏州展科光电科技有限公司 Electronic demonstration system based on motion sensing technology
CN103218167A (en) * 2013-04-02 2013-07-24 长安大学 Single-point touch gesture pattern recognition method for vehicle-mounted terminal
CN104516649A (en) * 2013-09-28 2015-04-15 南京专创知识产权服务有限公司 Intelligent cell phone operating technology based on motion-sensing technology
CN104714637A (en) * 2013-12-16 2015-06-17 纬创资通股份有限公司 Polygonal gesture detection and interaction method, device and computer program product
CN104978010A (en) * 2014-04-03 2015-10-14 冠捷投资有限公司 Three-dimensional space handwriting trajectory acquisition method
CN104571823A (en) * 2015-01-12 2015-04-29 济南大学 Non-contact virtual human-computer interaction method based on smart television set
CN104866826A (en) * 2015-05-17 2015-08-26 华南理工大学 Static gesture language identification method based on KNN algorithm and pixel ratio gradient features
CN104991687A (en) * 2015-06-09 2015-10-21 惠州Tcl移动通信有限公司 Method and system for acquiring curve operating track of touch-screen device
CN108256421A (en) * 2017-12-05 2018-07-06 盈盛资讯科技有限公司 A kind of dynamic gesture sequence real-time identification method, system and device
CN108446032A (en) * 2017-12-28 2018-08-24 安徽慧视金瞳科技有限公司 A kind of mouse gestures implementation method in projection interactive system
CN110232308A (en) * 2019-04-17 2019-09-13 浙江大学 Robot gesture track recognizing method is followed based on what hand speed and track were distributed
CN110287894A (en) * 2019-06-27 2019-09-27 深圳市优象计算技术有限公司 A kind of gesture identification method and system for ultra-wide angle video
CN110633666A (en) * 2019-09-10 2019-12-31 江南大学 Gesture track recognition method based on finger color patches

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112581279A (en) * 2020-12-25 2021-03-30 深圳市富途网络科技有限公司 Ordering method of desktop financial software and related product
CN112581279B (en) * 2020-12-25 2024-03-19 深圳市富途网络科技有限公司 Ordering method of desktop financial software and related products
CN112906563A (en) * 2021-02-19 2021-06-04 山东英信计算机技术有限公司 Dynamic gesture recognition method, device and system and readable storage medium
CN113269025A (en) * 2021-04-01 2021-08-17 广州车芝电器有限公司 Automatic alarm method and system
CN113269025B (en) * 2021-04-01 2024-03-26 广州车芝电器有限公司 Automatic alarm method and system

Similar Documents

Publication Publication Date Title
RU2711029C2 (en) Touch classification
CN112115853A (en) Gesture recognition method and device, computer storage medium and electronic equipment
CN111931710B (en) Online handwritten character recognition method and device, electronic equipment and storage medium
US20160171293A1 (en) Gesture tracking and classification
CN106845384B (en) gesture recognition method based on recursive model
RU2014108820A (en) IMAGE PROCESSOR CONTAINING A SYSTEM FOR RECOGNITION OF GESTURES WITH FUNCTIONAL FEATURES FOR DETECTING AND TRACKING FINGERS
US20190332858A1 (en) Method and device for identifying wrist, method for identifying gesture, electronic equipment and computer-readable storage medium
Chang et al. Spatio-temporal hough forest for efficient detection–localisation–recognition of fingerwriting in egocentric camera
CN112114675B (en) Gesture control-based non-contact elevator keyboard using method
Panwar Hand gesture based interface for aiding visually impaired
CN110717385A (en) Dynamic gesture recognition method
CN110427909B (en) Mobile terminal driving license detection method and system, electronic equipment and storage medium
Chiang et al. Recognizing arbitrarily connected and superimposed handwritten numerals in intangible writing interfaces
US10262185B2 (en) Image processing method and image processing system
Xiong et al. Revisiting shortstraw: improving corner finding in sketch-based interfaces
Bai et al. Dynamic hand gesture recognition based on depth information
CN108255298B (en) Infrared gesture recognition method and device in projection interaction system
CN110737364B (en) Control method for touch writing acceleration under android system
Inoue et al. Depth sensor based automatic hand region extraction by using time-series curve and its application to Japanese finger-spelled sign language recognition
Teja et al. A ballistic stroke representation of online handwriting for recognition
CN113296616A (en) Pen point selection method and device and intelligent terminal
JPH0729002A (en) Handwritten graphic recognition device
EP0377129A2 (en) Fast spatial segmenter for handwritten characters
Choudhury et al. A Framework for Segmentation of Characters and Words from In-Air Handwritten Assamese Text
Inuganti et al. Recognition of online handwritten Telugu stroke by detected dominant points using curvature estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201222