CN109710066B - Interaction method and device based on gesture recognition, storage medium and electronic equipment - Google Patents

Interaction method and device based on gesture recognition, storage medium and electronic equipment

Info

Publication number
CN109710066B
Authority
CN
China
Prior art keywords
gesture
user
behavior
interaction
point position
Prior art date
Legal status
Active
Application number
CN201811555601.3A
Other languages
Chinese (zh)
Other versions
CN109710066A (en)
Inventor
邱柏宏
Current Assignee
Ping An Puhui Enterprise Management Co Ltd
Original Assignee
Ping An Puhui Enterprise Management Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Puhui Enterprise Management Co Ltd filed Critical Ping An Puhui Enterprise Management Co Ltd
Priority to CN201811555601.3A priority Critical patent/CN109710066B/en
Publication of CN109710066A publication Critical patent/CN109710066A/en
Application granted granted Critical
Publication of CN109710066B publication Critical patent/CN109710066B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the technical field of gesture recognition, and in particular discloses an interaction method and device based on gesture recognition, a storage medium, and an electronic device. The interaction method based on gesture recognition includes the following steps: acquiring a gesture behavior of a user; inputting the gesture behavior into a mapping model to obtain an interaction instruction corresponding to the gesture behavior, wherein the mapping model is trained on gesture behavior samples and corresponding interaction instruction samples; and executing a corresponding interactive operation in an interactive interface of a terminal device according to the interaction instruction. With the method and device, human-computer interaction can be achieved from the captured gesture behavior of the user alone, freeing the user's hands and improving the user experience.

Description

Interaction method and device based on gesture recognition, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of gesture recognition technologies, and in particular, to an interaction method based on gesture recognition, an interaction apparatus based on gesture recognition, a storage medium, and an electronic device.
Background
With the rapid development of modern science and technology, human-computer interaction has been widely applied in many fields, and how to enable computers to read human information and achieve better cooperation between humans and machines has attracted increasing attention.
In the prior art, interaction between most computers and humans is realized through a mouse or a keyboard. However, as people increasingly pursue convenience and portability, interaction through a mouse or a keyboard is no longer convenient. People have therefore begun to seek interaction directly through hand swipes on smart devices, but this still requires the user to touch the device, which limits interaction efficiency and in turn degrades the user's interaction experience.
Therefore, there is a need in the art to provide a new interaction method based on gesture recognition.
It is to be noted that the information disclosed in the background section above is only for enhancement of understanding of the background of the present disclosure, and therefore may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an interaction method and device based on gesture recognition, a storage medium, and an electronic device, so as to overcome, at least to some extent, the poor human-computer interaction experience and low interaction efficiency caused by the limitation that human-computer interaction can only be achieved through contact with a machine. To achieve this technical effect, the present disclosure adopts the following technical solutions.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to one aspect of the present disclosure, there is provided an interaction method based on gesture recognition, including: acquiring a gesture behavior of a user; inputting the gesture behavior into a mapping model to obtain an interaction instruction corresponding to the gesture behavior, wherein the mapping model is trained on gesture behavior samples and corresponding interaction instruction samples; and executing a corresponding interactive operation in an interactive interface of a terminal device according to the interaction instruction.
In an exemplary embodiment of the present disclosure, the acquiring a gesture behavior of a user includes: acquiring a gesture image of a user; and extracting the features of the gesture image to obtain gesture feature information, and determining the gesture behavior of the user according to the gesture feature information.
In an exemplary embodiment of the present disclosure, the gesture image includes a first gesture image and a second gesture image spaced from the first gesture image by a preset time; the feature extraction of the gesture image to obtain gesture feature information, and determining the gesture behavior of the user according to the gesture feature information includes: respectively extracting features of the first gesture image and the second gesture image to obtain first feature information and second feature information; and determining the gesture behavior of the user according to the first characteristic information and the second characteristic information.
In an exemplary embodiment of the present disclosure, the gesture feature information includes a gesture reference point and a position coordinate corresponding to the gesture reference point; the determining the gesture behavior of the user according to the first characteristic information and the second characteristic information includes: acquiring position coordinates corresponding to the gesture reference points in the first characteristic information as initial point position coordinates; taking the position coordinate corresponding to the gesture reference point in the second characteristic information as an end point position coordinate, and obtaining the deviation degree of the end point position coordinate and the initial point position coordinate; and acquiring the gesture behavior of the user according to the deviation degree.
In an exemplary embodiment of the disclosure, the acquiring the gesture behavior of the user according to the deviation degree includes: matching the deviation degree with a preset deviation degree interval to obtain a target deviation degree interval; and determining the gesture behavior corresponding to the target deviation degree interval as the gesture behavior of the user.
In an exemplary embodiment of the present disclosure, the method further comprises: collecting a plurality of gesture image samples, performing feature extraction on the gesture image samples to obtain gesture feature information, and determining gesture behavior samples of the user according to the gesture feature information; acquiring an interaction instruction sample corresponding to the gesture behavior sample, and storing the interaction instruction sample and the corresponding gesture behavior sample in a database to form a gesture behavior database; training the machine learning model based on the gesture behavior database to generate the mapping model.
In an exemplary embodiment of the present disclosure, the gesture behavior includes a leftward swipe, a rightward swipe, an upward swipe, a downward swipe, a double click, and a single click.
According to an aspect of the present disclosure, there is provided an interaction device based on gesture recognition, including: a gesture behavior acquisition module, configured to acquire a gesture behavior of a user; an interaction instruction generation module, configured to input the gesture behavior into a mapping model to obtain an interaction instruction corresponding to the gesture behavior, wherein the mapping model is trained on gesture behavior samples and corresponding interaction instruction samples; and an operation execution module, configured to execute a corresponding interactive operation in an interactive interface of a terminal device according to the interaction instruction.
According to an aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the gesture recognition based interaction method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any one of the above-described gesture recognition based interaction methods via execution of the executable instructions.
In the interaction method based on gesture recognition in the exemplary embodiments of the present disclosure, a corresponding interaction instruction can be obtained by inputting the acquired gesture behavior of the user into a mapping model, and the corresponding interactive operation is then performed in an interactive interface. On the one hand, the corresponding interactive operation can be executed in the interactive interface of the terminal device merely by acquiring the gesture behavior of the user, which reduces direct contact between the user and the interactive interface and frees the user's hands; at the same time, as a novel hands-free interaction mode, it makes human-computer interaction more engaging, improves the user experience, and thereby attracts more users to participate in human-computer interaction. On the other hand, the mapping model is trained on multiple gesture behavior samples and corresponding interaction instruction samples, so the interaction instructions obtained from the mapping model are reliable, misoperations caused by manual operation are avoided, and the accuracy of human-computer interaction is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 schematically illustrates a flow diagram of a gesture recognition based interaction method according to an embodiment of the present disclosure;
FIG. 2 schematically shows a schematic diagram of obtaining a user gesture reference point according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart for determining a gesture behavior of a user from first feature information and second feature information according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow diagram for establishing a mapping model of correspondence between gesture behaviors and interaction instructions, according to an embodiment of the present disclosure;
FIG. 5 schematically shows a structural diagram of an interaction device based on gesture recognition according to an embodiment of the present disclosure;
FIG. 6 schematically shows a schematic view of a storage medium according to an embodiment of the present disclosure; and
FIG. 7 schematically shows a block diagram of an electronic device according to an embodiment of the present disclosure.
In the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. The exemplary embodiments, however, may be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of exemplary embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their detailed description will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the related art in the field, there are two main ways of human-computer interaction: one is contact-type human-computer interaction, such as interaction through a mouse, a keyboard, and the like; the other type is non-contact man-machine interaction, which is mainly realized by voice recognition.
Accordingly, the following drawbacks exist in the way of human-computer interaction in the related art:
For contact-based human-computer interaction, the user has to slide a hand over an interactive interface to interact with the machine, so the user's hands are hard to free, which degrades the interaction experience; at the same time, errors while the hand slides can cause misoperations, affecting interaction efficiency and accuracy. The non-contact voice recognition mode of interaction is easily disturbed by surrounding environmental sounds, which reduces recognition accuracy and further impairs the user's interaction experience.
Based on this, in the present exemplary embodiment, an interaction method based on gesture recognition is first provided. Referring to fig. 1, the interaction method based on gesture recognition includes the following steps:
step S110: acquiring gesture behaviors of a user;
step S120: inputting the gesture behavior into a mapping model to obtain an interaction instruction corresponding to the gesture behavior, wherein the mapping model is trained on gesture behavior samples and corresponding interaction instruction samples;
step S130: and executing corresponding interactive operation in an interactive interface of the terminal equipment according to the interactive instruction.
According to the interaction method based on gesture recognition in this exemplary embodiment, on the one hand, the corresponding interactive operation can be executed in the interactive interface of the terminal device merely by acquiring the gesture behavior of the user, which reduces direct contact between the user and the interactive interface and frees the user's hands; at the same time, as a novel hands-free interaction mode, it makes human-computer interaction more engaging, improves the user experience, and thereby attracts more users to participate in human-computer interaction. On the other hand, the mapping model is trained on multiple gesture behavior samples and corresponding interaction instruction samples, so the interaction instructions obtained from the mapping model are reliable, misoperations caused by manual operation are avoided, and the accuracy of human-computer interaction is improved.
Terminal devices capable of realizing human-computer interaction include desktop computers, notebook computers, tablet computers, smart phones, and other terminal devices with an interactive interface. The interaction method based on gesture recognition as applied to a terminal device with an interactive interface is further described below.
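By way of orientation, the overall flow of steps S110 to S130 may be sketched as follows. This is an illustrative outline only: `camera`, `mapping_model`, `interface` and `extract_reference_point` are hypothetical placeholders rather than elements defined in the original disclosure, and possible implementations of the individual steps are sketched in the sections below.

```python
import time

# End-to-end sketch of steps S110-S130 (all objects and helpers are hypothetical).
def interact_once(camera, mapping_model, interface, preset_time_s=0.5):
    first_image = camera.capture()                   # first gesture image
    time.sleep(preset_time_s)                        # wait the preset time
    second_image = camera.capture()                  # second gesture image

    start = extract_reference_point(first_image)     # step S110: gesture behavior
    end = extract_reference_point(second_image)
    axis, deviation = deviation_degree(start, end)   # deviation-degree matching
    gesture = match_gesture(axis, deviation)

    instruction = mapping_model.predict(gesture)     # step S120: mapping model
    interface.execute(instruction)                   # step S130: interactive operation
```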
In step S110, gesture behaviors of the user are acquired.
In this exemplary embodiment, a gesture image of the user is first obtained. The gesture image may be an image of the user, including the gesture, acquired by a camera; the camera may be an internal camera built into the terminal device with the interactive interface, or an external camera connected to it. Feature extraction is then performed on the obtained gesture image to obtain gesture feature information, and the gesture behavior of the user is determined from the gesture feature information. The feature extraction may specifically use a skin color segmentation method, an optical flow method, a filtering algorithm, or the like. Taking skin color segmentation as an example: before extracting features from the skin region of the image, the image can be given necessary preprocessing such as noise removal, edge sharpening, and binarization to obtain a better result, and the gesture feature information is then obtained from the binary image using a density distribution feature method.
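By way of illustration, skin color segmentation followed by denoising and binarization, as described above, might be implemented along the following lines; the OpenCV calls and HSV threshold values are assumptions chosen for this sketch and are not prescribed by the disclosure.

```python
import cv2
import numpy as np

def extract_hand_mask(bgr_image):
    """Rough skin-color segmentation followed by denoising and binarization.

    The HSV thresholds below are illustrative assumptions and would need
    tuning for real lighting conditions and skin tones.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower, upper = np.array([0, 30, 60]), np.array([20, 150, 255])
    mask = cv2.inRange(hsv, lower, upper)                          # keep skin-colored pixels
    mask = cv2.GaussianBlur(mask, (5, 5), 0)                       # remove noise
    _, binary = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)   # binary image
    return binary
```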
In this exemplary embodiment, the gesture image of the user includes a first gesture image and a second gesture image captured a preset time after the first gesture image, where the preset time is the time the user takes to complete one interactive action; that is, the second gesture image is the image of the user at the moment the interactive action is completed. When feature extraction is performed on the gesture images, the first gesture image and the second gesture image are processed separately to obtain first feature information corresponding to the first gesture image and second feature information corresponding to the second gesture image, and the gesture behavior of the user is finally determined from the first feature information and the second feature information.
Further, the gesture feature information of the user includes a gesture reference point and the position coordinates corresponding to the gesture reference point, where the gesture reference point may be the fingertip of the longest extended finger. Fig. 2 illustrates the acquisition of the gesture reference point. As shown in Fig. 2A, if the acquired gesture image contains only one non-curved finger, the fingertip of that finger is used as the gesture reference point; referring to Fig. 2B, if the acquired gesture image contains at least two non-curved fingers, the fingertip of the longest of those fingers (the middle finger in the drawing) is used as the gesture reference point. It should be noted that the gesture state of the user in the captured image can vary (for example, Figs. 2A and 2B), but in every case only the fingertip of the longest extended finger needs to be acquired as the gesture reference point. The gesture reference point may also be obtained in other ways, for example by using the palm of the user in the gesture image as the reference point. The present disclosure includes, but is not limited to, the above methods of obtaining the gesture reference point.
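As an illustrative sketch only, the gesture reference point could be approximated from the binary hand mask as the contour point farthest from the hand's centroid, which roughly corresponds to the tip of the longest extended finger; the disclosure does not prescribe this particular algorithm, and an image alone yields only two coordinates (a depth sensor or a similar cue would be needed for the axis pointing toward the user).

```python
import cv2
import numpy as np

def find_gesture_reference_point(binary_mask):
    """Approximate the gesture reference point as the contour point farthest
    from the hand centroid (a stand-in for 'fingertip of the longest extended
    finger'); returns image-plane coordinates only."""
    # OpenCV >= 4 returns (contours, hierarchy)
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)        # largest blob = hand
    moments = cv2.moments(hand)
    if moments["m00"] == 0:
        return None
    cx, cy = moments["m10"] / moments["m00"], moments["m01"] / moments["m00"]
    points = hand.reshape(-1, 2).astype(float)
    distances = np.hypot(points[:, 0] - cx, points[:, 1] - cy)
    tip_x, tip_y = points[int(np.argmax(distances))]
    return (tip_x, tip_y)
```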
Further, fig. 3 shows a flowchart for determining the gesture behavior of the user according to the first characteristic information and the second characteristic information, and as shown in fig. 3, determining the gesture behavior of the user specifically includes the following steps:
s310: and acquiring position coordinates corresponding to the gesture reference points in the first characteristic information as initial point position coordinates.
In the present exemplary embodiment, the position coordinate is a point in a spatial rectangular coordinate system including an X axis, a Y axis, and a Z axis, where the X axis is a coordinate axis directed to the user, the Y axis is a horizontal axis, and the Z axis is a coordinate axis perpendicular to a horizontal plane. The starting point position coordinate is a starting point corresponding to one gesture behavior of the user and serves as the start of the gesture behavior.
S320: and acquiring position coordinates corresponding to the gesture reference points in the second characteristic information as end point position coordinates, and acquiring the deviation degree of the end point position coordinates and the initial point position coordinates.
In this exemplary embodiment, the end point position coordinate is the end point of one gesture behavior of the user and marks the end of that gesture behavior. The deviation degree of the end point coordinate of the user's gesture reference point relative to the start point coordinate is obtained in order to determine the gesture behavior of the user. The deviation degree is derived from the difference between the end point coordinate and the start point coordinate of the gesture reference point; since the position coordinate of the gesture reference point has three components (the X, Y, and Z directions), the component with the largest absolute value is taken as the deviation degree. For example, if the end point coordinate of the gesture reference point is (2cm, 10cm, 3cm) and the start point coordinate is (1cm, 3cm, 5cm), the deviations in the three directions are 1cm, 7cm, and -2cm respectively, so 7cm is taken as the deviation degree of the end point coordinate relative to the start point coordinate. It should be noted that the deviation degree may also be calculated in other ways; for example, the difference between the end point coordinate and the start point coordinate of the gesture reference point may first be obtained, and this difference may then be divided by the start point coordinate to obtain the deviation degree.
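The deviation degree computation described above, including the worked example, can be expressed compactly as follows; this is a minimal sketch, and the axis names and tuple convention are assumptions.

```python
def deviation_degree(start_point, end_point):
    """Return (axis, value): the per-axis difference (end - start) whose
    absolute value is largest, i.e. the deviation degree described above."""
    axes = ("X", "Y", "Z")
    diffs = [e - s for e, s in zip(end_point, start_point)]
    i = max(range(len(diffs)), key=lambda k: abs(diffs[k]))
    return axes[i], diffs[i]

# Worked example from the text: start (1, 3, 5) cm, end (2, 10, 3) cm
# -> per-axis deviations (1, 7, -2) -> deviation degree 7 cm along Y.
print(deviation_degree((1, 3, 5), (2, 10, 3)))   # ('Y', 7)
```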
S330: and acquiring the gesture behavior of the user according to the deviation degree.
In this exemplary embodiment, the deviation degree is matched against preset deviation degree intervals to obtain a target deviation degree interval, and the gesture behavior corresponding to the target deviation degree interval is determined to be the gesture behavior of the user. Specifically, Table 1 shows the correspondence between different preset deviation degree intervals and gesture behaviors. Referring to Table 1, when the deviation degree between the end point position coordinate and the start point position coordinate of the user's gesture reference point is 7cm (in the Y-axis direction), it falls within the preset deviation interval (0, +∞), so (0, +∞) is the target deviation interval, and the gesture behavior corresponding to that interval, a rightward slide, is determined to be the user's gesture behavior. When the deviation degree in the X-axis direction is taken as the deviation degree between the end point and start point position coordinates, the user's finger tends to move once in the positive X direction during a double-click performed within the preset time between capturing the first and second gesture images; accordingly, the X-direction deviation degree corresponding to a double-click is greater than or equal to -2cm, while the deviation degree corresponding to a single click is less than -2cm.
TABLE 1
(Table 1, showing the correspondence between preset deviation degree intervals and gesture behaviors, is provided as an image in the original publication.)
It should be noted that, in this exemplary embodiment, the gesture behaviors of the user include leftward sliding, rightward sliding, upward sliding, downward sliding, double-clicking, and single-clicking. These are only simple examples; in practice the gesture behaviors may include many other kinds, such as spreading two fingers apart, but all of them are obtained from the deviation degree between the end point position coordinate and the start point position coordinate of the user's gesture reference point. The present disclosure includes, but is not limited to, the gesture behaviors listed above.
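The interval matching described with reference to Table 1 might look like the following sketch; only the rightward-slide interval (0, +∞) on the Y axis and the -2cm click threshold on the X axis are stated in the text above, so the remaining intervals are illustrative assumptions mirroring the structure of Table 1.

```python
def match_gesture(axis, deviation_cm):
    """Interval matching of a deviation degree to a gesture behavior."""
    if axis == "Y":                                   # horizontal axis
        return "swipe_right" if deviation_cm > 0 else "swipe_left"
    if axis == "Z":                                   # axis perpendicular to the horizontal plane
        return "swipe_up" if deviation_cm > 0 else "swipe_down"
    # X axis (pointing toward the user) distinguishes the click gestures
    return "double_click" if deviation_cm >= -2 else "single_click"

print(match_gesture("Y", 7))   # swipe_right, matching the worked example above
```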
In step S120, the gesture behavior is input to a mapping model to obtain an interaction instruction corresponding to the gesture behavior, where the mapping model is formed based on a gesture behavior sample and a corresponding interaction instruction sample.
In this exemplary embodiment, before the gesture behavior of the user is input into a mapping model, a mapping model representing the correspondence between gesture behaviors and interaction instructions is first established. Fig. 4 shows a flow chart of establishing this mapping model; the establishment process includes the following steps:
s410: collecting a plurality of gesture image samples, performing feature extraction on the gesture image samples to obtain gesture feature information, and determining gesture behavior samples of the user according to the gesture feature information.
In this exemplary embodiment, a variety of gesture image samples of a user may be collected through a camera built in a terminal device with an interactive interface or an external camera connected to the terminal device, gesture feature information is obtained based on the feature extraction method, and with continued reference to the method shown in fig. 3, a gesture behavior of the user is determined according to the gesture feature information to form a gesture behavior sample.
S420: and acquiring an interactive instruction sample corresponding to the gesture behavior sample, and storing the interactive instruction sample and the corresponding gesture behavior sample in a database to form a gesture behavior database.
In this exemplary embodiment, a corresponding interaction instruction is set for each gesture behavior, where the interaction instruction is the concrete operation triggered by the gesture behavior. Specifically, the interaction instruction corresponding to a rightward slide is turning the web page to the right, so that the user can browse a new page; the interaction instruction corresponding to a single click is a confirmation, for example confirming that the content in a certain link should be opened, and so on. The gesture behavior samples and the corresponding interaction instruction samples are then stored in a database to form a gesture behavior database.
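A minimal sketch of forming the gesture behavior database might pair each gesture behavior sample with an interaction instruction sample and persist the pairs, for example as follows; the instruction names and the use of SQLite are assumptions for illustration.

```python
import sqlite3

# Illustrative pairing of gesture behavior samples with interaction
# instruction samples (the instruction names are assumptions).
SAMPLES = [
    ("swipe_right",  "page_right"),   # turn the web page to the right
    ("swipe_left",   "page_left"),
    ("swipe_up",     "scroll_up"),
    ("swipe_down",   "scroll_down"),
    ("single_click", "confirm"),      # e.g. confirm opening a link
    ("double_click", "open"),
]

conn = sqlite3.connect("gesture_behavior.db")
conn.execute("CREATE TABLE IF NOT EXISTS samples (gesture TEXT, instruction TEXT)")
conn.executemany("INSERT INTO samples VALUES (?, ?)", SAMPLES)
conn.commit()
```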
S430: training the machine learning model based on the gesture behavior database to generate the mapping model.
In this exemplary embodiment, the mapping model may be a convolutional neural network or a deep residual network; those skilled in the art may use a suitable machine learning model as needed, which is not specifically limited by the present disclosure. During training, training parameters such as the learning rate, the number of training iterations, the loss function, and the optimization objective are set, and the machine learning model is trained on the gesture behavior samples in the gesture behavior database and their corresponding interaction instruction samples, thereby obtaining a mapping model between gesture behaviors and interaction instructions.
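A highly simplified training sketch is shown below; it uses a small fully connected network over deviation-degree features with assumed hyperparameters and stand-in data, whereas the disclosure itself mentions convolutional or deep residual networks and leaves the concrete model to the practitioner.

```python
import torch
from torch import nn

# Minimal training sketch (network shape and hyperparameters are assumptions).
# Each sample is a 3-D deviation vector (X, Y, Z); labels index the instruction
# classes stored in the gesture behavior database.
model = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 6))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # learning rate
loss_fn = nn.CrossEntropyLoss()                             # loss function

features = torch.randn(600, 3)           # stand-in for real gesture samples
labels = torch.randint(0, 6, (600,))     # stand-in for instruction labels

for epoch in range(50):                  # number of training iterations
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimizer.step()
```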
In step S130, corresponding interactive operation is executed in the interactive interface of the terminal device according to the interactive instruction.
In this embodiment, after the interaction instruction corresponding to the user's gesture image is obtained, the interactive interface on the user's terminal device performs the corresponding interactive operation. For example, if the gesture behavior is a rightward slide, the interactive interface executes the instruction to slide the web page to the right. Human-computer interaction is thus realized simply by capturing gesture images of the user, direct contact between the user's hand and the terminal device is avoided, the user's hands are freed, and the user experience is improved.
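Dispatching an interaction instruction to a concrete operation in the interactive interface might be sketched as follows; the handler functions are hypothetical placeholders for whatever the terminal device's interface actually exposes.

```python
# Hypothetical handlers standing in for real UI operations of the terminal device.
def page_right():
    print("web page turned to the right")

def confirm():
    print("current link confirmed/opened")

INSTRUCTION_HANDLERS = {"page_right": page_right, "confirm": confirm}

def execute_instruction(instruction):
    handler = INSTRUCTION_HANDLERS.get(instruction)
    if handler is not None:
        handler()   # perform the corresponding interactive operation
```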
In conclusion, by acquiring the gesture behavior of the user, the corresponding interactive operation can be executed in the interactive interface of the terminal device, which reduces direct contact between the user and the interactive interface and frees the user's hands; at the same time, as a novel hands-free interaction mode, it makes human-computer interaction more engaging, improves the user experience, and thereby attracts more users to participate in human-computer interaction. In addition, the mapping model is trained on multiple gesture behavior samples and corresponding interaction instruction samples, so the interaction instructions obtained from the mapping model are reliable, misoperations caused by manual operation are avoided, and the accuracy of human-computer interaction is improved.
In addition, an interaction device based on gesture recognition is also provided in this exemplary embodiment. Referring to fig. 5, the gesture recognition based interaction apparatus 500 may include: a gesture behavior acquisition module 510, an interaction instruction generation module 520, and an operation execution module 530. Specifically:
a gesture behavior obtaining module 510, configured to obtain a gesture behavior of a user;
the interactive instruction generating module 520 is configured to input the gesture behavior into a mapping model to obtain an interactive instruction corresponding to the gesture behavior, where the mapping model is formed based on a gesture behavior sample and a corresponding interactive instruction sample through training;
and the operation executing module 530 is configured to execute corresponding interactive operations in the interactive interface of the terminal device according to the interactive instruction.
Since each functional module of the interaction device based on gesture recognition in this embodiment of the present disclosure is the same as in the above embodiment of the interaction method based on gesture recognition, further description is omitted here.
Further, in an exemplary embodiment of the present disclosure, there is also provided a computer storage medium capable of implementing the above method, on which a program product capable of implementing the above method of this specification is stored. In some possible embodiments, aspects of the present disclosure may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to perform the steps according to the various exemplary embodiments of the present disclosure described in the "exemplary methods" section above of this specification.
Referring to fig. 6, a program product 600 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided. As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
An electronic device 700 according to such an embodiment of the present disclosure is described below with reference to fig. 7. The electronic device 700 shown in fig. 7 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, electronic device 700 is embodied in the form of a general purpose computing device. The components of the electronic device 700 may include, but are not limited to: the at least one processing unit 710, the at least one memory unit 720, a bus 730 connecting different system components (including the memory unit 720 and the processing unit 710), and a display unit 740.
Wherein the storage unit stores program code that is executable by the processing unit 710 to cause the processing unit 710 to perform steps according to various exemplary embodiments of the present disclosure as described in the above section "exemplary methods" of this specification.
The storage unit 720 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)7201 and/or a cache memory unit 7202, and may further include a read only memory unit (ROM) 7203.
The storage unit 720 may also include a program/utility 7204 having a set (at least one) of program modules 7205, such program modules 7205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 730 may be any representation of one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 700 may also communicate with one or more external devices 800 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 700, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 700 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 750. Also, the electronic device 700 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 760. As shown, the network adapter 760 communicates with the other modules of the electronic device 700 via the bus 730. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (7)

1. An interaction method based on gesture recognition is characterized by comprising the following steps:
collecting a first gesture image of a user and a second gesture image captured a preset time after the first gesture image, respectively performing feature extraction on the first gesture image and the second gesture image to obtain first feature information and second feature information, obtaining a position coordinate corresponding to a gesture reference point in the first feature information as a start point position coordinate, obtaining a position coordinate corresponding to a gesture reference point in the second feature information as an end point position coordinate, and determining a gesture behavior corresponding to the user according to a matching result between a deviation degree of the end point position coordinate relative to the start point position coordinate and a preset deviation interval, wherein the fingertip of the longest finger in the gesture image is taken as the gesture reference point, and the deviation degree is the deviation degree corresponding to the maximum of a plurality of deviation degree values determined from the end point position coordinates and the corresponding start point position coordinates in different directions;
inputting the gesture behavior into a mapping model to obtain an interaction instruction corresponding to the gesture behavior, wherein the mapping model is trained on gesture behavior samples and corresponding interaction instruction samples;
and executing corresponding interactive operation in an interactive interface of the terminal equipment according to the interactive instruction.
2. The interaction method based on gesture recognition according to claim 1, wherein determining the gesture behavior corresponding to the user according to the matching result between the deviation degree of the end point position coordinate relative to the start point position coordinate and a preset deviation interval comprises:
matching the deviation degree with a preset deviation degree interval to obtain a target deviation degree interval;
and determining the gesture behavior corresponding to the target deviation degree interval as the gesture behavior of the user.
3. The gesture recognition based interaction method according to claim 1, further comprising:
collecting a plurality of gesture image samples, performing feature extraction on the gesture image samples to obtain gesture feature information, and determining gesture behavior samples of the user according to the gesture feature information;
acquiring an interaction instruction sample corresponding to the gesture behavior sample, and storing the interaction instruction sample and the corresponding gesture behavior sample in a database to form a gesture behavior database;
training a machine learning model based on the gesture behavior database to generate the mapping model.
4. The gesture recognition based interaction method according to any one of claims 1 to 3, wherein the gesture behavior comprises a leftward swipe, a rightward swipe, an upward swipe, a downward swipe, a double click, and a single click.
5. An interaction device based on gesture recognition, the device comprising:
the gesture behavior acquisition module is used for acquiring a first gesture image of a user and a second gesture image captured a preset time after the first gesture image; respectively performing feature extraction on the first gesture image and the second gesture image to obtain first feature information and second feature information; obtaining a position coordinate corresponding to a gesture reference point in the first feature information as a start point position coordinate; obtaining a position coordinate corresponding to a gesture reference point in the second feature information as an end point position coordinate; and determining a gesture behavior corresponding to the user according to a matching result between a deviation degree of the end point position coordinate relative to the start point position coordinate and a preset deviation interval, wherein the fingertip of the longest finger in the gesture image is used as the gesture reference point, and the deviation degree is the deviation degree corresponding to the maximum of a plurality of deviation degree values determined from the end point position coordinates and the corresponding start point position coordinates in different directions;
the interaction instruction generation module is used for inputting the gesture behavior into a mapping model to obtain an interaction instruction corresponding to the gesture behavior, wherein the mapping model is trained on gesture behavior samples and corresponding interaction instruction samples;
and the operation execution module is used for executing corresponding interactive operation in an interactive interface of the terminal equipment according to the interactive instruction.
6. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements a gesture recognition based interaction method according to any one of claims 1 to 4.
7. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the gesture recognition based interaction method of any one of claims 1-4 via execution of the executable instructions.
CN201811555601.3A 2018-12-19 2018-12-19 Interaction method and device based on gesture recognition, storage medium and electronic equipment Active CN109710066B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811555601.3A CN109710066B (en) 2018-12-19 2018-12-19 Interaction method and device based on gesture recognition, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811555601.3A CN109710066B (en) 2018-12-19 2018-12-19 Interaction method and device based on gesture recognition, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109710066A CN109710066A (en) 2019-05-03
CN109710066B true CN109710066B (en) 2022-03-25

Family

ID=66255989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811555601.3A Active CN109710066B (en) 2018-12-19 2018-12-19 Interaction method and device based on gesture recognition, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109710066B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110806804A (en) * 2019-11-01 2020-02-18 大众问问(北京)信息科技有限公司 Audio control method and device, computer equipment and storage medium
CN111309153B (en) * 2020-03-25 2024-04-09 北京百度网讯科技有限公司 Man-machine interaction control method and device, electronic equipment and storage medium
CN112034981A (en) * 2020-08-20 2020-12-04 深圳创维-Rgb电子有限公司 Display terminal control method, display terminal, and computer-readable storage medium
CN112351325B (en) * 2020-11-06 2023-07-25 惠州视维新技术有限公司 Gesture-based display terminal control method, terminal and readable storage medium
CN113076004B (en) * 2021-04-12 2024-05-03 北京隐虚等贤科技有限公司 Method and device for dynamically evaluating user data based on immersion type equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101374720B1 (en) * 2013-07-15 2014-03-17 전자부품연구원 Virtual mouse control apparatus based on hand gesture and method thereof
CN103677417A (en) * 2013-12-12 2014-03-26 小米科技有限责任公司 Gesture detection method and device and terminal device
CN106503626A (en) * 2016-09-29 2017-03-15 南京信息工程大学 Being mated with finger contours based on depth image and refer to gesture identification method
CN106954033A (en) * 2016-01-06 2017-07-14 中兴通讯股份有限公司 Projector equipment and its control method
CN109032358A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 The control method and device of AR interaction dummy model based on gesture identification

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982557B (en) * 2012-11-06 2015-03-25 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
CN104536571B (en) * 2014-12-26 2018-02-23 深圳市冠旭电子股份有限公司 The method of controlling operation thereof and device of earphone
CN105487673B (en) * 2016-01-04 2018-01-09 京东方科技集团股份有限公司 A kind of man-machine interactive system, method and device
CN106200971A (en) * 2016-07-07 2016-12-07 广东技术师范学院 Man-machine interactive system device based on gesture identification and operational approach
CN106569596A (en) * 2016-10-20 2017-04-19 努比亚技术有限公司 Gesture control method and equipment
CN108509026B (en) * 2018-02-06 2020-04-14 西安电子科技大学 Remote maintenance support system and method based on enhanced interaction mode
CN108595095B (en) * 2018-04-13 2022-07-15 百度在线网络技术(北京)有限公司 Method and device for simulating movement locus of target body based on gesture control

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101374720B1 (en) * 2013-07-15 2014-03-17 전자부품연구원 Virtual mouse control apparatus based on hand gesture and method thereof
CN103677417A (en) * 2013-12-12 2014-03-26 小米科技有限责任公司 Gesture detection method and device and terminal device
CN106954033A (en) * 2016-01-06 2017-07-14 中兴通讯股份有限公司 Projector equipment and its control method
CN106503626A (en) * 2016-09-29 2017-03-15 南京信息工程大学 Being mated with finger contours based on depth image and refer to gesture identification method
CN109032358A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 The control method and device of AR interaction dummy model based on gesture identification

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Hand Gesture Recognition Using Statistical Analysis of Curvelet Coefficients;Palvi Singh等;《2013 International Conference on Machine Intelligence and Research Advancement》;20141009;第435-439页 *
基于OpenCV的摄像头动态手势轨迹识别及其应用 (Camera-based dynamic gesture trajectory recognition using OpenCV and its application); Jiang Chao et al.; 《计算机应用》 (Journal of Computer Applications); 2012-07-10; pp. 134-139 *

Also Published As

Publication number Publication date
CN109710066A (en) 2019-05-03

Similar Documents

Publication Publication Date Title
CN109710066B (en) Interaction method and device based on gesture recognition, storage medium and electronic equipment
KR102460737B1 (en) Method, apparatus, apparatus and computer readable storage medium for public handwriting recognition
US9911052B2 (en) System and method for superimposed handwriting recognition technology
US20180188938A1 (en) Multi-Task Machine Learning for Predicted Touch Interpretations
JP6807840B2 (en) Systems and methods for recognizing geometry
CN109865285B (en) Information processing method and device in game and computer storage medium
Blanco‐Gonzalo et al. Performance evaluation of handwritten signature recognition in mobile environments
Speicher et al. Gesturewiz: A human-powered gesture design environment for user interface prototypes
CN108700994A (en) System and method for digital ink interactivity
CN104303145A (en) Translation of touch input into local input based on a translation profile for an application
JP2015041317A (en) Method for building model for estimating level of skill of user for operating electronic devices, method for estimating level of skill of user, method for supporting the user according to the level of skill of the user, and computers and computer programs therefor
CN110850982B (en) AR-based man-machine interaction learning method, system, equipment and storage medium
US9395911B2 (en) Computer input using hand drawn symbols
CN108196675B (en) Interaction method and device for touch terminal and touch terminal
US20170131785A1 (en) Method and apparatus for providing interface interacting with user by means of nui device
Ye et al. Gestimator: Shape and stroke similarity based gesture recognition
US20160379033A1 (en) Interaction method and apparatus
Bufano et al. PolyRec Gesture Design Tool: A tool for fast prototyping of gesture‐based mobile applications
CN114092608B (en) Expression processing method and device, computer readable storage medium and electronic equipment
CN110908568A (en) Control method and device for virtual object
CN104484078A (en) Man-machine interactive system and method based on radio frequency identification
CN111078336B (en) Display method, device, equipment and storage medium
CN108563335B (en) Virtual reality interaction method and device, storage medium and electronic equipment
CN110908581B (en) Gesture recognition method and device, computer storage medium and electronic equipment
Lee et al. Vision-based fingertip-writing character recognition

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
CB02 Change of applicant information

Address after: 201, room 518000, building A, No. 1, front Bay Road, Qianhai Shenzhen Guangdong Shenzhen Hong Kong cooperation zone (Qianhai business secretary)

Applicant after: Pingan Pu Hui Enterprise Management Co., Ltd.

Address before: 518000 Guangdong city of Shenzhen province Qianhai Shenzhen Hong Kong cooperation zone before Bay Road No. 1 building 201 room A

Applicant before: Pingan Pu Hui Enterprise Management Co., Ltd.

CB02 Change of applicant information
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant