WO2017113407A1 - Gesture recognition method and apparatus, and electronic device - Google Patents

Gesture recognition method and apparatus, and electronic device

Info

Publication number
WO2017113407A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
gesture
response
module
biometric information
Prior art date
Application number
PCT/CN2015/100337
Other languages
English (en)
French (fr)
Inventor
龚树强
和永全
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2015/100337 priority Critical patent/WO2017113407A1/zh
Priority to CN201580066876.9A priority patent/CN107533599B/zh
Publication of WO2017113407A1 publication Critical patent/WO2017113407A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • the present invention relates to the field of human-computer interaction technologies, and in particular, to a gesture recognition method, device, and electronic device.
  • Multi-mode human-machine interface technology is the core technology for solving problems such as the high intelligence and high availability of computing devices.
  • Multi-mode human-machine interface technology can establish a harmonious and natural human-computer interaction environment, enabling users to operate computers conveniently and naturally, in ways familiar to humans. The most important part of this is enabling the computer to accurately perceive different human expressions, including natural language, gesture language, and facial expressions, to achieve anthropomorphic human-computer interaction; the detection and recognition of gestures is an important research topic in the fields of human-computer interaction and pattern recognition.
  • the present application discloses a gesture recognition method, device and electronic device, which can reduce the false trigger rate of a gesture operation.
  • the present application discloses a gesture recognition method, in which a user's gesture and biometric information of the user's hand can be acquired through an image sensor module, and the user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed, thereby reducing the false trigger rate of gesture operations.
  • Further, a preset operation permission range of the user may be acquired, and it may be determined whether there is a response corresponding to the user's gesture within the operation permission range; only responses matching the user's operation permission range are executed, thereby preventing unauthorized operations.
  • Further, a preset gesture operation priority of the user may be acquired, and the response may be executed according to the gesture operation priority, so that responses corresponding to the gestures of multiple users can be executed in an orderly manner.
  • biometric information may include one or more of palm print information, size information, and skin color information.
  • the present application also discloses a gesture recognition apparatus, including an acquisition module, a verification module, an identification module, a determination module, and an execution module, wherein:
  • the acquiring module acquires the user's gesture and the biometric information of the user's hand through the image sensor module; the verification module authenticates the user according to the biometric information; the identification module recognizes the user's gesture when the verification module's authentication of the user succeeds; the determining module determines whether there is a response corresponding to the user's gesture; and the executing module executes the response when the determining module determines that such a response exists, thereby reducing the false trigger rate of gesture operations.
  • Further, the determining module may first acquire a preset operation permission range of the user, and then determine whether there is a response corresponding to the user's gesture within the operation permission range, so that only responses matching the user's operation permission range are executed, which can prevent unauthorized operations.
  • Further, the executing module may also acquire a preset gesture operation priority of the user and execute the response according to the gesture operation priority, so that responses corresponding to the gestures of multiple users can be executed in an orderly manner.
  • the application also discloses an electronic device comprising an image sensor module and a processor, wherein:
  • the image sensor module captures a gesture image.
  • the processor acquires a gesture of the user and biometric information of the user's hand according to the gesture image acquired by the image sensor module, and authenticates the user according to the biometric information, and if the verification passes, identifies the gesture of the user. Further, it is determined whether there is a response corresponding to the gesture of the user, and if so, the response is executed, so that the false trigger rate of the gesture operation can be reduced.
  • the specific manner in which the processor determines whether there is a response corresponding to the gesture of the user may be:
  • the processor acquires a preset operation permission range of the user, and determines whether there is a response corresponding to the user's gesture within the operation permission range; only responses matching the user's operation permission range are executed, thereby preventing unauthorized operations.
  • Further, the specific manner in which the processor executes the response may be: the processor acquires a preset gesture operation priority of the user, and executes the response according to the gesture operation priority, so that responses corresponding to the gestures of multiple users can be executed in an orderly manner.
  • In the present application, a user's gesture and biometric information of the user's hand can be acquired through the image sensor module, and the user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed, thereby reducing the false trigger rate of gesture operations.
  • FIG. 1 is a schematic structural diagram of an embodiment of an electronic device disclosed by the present invention.
  • FIG. 2 is a schematic flowchart of a first embodiment of a gesture recognition method disclosed by the present invention.
  • FIG. 3 is a schematic flowchart of a second embodiment of a gesture recognition method disclosed by the present invention.
  • FIG. 4 is a schematic diagram of a gesture operation disclosed by the present invention.
  • FIG. 5 is a schematic flowchart of a third embodiment of a gesture recognition method disclosed by the present invention.
  • FIG. 6 is a schematic structural diagram of a first embodiment of a gesture recognition apparatus disclosed by the present invention.
  • FIG. 7 is a schematic structural diagram of a second embodiment of a gesture recognition apparatus disclosed by the present invention.
  • the embodiment of the invention discloses a gesture recognition method, device and electronic device, which can reduce the false trigger rate of the gesture operation.
  • FIG. 1 is a schematic structural diagram of an electronic device according to the present disclosure.
  • the electronic device described in this embodiment is used to recognize a user's gesture, and may specifically include: an image sensor module 101, a processor 102, a communication module 103, an input module 104, an output module 105, and a memory 106.
  • the image sensor module 101, the processor 102, the communication module 103, the input module 104, the output module 105, and the memory 106 may be connected by a bus 107, or may be connected in other manners, wherein:
  • the image sensor module 101 may specifically include a first image sensor and a second image sensor, or may include only one image sensor or more than two image sensors for acquiring a gesture input by a user.
  • the input module 104 may specifically include one or more of a touch screen, a keyboard, a microphone, and the like for acquiring information such as text, voice, control instructions, and the like input by the user.
  • the output module 105 may specifically include one or more of a display, a speaker, and the like for outputting information such as a user interaction interface, text, voice, and the like.
  • touch screen and the display may also be designed in one piece, that is, the touch screen (or display) in the embodiment of the invention may be used for inputting text, control instructions and the like, and may also be used for outputting information such as a user interaction interface and text.
  • both the input module 104 and the output module 105 can be omitted.
  • the communication module 103 may specifically include one or more of a wireless communication circuit and a wired communication circuit for performing data communication with other devices or servers through wireless or wired means. When the electronic device does not have a communication function, the communication module 103 may be omitted.
  • the memory 106 may specifically include a volatile memory, such as a random-access memory (RAM); the memory 106 may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 106 may also include a combination of the above types of memory, and is used for storing programs such as an operating system (OS), gesture recognition, network communication, and input/output.
  • the processor 102 may specifically include a first processor and a second processor, which may be used to perform different functions. For example, the first processor may be a graphics processing unit (GPU) of the processor 102, used to call the gesture-recognition-related programs stored in the memory 106 to process the image data output by the image sensor module 101, and to generate the user interaction interface output by the display in the output module 105.
  • the second processor may be a central processing unit (CPU) of the processor 102, used to call the network communication and input/output related programs stored in the memory 106 to process data of the communication module 103, the input module 104, and the speaker in the output module 105.
  • the CPU and the GPU can be implemented as an integrated multi-core (for example, dual-core, quad-core, or octa-core) package, or can be integrated into a system on chip (SoC) as a multi-layer package.
  • FIG. 2 is a schematic flowchart diagram of a first embodiment of a gesture recognition method according to the present disclosure.
  • the gesture recognition method described in this embodiment is based on the electronic device shown in FIG. 1 and includes the following steps:
  • the electronic device acquires a gesture of the user and biometric information of the user's hand through the image sensor module.
  • in this embodiment, there may be one or more image sensor modules. By using multiple image sensor modules to collect gesture images, information such as the depth of field of the user's gesture can be acquired and the spatial position of the user's hand determined, so that richer gestures can be recognized.
  • the gesture of the user may specifically include various actions performed by the user's hand, and may also include location information of the user's hand and the like.
  • the biometric information may specifically include one or more of palm print information, size information, and skin color information; the size information may specifically be the thickness and length of the palm and fingers, and the like.
  • the electronic device only acquires the gesture input by the user, and does not recognize the gesture.
  • step S202 The electronic device performs identity verification on the user according to the biometric information. If the verification passes, step S203 is performed. If the verification fails, the current process ends.
  • the biometric information of the palm, the finger, and the like is unique for different users.
  • the electronic device may collect and store biometric information of a designated user's hand in advance; subsequently, when a gesture input by a user is detected, the electronic device compares the biometric information of the user's hand with the stored biometric information to authenticate the user. If the authentication fails, the electronic device determines that the user does not have any gesture operation authority, that is, it does not execute the response corresponding to any gesture input by that user.
  • the electronic device identifies a gesture of the user.
  • the electronic device recognizes only the gestures of users who pass identity authentication, and uses passing identity authentication as the start signal for gesture recognition; that is, the workflow of authenticating first and then recognizing gestures can effectively filter out interference from hands without operation authority and effectively reduces the false trigger rate of gesture operations.
  • the electronic device determines whether there is a response corresponding to the gesture of the user, and if yes, performs step S205; if not, the current process ends.
  • the electronic device may set one or more effective gestures, that is, only when the user inputs one of the valid gestures, the electronic device performs the response corresponding to the gesture.
  • in this embodiment, the user's gesture and the biometric information of the user's hand are acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed. The corresponding gesture is recognized only when identity verification passes, thereby reducing the false trigger rate of gesture operations.
  • FIG. 3 is a schematic flowchart diagram of a second embodiment of a gesture recognition method according to the present disclosure.
  • the gesture recognition method described in this embodiment is based on the electronic device shown in FIG. 1 and includes the following steps:
  • the step S301 is the same as the step S201 in the previous embodiment, and the steps S306 and S307 are the same as the steps S204 and S205 in the previous embodiment, and details are not described herein again.
  • the electronic device acquires first location information of the user's hand in the process of acquiring the gesture, and acquires second location information of the user's hand in the process of acquiring the biometric information.
  • the electronic device simultaneously performs two kinds of processing on each frame of image, collected by the image sensor module, that includes the user's gesture.
  • the first processing is: the electronic device acquires the user's gesture (for example, by performing gesture segmentation on the user's gesture to obtain the gesture), and acquires the first position information of the user's hand in real time while acquiring the user's gesture.
  • the second processing is: the electronic device extracts the biometric information of the user's hand in the image, and acquires the second position information of the user's hand in real time while extracting the biometric information of the user's hand.
  • step S303 The electronic device performs identity verification on the user according to the biometric information. If the verification succeeds, step S304 is performed. If the verification fails, the current process ends.
  • the electronic device sets identification information of the user corresponding to the hand according to the first location information and the second location information.
  • the electronic device identifies a gesture of the user, and associates the gesture with the identification information of the user.
  • for a user who passes identity verification, if the electronic device determines that the first location information and the second location information are the same, the electronic device can determine the user to which the hand belongs (that is, the hand corresponds to the user's identification information). Therefore, after the gesture made by the hand is recognized, the gesture can be associated with the user's identification information; that is, gestures made by different users are distinguished, and the execution subject (that is, the user) of each gesture is determined.
  • for example, the upper and lower images in FIG. 4a are the same frame of image, in which the position of the hand is assumed to be the start position of the gesture; the upper and lower images in FIG. 4b are also the same frame of image, in which the position of the hand is assumed to be the end position of the gesture.
  • the hand completes the panning gesture shown in FIG. 4c from FIG. 4a to FIG. 4b. The electronic device performs the first processing on the upper images in FIG. 4a and FIG. 4b, that is, acquiring the gesture made by the hand and the first position information of the hand while the hand makes the gesture; the electronic device performs the second processing on the lower images in FIG. 4a and FIG. 4b, that is, acquiring the biometric information of the hand (for example, the sizes of the fingers and wrist) and the second position information of the hand while the hand makes the gesture. If the electronic device authenticates the user corresponding to the hand based on this biometric information and determines that the first position information is the same as the second position information, the user to which the hand belongs can be determined, and thus the execution subject of the panning gesture shown in FIG. 4c is determined.
  • it can be seen that, for a scenario in which multiple users input gestures, identity verification determines the identity of the user corresponding to each hand, and comparing the hand position obtained during identity verification with the hand position obtained while acquiring the gesture determines the execution subject corresponding to each hand, that is, the execution subject corresponding to the gesture made by each hand.
  • in this embodiment, the user's gesture and the biometric information of the user's hand are acquired through the image sensor module; the first location information of the user's hand is acquired while acquiring the gesture, and the second location information of the user's hand is acquired while acquiring the biometric information. The user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, the identification information of the user corresponding to the hand is set according to the first location information and the second location information, and the gesture is associated with the user's identification information. It is then determined whether there is a response corresponding to the user's gesture, and if so, the response is executed. This reduces the false trigger rate of gesture operations, distinguishes different execution subjects, and allows multiple users to input gestures at the same time.
  • FIG. 5 is a schematic flowchart diagram of a third embodiment of a gesture recognition method according to the present disclosure.
  • the gesture recognition method described in this embodiment is based on the electronic device shown in FIG. 1 and includes the following steps:
  • the steps S501 to S503 are the same as the steps S201 to S203 in the previous embodiment, and are not described herein again.
  • the electronic device acquires a preset operating permission range of the user.
  • the electronic device can set the same or different operating permission ranges for different users, so that the user can only control the electronic device within their limited operating rights.
  • step S505 The electronic device determines whether there is a response corresponding to the gesture of the user within the scope of the operation authority. If yes, step S506 is performed; if not, the current process ends.
  • the electronic device acquires a preset gesture operation priority of the user, and performs the response according to the gesture operation priority.
  • the electronic device can set a user's operation permission range and gesture operation priority according to the user's richness of experience; for example, within a certain age range, the older the user, the larger the user's operation permission range and/or the higher the gesture operation priority.
  • the user's operation permission range and gesture operation priority may also be set according to the degree of matching (or association) between the current working state of the electronic device and the user; for example, the vehicle driver has the largest operation permission range and the highest gesture operation priority, while an ordinary passenger has a smaller operation permission range and a lower gesture operation priority, and so on.
  • for example, taking the electronic device being an in-vehicle control terminal, assume there are three people in the car: palm a corresponds to the driver, palm b corresponds to passenger 1, and palm c corresponds to passenger 2. Palm a and palm b are set in advance to have gesture operation authority over the in-vehicle control terminal, while palm c does not have gesture operation authority over the in-vehicle control terminal, and the operation permission range of palm a is larger than that of palm b.
  • for example, both palm a and palm b can set the car's multimedia playback parameters and air-conditioning parameters, but only palm a can set the vehicle's driving parameters, such as adjusting the vehicle speed or opening or closing the safety lock. Meanwhile, the gesture operation priority of palm a is higher than that of palm b.
  • when the three palms make gestures at the same time, the in-vehicle control terminal acquires the biometric information of the three palms and determines, according to the biometric information, that only palm a and palm b have gesture operation authority; therefore, only palm a and palm b are position-tracked, and the gestures made by palm a and palm b are recognized.
  • if the responses corresponding to both gestures are to set the vehicle's driving parameters, then since setting the driving parameters is outside the operation permission range of palm b, the in-vehicle control terminal does not execute the response corresponding to the gesture made by palm b, but executes the response corresponding to the gesture made by palm a, which can prevent unauthorized operations.
  • if the response corresponding to palm a's gesture is to adjust the air-conditioning temperature and the response corresponding to palm b's gesture is to switch the currently playing song, then because palm a's gesture operation priority is higher than palm b's, the air-conditioning temperature is adjusted first and the song is switched afterwards, so that the responses corresponding to gestures made by multiple palms at the same time can be executed in an orderly manner.
  • in this embodiment, the user's gesture and the biometric information of the user's hand are acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, the preset operation permission range of the user is acquired, and it is determined whether there is a response corresponding to the user's gesture within the operation permission range; if so, the preset gesture operation priority of the user is acquired, and the response is then executed according to the gesture operation priority. This reduces the false trigger rate of gesture operations, prevents unauthorized operations, and enables the responses corresponding to gestures made by multiple users at the same time to be executed in an orderly manner.
  • FIG. 6 is a schematic structural diagram of a first embodiment of a gesture recognition apparatus according to the present disclosure.
  • the gesture recognition apparatus described in this embodiment includes:
  • the obtaining module 601 is configured to acquire, by the image sensor module, a gesture of the user and biometric information of the user's hand.
  • the gesture of the user may specifically include various actions performed by the user's hand, and may also include location information of the user's hand and the like.
  • the biometric information may specifically include one or more of palm print information, size information, and skin color information; the size information may specifically be the thickness and length of the palm and fingers, and the like.
  • the verification module 602 is configured to perform identity verification on the user according to the biometric information.
  • for different users, the biometric information of parts such as the palm and fingers is unique. The acquiring module 601 may collect and store the biometric information of a designated user's hand in advance; subsequently, when a gesture input by a user is detected, the verification module 602 compares the biometric information of the user's hand with the stored biometric information to authenticate the user. If the authentication fails, the verification module 602 determines that the user does not have any gesture operation authority, that is, the response corresponding to any gesture input by that user is not executed.
  • the identification module 603 is configured to recognize the user's gesture when the verification module's authentication of the user succeeds.
  • the identification module 603 recognizes only the gestures of users who pass identity authentication, and uses passing identity authentication as the start signal for gesture recognition; that is, the workflow of authenticating first and then recognizing gestures can effectively filter out interference from hands without operation authority and effectively reduces the false trigger rate of gesture operations.
  • the determining module 604 is configured to determine whether there is a response corresponding to the gesture of the user.
  • the executing module 605 is configured to execute the response when the determining module determines that there is a response corresponding to the gesture of the user.
  • the gesture recognition device may set one or more effective gestures, that is, only when the user inputs one of the valid gestures, the gesture recognition device performs the response corresponding to the gesture.
  • in this embodiment, the user's gesture and the biometric information of the user's hand are acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed. The corresponding gesture is recognized only when identity verification passes, thereby reducing the false trigger rate of gesture operations.
  • FIG. 7 is a schematic structural diagram of a second embodiment of a gesture recognition apparatus according to the present disclosure.
  • the gesture recognition apparatus described in this embodiment includes:
  • An obtaining module 701, configured to acquire, through the image sensor module, a user's gesture and biometric information of the user's hand.
  • the verification module 702 is configured to perform identity verification on the user according to the biometric information.
  • the identification module 703 is configured to recognize the user's gesture when the verification module's authentication of the user succeeds.
  • the determining module 704 is configured to determine whether there is a response corresponding to the gesture of the user.
  • the executing module 705 is configured to execute the response when the determining module determines that there is a response corresponding to the gesture of the user.
  • the obtaining module 701 simultaneously performs two kinds of processing on each frame of image, collected by the image sensor module, that includes the user's gesture.
  • the first processing is: the obtaining module 701 acquires the user's gesture (for example, by performing gesture segmentation on the user's gesture to obtain the gesture), and acquires the first location information of the user's hand in real time while acquiring the user's gesture.
  • the second processing is: the obtaining module 701 extracts the biometric information of the user's hand in the image, and acquires the second location information of the user's hand in real time while extracting the biometric information of the user's hand.
  • for a user who passes identity verification, if the gesture recognition apparatus determines that the first location information and the second location information are the same, the gesture recognition apparatus can determine the user to which the hand belongs (that is, the hand corresponds to the user's identification information). Therefore, after the gesture made by the hand is recognized, the gesture can be associated with the user's identification information; that is, gestures made by different users are distinguished, and the execution subject (that is, the user) of each gesture is determined. It can be seen that, for a scenario in which multiple users input gestures, identity verification determines the identity of the user corresponding to each hand, and comparing the hand position obtained through identity verification with the hand position obtained by acquiring the gesture determines the execution subject corresponding to each hand, that is, the execution subject corresponding to the gesture made by each hand.
  • in this embodiment, the user's gesture and the biometric information of the user's hand are acquired through the image sensor module; the first location information of the user's hand is acquired while acquiring the gesture, and the second location information of the user's hand is acquired while acquiring the biometric information. The user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, the identification information of the user corresponding to the hand is set according to the first location information and the second location information, and the gesture is associated with the user's identification information. It is then determined whether there is a response corresponding to the user's gesture, and if so, the response is executed. This reduces the false trigger rate of gesture operations, distinguishes different execution subjects, and allows multiple users to input gestures at the same time.
  • the determining module 704 may specifically include:
  • the first obtaining unit 7040 is configured to acquire a preset operating permission range of the user.
  • the determining unit 7041 is configured to determine whether there is a response corresponding to the gesture of the user within the operating authority range.
  • the executing module 705 may specifically include:
  • the second obtaining unit 7050 is configured to acquire a preset gesture operation priority of the user.
  • the executing unit 7051 is configured to perform the response according to the gesture operation priority.
  • the gesture recognition apparatus may set a user's operation permission range and gesture operation priority according to the user's richness of experience; for example, within a certain age range, the older the user, the larger the user's operation permission range and/or the higher the gesture operation priority. The user's operation permission range and gesture operation priority may also be set according to the degree of matching (or association) between the current working state of the gesture recognition apparatus and the user; for example, the vehicle driver has the largest operation permission range and the highest gesture operation priority, while an ordinary passenger has a smaller operation permission range and a lower gesture operation priority, and so on.
  • in this embodiment, the user's gesture and the biometric information of the user's hand are acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, the preset operation permission range of the user is acquired, and it is determined whether there is a response corresponding to the user's gesture within the operation permission range; if so, the preset gesture operation priority of the user is acquired, and the response is then executed according to the gesture operation priority. This reduces the false trigger rate of gesture operations, prevents unauthorized operations, and enables the responses corresponding to gestures made by multiple users at the same time to be executed in an orderly manner.
  • all or part of the steps of the methods in the foregoing embodiments may be implemented by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.

Abstract

The present invention discloses a gesture recognition method, apparatus, and electronic device. The method includes: acquiring a user's gesture and biometric information of the user's hand through an image sensor module, and authenticating the user according to the biometric information; if the verification passes, recognizing the user's gesture, and then determining whether there is a response corresponding to the user's gesture; if so, executing the response. The present invention can reduce the false trigger rate of gesture operations.

Description

Gesture recognition method and apparatus, and electronic device
Technical Field
The present invention relates to the field of human-computer interaction technologies, and in particular, to a gesture recognition method and apparatus, and an electronic device.
Background
Multi-mode human-machine interface technology is the core technology for solving problems such as the high intelligence and high availability of computing devices. Through multi-mode human-machine interface technology, a harmonious and natural human-computer interaction environment can be established, enabling users to operate computers conveniently and naturally, in ways familiar to humans. The most important part of this is enabling the computer to accurately perceive different human expressions, including natural language, gesture language, and facial expressions, to achieve anthropomorphic human-computer interaction. The detection and recognition of gestures is an important research topic in the fields of human-computer interaction and pattern recognition, and is also one of the key technologies of sign language recognition, in which hand-shape movements supplemented by expressions and postures serve as symbols.
At present, solutions such as visual recognition, infrared detection, ultrasonic detection, and radar signal detection are commonly used to sample and recognize user gestures; among these, visual recognition is the current mainstream direction of gesture recognition. However, existing solutions that detect user gestures through visual recognition still have some shortcomings; for example, they are prone to false triggering.
Summary
The present application discloses a gesture recognition method, apparatus, and electronic device, which can reduce the false trigger rate of gesture operations.
The present application discloses a gesture recognition method, in which a user's gesture and biometric information of the user's hand may be acquired through an image sensor module, and the user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed, so that the false trigger rate of gesture operations can be reduced.
Further, a preset operation permission range of the user may be acquired, and it may be determined whether there is a response corresponding to the user's gesture within the operation permission range; only responses matching the user's operation permission range are executed, thereby preventing unauthorized operations.
Further, a preset gesture operation priority of the user may be acquired, and the response may be executed according to the gesture operation priority, so that responses corresponding to the gestures of multiple users can be executed in an orderly manner.
Further, the biometric information may include one or more of palm print information, size information, and skin color information.
The present application also discloses a gesture recognition apparatus, including an acquiring module, a verification module, an identification module, a determining module, and an executing module, wherein:
the acquiring module acquires the user's gesture and the biometric information of the user's hand through an image sensor module; the verification module authenticates the user according to the biometric information; the identification module recognizes the user's gesture when the verification module's authentication of the user succeeds; the determining module determines whether there is a response corresponding to the user's gesture; and the executing module is configured to execute the response when the determining module determines that such a response exists, so that the false trigger rate of gesture operations can be reduced.
Further, the determining module may first acquire a preset operation permission range of the user, and then determine whether there is a response corresponding to the user's gesture within the operation permission range, so that only responses matching the user's operation permission range are executed, which can prevent unauthorized operations.
Further, the executing module may also acquire a preset gesture operation priority of the user and execute the response according to the gesture operation priority, so that responses corresponding to the gestures of multiple users can be executed in an orderly manner.
The present application also discloses an electronic device, including an image sensor module and a processor, wherein:
the image sensor module collects gesture images;
the processor acquires the user's gesture and the biometric information of the user's hand according to the gesture images collected by the image sensor module, and authenticates the user according to the biometric information; if the verification passes, the processor recognizes the user's gesture, and then determines whether there is a response corresponding to the user's gesture; if so, the processor executes the response, so that the false trigger rate of gesture operations can be reduced.
Further, the specific manner in which the processor determines whether there is a response corresponding to the user's gesture may be:
the processor acquires a preset operation permission range of the user, and determines whether there is a response corresponding to the user's gesture within the operation permission range; only responses matching the user's operation permission range are executed, thereby preventing unauthorized operations.
Further, the specific manner in which the processor executes the response may be:
the processor acquires a preset gesture operation priority of the user, and executes the response according to the gesture operation priority, so that responses corresponding to the gestures of multiple users can be executed in an orderly manner.
In the present application, a user's gesture and biometric information of the user's hand can be acquired through an image sensor module, and the user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed, so that the false trigger rate of gesture operations can be reduced.
Brief Description of the Drawings
To describe the technical solutions in the present invention more clearly, the accompanying drawings required in the embodiments are briefly introduced below. Apparently, the accompanying drawings described below are merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of an embodiment of an electronic device disclosed by the present invention;
FIG. 2 is a schematic flowchart of a first embodiment of a gesture recognition method disclosed by the present invention;
FIG. 3 is a schematic flowchart of a second embodiment of a gesture recognition method disclosed by the present invention;
FIG. 4 is a schematic diagram of a gesture operation disclosed by the present invention;
FIG. 5 is a schematic flowchart of a third embodiment of a gesture recognition method disclosed by the present invention;
FIG. 6 is a schematic structural diagram of a first embodiment of a gesture recognition apparatus disclosed by the present invention;
FIG. 7 is a schematic structural diagram of a second embodiment of a gesture recognition apparatus disclosed by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The embodiments of the present invention disclose a gesture recognition method, apparatus, and electronic device, which can reduce the false trigger rate of gesture operations.
Referring to FIG. 1, which is a schematic structural diagram of an electronic device disclosed by the present invention. The electronic device described in this embodiment is used to recognize a user's gesture, and may specifically include: an image sensor module 101, a processor 102, a communication module 103, an input module 104, an output module 105, and a memory 106. The image sensor module 101, the processor 102, the communication module 103, the input module 104, the output module 105, and the memory 106 may be connected by a bus 107, or may be connected in other manners, wherein:
The image sensor module 101 may specifically include a first image sensor and a second image sensor, or may include only one image sensor or more than two image sensors, and is configured to acquire gestures input by a user.
The input module 104 may specifically include one or more of a touch screen, a keyboard, a microphone, and the like, and is configured to acquire information such as text, voice, and control instructions input by the user.
The output module 105 may specifically include one or more of a display, a speaker, and the like, and is configured to output information such as a user interaction interface, text, and voice.
It should be noted that the touch screen and the display may also be integrated; that is, in this embodiment of the present invention, the touch screen (or display) may be used both to input information such as text and control instructions and to output information such as a user interaction interface and text.
In some feasible implementations, both the input module 104 and the output module 105 may be omitted.
The communication module 103 may specifically include one or more of a wireless communication circuit and a wired communication circuit, and is configured to perform data communication with other devices or servers in a wireless or wired manner. When the electronic device does not have a communication function, the communication module 103 may also be omitted.
The memory 106 may specifically include a volatile memory, such as a random-access memory (RAM); the memory 106 may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 106 may also include a combination of the above types of memory, and is configured to store programs such as an operating system (OS), gesture recognition, network communication, and input/output.
The processor 102 may specifically include a first processor and a second processor, which may be used to perform different functions. For example, the first processor may be a graphics processing unit (GPU) of the processor 102, configured to call the gesture-recognition-related programs stored in the memory 106 to process the image data output by the image sensor module 101, and to generate the user interaction interface output by the display in the output module 105; the second processor may be a central processing unit (CPU) of the processor 102, configured to call the network communication and input/output related programs stored in the memory 106 to process data of the communication module 103, the input module 104, and the speaker in the output module 105. The CPU and the GPU may be implemented as an integrated multi-core (for example, dual-core, quad-core, or octa-core) package, or may be integrated into a system on chip (SoC) as a multi-layer package.
At present, false triggering often occurs during gesture recognition. For example, when a hand inadvertently passes through the image collection area of the image sensor module of the electronic device, or when a hand without gesture operation authority performs a gesture operation, the electronic device is likely to execute the corresponding gesture response and thus perform an erroneous operation; interference from hands without gesture operation authority cannot be filtered out.
Referring to FIG. 2, which is a schematic flowchart of a first embodiment of a gesture recognition method disclosed by the present invention. The gesture recognition method described in this embodiment is based on the electronic device shown in FIG. 1 and includes the following steps:
S201: The electronic device acquires a user's gesture and biometric information of the user's hand through the image sensor module.
In this embodiment, there may be one or more image sensor modules. By using multiple image sensor modules to collect gesture images, information such as the depth of field of the user's gesture can be acquired and the spatial position of the user's hand determined, so that richer gestures can be recognized.
The user's gesture may specifically include various actions made by the user's hand, and may also include position information of the user's hand and the like. The biometric information may specifically include one or more of palm print information, size information, and skin color information; the size information may specifically be the thickness and length of the palm and fingers, and the like.
It should be noted that in this step the electronic device only acquires the gesture input by the user and does not yet recognize the gesture.
S202: The electronic device authenticates the user according to the biometric information; if the verification passes, step S203 is performed; if the verification fails, the current process ends.
Specifically, the biometric information of parts such as the palm and fingers is unique to each user. The electronic device may collect and store the biometric information of a designated user's hand in advance; subsequently, when a gesture input by a user is detected, the electronic device compares the biometric information of the user's hand with the stored biometric information to authenticate the user. If the authentication fails, the electronic device determines that the user does not have any gesture operation authority, that is, it does not execute the response corresponding to any gesture input by that user.
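The comparison in S202 can be thought of as matching an extracted hand-feature vector against enrolled templates under an acceptance threshold. A minimal Python sketch of such a check follows; the feature vectors, the cosine-similarity measure, and the threshold value are illustrative assumptions, since the embodiment does not prescribe a particular matching algorithm.

```python
import math

# Enrolled biometric templates, keyed by user ID. Each template is a feature
# vector that could encode palm-print, hand-size and skin-colour statistics.
# The concrete features and the threshold below are illustrative assumptions.
ENROLLED_TEMPLATES = {
    "driver":    [0.82, 0.10, 0.55, 0.91],
    "passenger": [0.30, 0.75, 0.40, 0.20],
}

SIMILARITY_THRESHOLD = 0.95  # assumed acceptance threshold


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def authenticate(observed_features):
    """Return the matching user ID if any enrolled template is close enough,
    otherwise None (verification fails and no gesture will be executed)."""
    best_user, best_score = None, 0.0
    for user_id, template in ENROLLED_TEMPLATES.items():
        score = cosine_similarity(observed_features, template)
        if score > best_score:
            best_user, best_score = user_id, score
    return best_user if best_score >= SIMILARITY_THRESHOLD else None


if __name__ == "__main__":
    print(authenticate([0.80, 0.12, 0.54, 0.90]))  # -> "driver"
    print(authenticate([0.10, 0.10, 0.90, 0.10]))  # -> None (verification fails)
```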
S203: The electronic device recognizes the user's gesture.
Specifically, the electronic device recognizes only the gestures of users who pass identity authentication, and uses passing identity authentication as the start signal for gesture recognition; that is, the workflow of authenticating first and then recognizing gestures can effectively filter out interference from hands without operation authority and effectively reduces the false trigger rate of gesture operations.
S204: The electronic device determines whether there is a response corresponding to the user's gesture; if so, step S205 is performed; if not, the current process ends.
Specifically, the electronic device may set one or more effective gestures; that is, only when the user inputs one of the effective gestures does the electronic device execute the response corresponding to the gesture.
S205: The electronic device executes the response.
In this embodiment, the user's gesture and the biometric information of the user's hand can be acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed. The corresponding gesture is recognized only when identity verification passes, so that the false trigger rate of gesture operations can be reduced.
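Taken together, steps S201 to S205 form a simple control flow: verify identity first, recognize the gesture only on success, and execute a response only if one is registered for the recognized gesture. A compact Python sketch of that flow is given below; the gesture-to-response table, the frame format, and the injected `extract_biometrics` and `authenticate` callables are placeholders rather than part of the disclosed method.

```python
# Table of "effective gestures": only gestures listed here trigger a response.
GESTURE_RESPONSES = {
    "swipe_left":  lambda: print("previous track"),
    "swipe_right": lambda: print("next track"),
    "open_palm":   lambda: print("pause playback"),
}


def recognize_gesture(frames):
    """Placeholder gesture classifier over a sequence of frames."""
    return frames[-1].get("gesture_label")  # assumed to be provided upstream


def handle_gesture(frames, extract_biometrics, authenticate):
    """S201-S205: acquire, verify identity, recognize, then respond.

    `extract_biometrics` and `authenticate` are injected so the sketch stays
    independent of any concrete sensor or matching implementation."""
    biometrics = extract_biometrics(frames)          # S201
    user_id = authenticate(biometrics)               # S202
    if user_id is None:
        return None                                  # verification failed: ignore gesture
    gesture = recognize_gesture(frames)              # S203 (only after verification)
    response = GESTURE_RESPONSES.get(gesture)        # S204
    if response is not None:
        response()                                   # S205
    return gesture


if __name__ == "__main__":
    demo_frames = [{"gesture_label": "open_palm"}]
    handle_gesture(
        demo_frames,
        extract_biometrics=lambda fs: "demo-features",
        authenticate=lambda feats: "user_1",  # stub: verification always passes here
    )  # prints "pause playback"
```

Because recognition happens only after `authenticate` returns a user, a hand that fails verification never reaches the response table, which is the property this embodiment relies on to reduce false triggers.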
Referring to FIG. 3, which is a schematic flowchart of a second embodiment of a gesture recognition method disclosed by the present invention. The gesture recognition method described in this embodiment is based on the electronic device shown in FIG. 1 and includes the following steps:
Step S301 is the same as step S201 in the foregoing embodiment, and steps S306 and S307 are the same as steps S204 and S205 in the foregoing embodiment; details are not repeated here.
S302: The electronic device acquires first position information of the user's hand while acquiring the gesture, and acquires second position information of the user's hand while acquiring the biometric information.
Specifically, the electronic device simultaneously performs two kinds of processing on each frame of image, collected by the image sensor module, that includes the user's gesture. The first processing is: the electronic device acquires the user's gesture (for example, by performing gesture segmentation on the user's gesture to obtain the gesture), and acquires the first position information of the user's hand in real time while acquiring the user's gesture. The second processing is: the electronic device extracts the biometric information of the user's hand in the image, and acquires the second position information of the user's hand in real time while extracting the biometric information of the user's hand.
S303: The electronic device authenticates the user according to the biometric information; if the verification passes, step S304 is performed; if the verification fails, the current process ends.
S304: The electronic device sets identification information of the user corresponding to the hand according to the first position information and the second position information.
S305: The electronic device recognizes the user's gesture and associates the gesture with the user's identification information.
Specifically, for a user who passes identity verification, if the electronic device determines that the first position information and the second position information are the same, the electronic device can determine the user to which the hand belongs (that is, the hand corresponds to the user's identification information). Therefore, after the gesture made by the hand is recognized, the gesture can be associated with the user's identification information; that is, gestures made by different users are distinguished, and the execution subject (that is, the user) of each gesture is determined.
For example, as shown in FIG. 4, the upper and lower images in FIG. 4a are the same frame of image, in which the position of the hand is assumed to be the start position of the gesture; the upper and lower images in FIG. 4b are also the same frame of image, in which the position of the hand is assumed to be the end position of the gesture. The hand completes the panning gesture shown in FIG. 4c from FIG. 4a to FIG. 4b. The electronic device performs the first processing on the upper images in FIG. 4a and FIG. 4b, that is, acquiring the gesture made by the hand and the first position information of the hand while the hand makes the gesture; the electronic device performs the second processing on the lower images in FIG. 4a and FIG. 4b, that is, acquiring the biometric information of the hand (for example, the sizes of the fingers and wrist) and the second position information of the hand while the hand makes the gesture. If the electronic device authenticates the user corresponding to the hand based on the sizes of the fingers and wrist, and determines that the first position information is the same as the second position information, the user to which the hand belongs can be determined, and thus the execution subject of the panning gesture (shown in FIG. 4c) made by the hand is determined.
It can be seen that, for a scenario in which multiple users input gestures, identity verification determines the identity of the user corresponding to each hand, and comparing the hand position obtained during identity verification with the hand position obtained while acquiring the gesture determines the execution subject corresponding to each hand, that is, the execution subject corresponding to the gesture made by each hand.
In this embodiment, the user's gesture and the biometric information of the user's hand can be acquired through the image sensor module; the first position information of the user's hand is acquired while acquiring the gesture, and the second position information of the user's hand is acquired while acquiring the biometric information. The user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, the identification information of the user corresponding to the hand is set according to the first position information and the second position information, and the gesture is associated with the user's identification information. It is then determined whether there is a response corresponding to the user's gesture, and if so, the response is executed. This reduces the false trigger rate of gesture operations, distinguishes different execution subjects, and allows multiple users to input gestures at the same time.
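The association described in this embodiment reduces to a per-frame matching problem: the hand position produced while recognizing the gesture (first position information) is compared with the hand position produced while extracting the biometric information (second position information), and the verified identity is attached to the gesture whose positions coincide. The Python sketch below illustrates this with made-up coordinates and a small distance tolerance standing in for "the same position".

```python
from dataclasses import dataclass

POSITION_TOLERANCE = 5.0  # assumed pixel tolerance for treating positions as equal


@dataclass
class GestureTrack:
    gesture: str      # label from the gesture-processing path
    position: tuple   # first position information (x, y)


@dataclass
class IdentityTrack:
    user_id: str      # identity obtained from biometric verification
    position: tuple   # second position information (x, y)


def associate(gesture_tracks, identity_tracks):
    """Attach a verified user ID to each recognized gesture whose hand
    position coincides with a position seen during biometric extraction."""
    results = []
    for g in gesture_tracks:
        for ident in identity_tracks:
            dx = g.position[0] - ident.position[0]
            dy = g.position[1] - ident.position[1]
            if (dx * dx + dy * dy) ** 0.5 <= POSITION_TOLERANCE:
                results.append((ident.user_id, g.gesture))
                break
    return results


if __name__ == "__main__":
    gestures = [GestureTrack("pan_right", (120, 80)), GestureTrack("pinch", (420, 300))]
    identities = [IdentityTrack("driver", (121, 79)), IdentityTrack("passenger_1", (419, 302))]
    print(associate(gestures, identities))
    # -> [('driver', 'pan_right'), ('passenger_1', 'pinch')]
```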
Referring to FIG. 5, which is a schematic flowchart of a third embodiment of a gesture recognition method disclosed by the present invention. The gesture recognition method described in this embodiment is based on the electronic device shown in FIG. 1 and includes the following steps:
Steps S501 to S503 are the same as steps S201 to S203 in the foregoing embodiment; details are not repeated here.
S504: The electronic device acquires a preset operation permission range of the user.
Specifically, the electronic device may set the same or different operation permission ranges for different users, so that each user can control the electronic device only within his or her limited operation authority.
S505: The electronic device determines whether there is a response corresponding to the user's gesture within the operation permission range; if so, step S506 is performed; if not, the current process ends.
S506: The electronic device acquires a preset gesture operation priority of the user, and executes the response according to the gesture operation priority.
Specifically, the electronic device may set a user's operation permission range and gesture operation priority according to the user's richness of experience; for example, within a certain age range, the older the user, the larger the user's operation permission range and/or the higher the gesture operation priority. The electronic device may also set a user's operation permission range and gesture operation priority according to the degree of matching (or association) between the current working state of the electronic device and the user; for example, the vehicle driver has the largest operation permission range and the highest gesture operation priority, while an ordinary passenger has a smaller operation permission range and a lower gesture operation priority, and so on.
For example, taking the electronic device being an in-vehicle control terminal, assume that there are three people in the car: palm a corresponds to the driver, palm b corresponds to passenger 1, and palm c corresponds to passenger 2. Palm a and palm b are set in advance to have gesture operation authority over the in-vehicle control terminal, and the biometric information of palm a and palm b is recorded, while palm c does not have gesture operation authority over the in-vehicle control terminal. The operation permission range of palm a is larger than that of palm b: for example, both palm a and palm b can set the car's multimedia playback parameters and air-conditioning parameters, but only palm a can set the vehicle's driving parameters (for example, adjusting the vehicle speed, or opening or closing the safety lock). Meanwhile, the gesture operation priority of palm a is higher than that of palm b.
When the three palms make gestures at the same time, the in-vehicle control terminal acquires the biometric information of the three palms and determines, according to the biometric information, that only palm a and palm b have gesture operation authority; therefore, only palm a and palm b are position-tracked, and the gestures made by palm a and palm b are recognized. If the responses corresponding to both gestures are to set the vehicle's driving parameters, then since setting the driving parameters is outside the operation permission range of palm b, the in-vehicle control terminal does not execute the response corresponding to the gesture made by palm b, but executes the response corresponding to the gesture made by palm a (that is, sets the vehicle's driving parameters accordingly), which can prevent unauthorized operations.
Further, if the response corresponding to palm a's gesture is to adjust the air-conditioning temperature and the response corresponding to palm b's gesture is to switch the currently playing song, then because palm a's gesture operation priority is higher than palm b's, the air-conditioning temperature is adjusted first and the song is switched afterwards, so that the responses corresponding to gestures made by multiple palms at the same time can be executed in an orderly manner.
In this embodiment, the user's gesture and the biometric information of the user's hand can be acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, the preset operation permission range of the user is acquired, and it is determined whether there is a response corresponding to the user's gesture within the operation permission range; if so, the preset gesture operation priority of the user is acquired, and the response is then executed according to the gesture operation priority. This reduces the false trigger rate of gesture operations, prevents unauthorized operations, and enables the responses corresponding to gestures made by multiple users at the same time to be executed in an orderly manner.
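The in-vehicle example combines the two checks of this embodiment: a per-user operation permission range that filters which responses may be executed at all, and a per-user gesture operation priority that orders the responses that survive the filter. A Python sketch modelled on palms a and b follows; the category names, priority values, and request format are assumptions made for illustration.

```python
# Per-user configuration: allowed response categories and gesture priority.
# Higher priority executes first. The concrete values mirror the example above
# and are otherwise assumptions.
USER_PROFILES = {
    "palm_a": {"allowed": {"driving", "climate", "media"}, "priority": 2},  # driver
    "palm_b": {"allowed": {"climate", "media"},            "priority": 1},  # passenger 1
    # palm_c is not enrolled at all, so its gestures are never recognized.
}


def filter_and_order(requests):
    """requests: list of (user_id, category, action) tuples derived from
    recognized gestures. Drop anything outside the user's permission range,
    then execute the rest in descending priority order."""
    allowed = [
        r for r in requests
        if r[0] in USER_PROFILES and r[1] in USER_PROFILES[r[0]]["allowed"]
    ]
    allowed.sort(key=lambda r: USER_PROFILES[r[0]]["priority"], reverse=True)
    for user_id, category, action in allowed:
        print(f"executing for {user_id}: {category}/{action}")
    return allowed


if __name__ == "__main__":
    filter_and_order([
        ("palm_b", "media",   "switch_song"),        # allowed, lower priority
        ("palm_a", "climate", "raise_temperature"),  # allowed, runs first
        ("palm_b", "driving", "unlock_safety_lock"), # outside palm_b's range: dropped
    ])
```

Running the example drops palm b's driving request, then executes palm a's air-conditioning adjustment before palm b's song switch, mirroring the ordering described above.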
Referring to FIG. 6, which is a schematic structural diagram of a first embodiment of a gesture recognition apparatus disclosed by the present invention. The gesture recognition apparatus described in this embodiment includes:
an acquiring module 601, configured to acquire a user's gesture and biometric information of the user's hand through an image sensor module.
The user's gesture may specifically include various actions made by the user's hand, and may also include position information of the user's hand and the like. The biometric information may specifically include one or more of palm print information, size information, and skin color information; the size information may specifically be the thickness and length of the palm and fingers, and the like.
A verification module 602 is configured to authenticate the user according to the biometric information.
Specifically, for different users, the biometric information of parts such as the palm and fingers is unique. The acquiring module 601 may collect and store the biometric information of a designated user's hand in advance; subsequently, when a gesture input by a user is detected, the verification module 602 compares the biometric information of the user's hand with the stored biometric information to authenticate the user. If the authentication fails, the verification module 602 determines that the user does not have any gesture operation authority, that is, the response corresponding to any gesture input by that user is not executed.
An identification module 603 is configured to recognize the user's gesture when the verification module's authentication of the user succeeds.
Specifically, the identification module 603 recognizes only the gestures of users who pass identity authentication, and uses passing identity authentication as the start signal for gesture recognition; that is, the workflow of authenticating first and then recognizing gestures can effectively filter out interference from hands without operation authority and effectively reduces the false trigger rate of gesture operations.
A determining module 604 is configured to determine whether there is a response corresponding to the user's gesture.
An executing module 605 is configured to execute the response when the determining module determines that there is a response corresponding to the user's gesture.
Specifically, the gesture recognition apparatus may set one or more effective gestures; that is, only when the user inputs one of the effective gestures does the gesture recognition apparatus execute the response corresponding to the gesture.
In this embodiment, the user's gesture and the biometric information of the user's hand can be acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, and it is then determined whether there is a response corresponding to the user's gesture; if so, the response is executed. The corresponding gesture is recognized only when identity verification passes, so that the false trigger rate of gesture operations can be reduced.
Referring to FIG. 7, which is a schematic structural diagram of a second embodiment of a gesture recognition apparatus disclosed by the present invention. The gesture recognition apparatus described in this embodiment includes:
an acquiring module 701, configured to acquire a user's gesture and biometric information of the user's hand through an image sensor module;
a verification module 702, configured to authenticate the user according to the biometric information;
an identification module 703, configured to recognize the user's gesture when the verification module's authentication of the user succeeds;
a determining module 704, configured to determine whether there is a response corresponding to the user's gesture; and
an executing module 705, configured to execute the response when the determining module determines that there is a response corresponding to the user's gesture.
Specifically, the acquiring module 701 simultaneously performs two kinds of processing on each frame of image, collected by the image sensor module, that includes the user's gesture. The first processing is: the acquiring module 701 acquires the user's gesture (for example, by performing gesture segmentation on the user's gesture to obtain the gesture), and acquires the first position information of the user's hand in real time while acquiring the user's gesture. The second processing is: the acquiring module 701 extracts the biometric information of the user's hand in the image, and acquires the second position information of the user's hand in real time while extracting the biometric information of the user's hand.
Further, for a user who passes identity verification, if the gesture recognition apparatus determines that the first position information and the second position information are the same, the gesture recognition apparatus can determine the user to which the hand belongs (that is, the hand corresponds to the user's identification information). Therefore, after the gesture made by the hand is recognized, the gesture can be associated with the user's identification information; that is, gestures made by different users are distinguished, and the execution subject (that is, the user) of each gesture is determined. It can be seen that, for a scenario in which multiple users input gestures, identity verification determines the identity of the user corresponding to each hand, and comparing the hand position obtained through identity verification with the hand position obtained by acquiring the gesture determines the execution subject corresponding to each hand, that is, the execution subject corresponding to the gesture made by each hand.
In this embodiment, the user's gesture and the biometric information of the user's hand can be acquired through the image sensor module; the first position information of the user's hand is acquired while acquiring the gesture, and the second position information of the user's hand is acquired while acquiring the biometric information. The user is authenticated according to the biometric information; if the verification passes, the user's gesture is recognized, the identification information of the user corresponding to the hand is set according to the first position information and the second position information, and the gesture is associated with the user's identification information. It is then determined whether there is a response corresponding to the user's gesture, and if so, the response is executed. This reduces the false trigger rate of gesture operations, distinguishes different execution subjects, and allows multiple users to input gestures at the same time.
In some feasible implementations, the determining module 704 may specifically include:
a first acquiring unit 7040, configured to acquire a preset operation permission range of the user; and
a determining unit 7041, configured to determine whether there is a response corresponding to the user's gesture within the operation permission range.
In some feasible implementations, the executing module 705 may specifically include:
a second acquiring unit 7050, configured to acquire a preset gesture operation priority of the user; and
an executing unit 7051, configured to execute the response according to the gesture operation priority.
Specifically, the gesture recognition apparatus may set a user's operation permission range and gesture operation priority according to the user's richness of experience; for example, within a certain age range, the older the user, the larger the user's operation permission range and/or the higher the gesture operation priority. The gesture recognition apparatus may also set a user's operation permission range and gesture operation priority according to the degree of matching (or association) between the current working state of the gesture recognition apparatus and the user; for example, the vehicle driver has the largest operation permission range and the highest gesture operation priority, while an ordinary passenger has a smaller operation permission range and a lower gesture operation priority, and so on.
In this embodiment, the user's gesture and the biometric information of the user's hand can be acquired through the image sensor module, and the user is authenticated according to the biometric information. If the verification passes, the user's gesture is recognized, the preset operation permission range of the user is acquired, and it is determined whether there is a response corresponding to the user's gesture within the operation permission range; if so, the preset gesture operation priority of the user is acquired, and the response is then executed according to the gesture operation priority. This reduces the false trigger rate of gesture operations, prevents unauthorized operations, and enables the responses corresponding to gestures made by multiple users at the same time to be executed in an orderly manner.
It should be noted that, for simplicity of description, each of the foregoing method embodiments is described as a series of combined actions; however, a person skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention, some steps may be performed in another order or simultaneously. In addition, a person skilled in the art should also understand that the embodiments described in this specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
A person of ordinary skill in the art may understand that all or part of the steps of the methods in the foregoing embodiments may be implemented by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
The gesture recognition method, apparatus, and electronic device provided in the embodiments of the present invention are described in detail above. Specific examples are used in this specification to illustrate the principles and implementations of the present invention; the descriptions of the foregoing embodiments are merely intended to help understand the method and core idea of the present invention. Meanwhile, a person of ordinary skill in the art may make modifications to the specific implementations and application scope according to the idea of the present invention. In conclusion, the content of this specification shall not be construed as a limitation on the present invention.

Claims (11)

  1. A gesture recognition method, comprising:
    acquiring a user's gesture and biometric information of the user's hand through an image sensor module;
    authenticating the user according to the biometric information, and if the verification passes, recognizing the user's gesture; and
    determining whether there is a response corresponding to the user's gesture, and if so, executing the response.
  2. The method according to claim 1, wherein the determining whether there is a response corresponding to the user's gesture comprises:
    acquiring a preset operation permission range of the user, and determining whether there is a response corresponding to the user's gesture within the operation permission range.
  3. The method according to claim 1 or 2, wherein the executing the response comprises:
    acquiring a preset gesture operation priority of the user, and executing the response according to the gesture operation priority.
  4. The method according to claim 1, wherein
    the biometric information comprises one or more of palm print information, size information, and skin color information.
  5. A gesture recognition apparatus, comprising:
    an acquiring module, configured to acquire a user's gesture and biometric information of the user's hand through an image sensor module;
    a verification module, configured to authenticate the user according to the biometric information;
    an identification module, configured to recognize the user's gesture when the verification module's authentication of the user succeeds;
    a determining module, configured to determine whether there is a response corresponding to the user's gesture; and
    an executing module, configured to execute the response when the determining module determines that there is a response corresponding to the user's gesture.
  6. The apparatus according to claim 5, wherein the determining module comprises:
    a first acquiring unit, configured to acquire a preset operation permission range of the user; and
    a determining unit, configured to determine whether there is a response corresponding to the user's gesture within the operation permission range.
  7. The apparatus according to claim 5 or 6, wherein the executing module comprises:
    a second acquiring unit, configured to acquire a preset gesture operation priority of the user; and
    an executing unit, configured to execute the response according to the gesture operation priority.
  8. The apparatus according to claim 5, wherein
    the biometric information comprises one or more of palm print information, size information, and skin color information.
  9. An electronic device, comprising an image sensor module and a processor, wherein:
    the image sensor module is configured to collect gesture images;
    the processor is configured to acquire a user's gesture and biometric information of the user's hand according to the gesture images collected by the image sensor module;
    the processor is further configured to authenticate the user according to the biometric information and, if the verification passes, to recognize the user's gesture; and
    the processor is further configured to determine whether there is a response corresponding to the user's gesture and, if so, to execute the response.
  10. The device according to claim 9, wherein the specific manner in which the processor determines whether there is a response corresponding to the user's gesture is:
    acquiring a preset operation permission range of the user, and determining whether there is a response corresponding to the user's gesture within the operation permission range.
  11. The device according to claim 9 or 10, wherein the specific manner in which the processor executes the response is:
    acquiring a preset gesture operation priority of the user, and executing the response according to the gesture operation priority.
PCT/CN2015/100337 2015-12-31 2015-12-31 Gesture recognition method and apparatus, and electronic device WO2017113407A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2015/100337 WO2017113407A1 (zh) 2015-12-31 2015-12-31 Gesture recognition method and apparatus, and electronic device
CN201580066876.9A CN107533599B (zh) 2015-12-31 2015-12-31 Gesture recognition method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/100337 WO2017113407A1 (zh) 2015-12-31 2015-12-31 Gesture recognition method and apparatus, and electronic device

Publications (1)

Publication Number Publication Date
WO2017113407A1 true WO2017113407A1 (zh) 2017-07-06

Family

ID=59224360

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/100337 WO2017113407A1 (zh) 2015-12-31 2015-12-31 Gesture recognition method and apparatus, and electronic device

Country Status (2)

Country Link
CN (1) CN107533599B (zh)
WO (1) WO2017113407A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110908822A (zh) * 2019-11-26 2020-03-24 珠海格力电器股份有限公司 智能硬件防误碰方法、装置、存储介质及电子设备
CN113050434A (zh) * 2019-12-26 2021-06-29 佛山市云米电器科技有限公司 家用电器控制方法、家用电器及计算机可读存储介质
CN113056718A (zh) * 2018-11-14 2021-06-29 华为技术有限公司 手持移动终端操控方法及相关装置
CN114924686A (zh) * 2022-07-20 2022-08-19 深圳市星卡软件技术开发有限公司 汽车诊断设备手势智能响应处理方法、装置、设备及介质

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109376513A (zh) * 2018-08-29 2019-02-22 盐城线尚天使科技企业孵化器有限公司 基于手势控制的双路验证方法和系统
CN111123959B (zh) * 2019-11-18 2023-05-30 亿航智能设备(广州)有限公司 基于手势识别的无人机控制方法及采用该方法的无人机
CN112989959A (zh) * 2021-02-20 2021-06-18 北京鸿合爱学教育科技有限公司 手归属识别方法、装置、电子设备及存储介质
CN114701409B (zh) * 2022-04-28 2023-09-05 东风汽车集团股份有限公司 一种手势交互式智能座椅调节方法和系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
US20100020020A1 (en) * 2007-11-15 2010-01-28 Yuannan Chen System and Method for Typing Using Fingerprint Recognition System
CN103135761A (zh) * 2011-11-22 2013-06-05 创见资讯股份有限公司 利用侦测生物特征执行软件功能的方法和相关电子装置
CN103995998A (zh) * 2014-05-19 2014-08-20 华为技术有限公司 一种非接触手势命令的认证方法以及用户设备
CN104331656A (zh) * 2014-11-22 2015-02-04 广东欧珀移动通信有限公司 一种基于指纹识别传感器安全操作文件的方法及装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110114732A (ko) * 2007-09-24 2011-10-19 애플 인크. 전자 장치 내의 내장형 인증 시스템들
US20130159939A1 (en) * 2011-10-12 2013-06-20 Qualcomm Incorporated Authenticated gesture recognition
CN104809383A (zh) * 2015-04-28 2015-07-29 百度在线网络技术(北京)有限公司 便携式智能容器及其解锁方法和解锁装置

Also Published As

Publication number Publication date
CN107533599B (zh) 2020-10-16
CN107533599A (zh) 2018-01-02

Similar Documents

Publication Publication Date Title
WO2017113407A1 (zh) 一种手势识别方法、装置及电子设备
US10482230B2 (en) Face-controlled liveness verification
KR102223416B1 (ko) 사용자 인증 제스쳐 기법
US9594893B2 (en) Multi-touch local device authentication
US9239917B2 (en) Gesture controlled login
Tian et al. KinWrite: Handwriting-Based Authentication Using Kinect.
US10042995B1 (en) Detecting authority for voice-driven devices
Jain et al. Exploring orientation and accelerometer sensor data for personal authentication in smartphones using touchscreen gestures
CN107223254B (zh) 用于隐藏设置处理的方法、用户装置和存储介质
WO2017067431A1 (zh) 权限控制系统和方法、鼠标器以及计算机系统
JP2017511912A5 (zh)
US20190130411A1 (en) Method and system for data processing
WO2019033576A1 (zh) 人脸姿态检测方法、装置及存储介质
Lu et al. Gesture on: Enabling always-on touch gestures for fast mobile access from the device standby mode
US20110206244A1 (en) Systems and methods for enhanced biometric security
WO2021220423A1 (ja) 認証装置、認証システム、認証方法および認証プログラム
TW202113685A (zh) 人臉辨識的方法及裝置
JP2015018413A (ja) 携帯端末、画像表示方法、及びプログラム
TWI584146B (zh) 基於人臉識別的整合登錄系統及方法
Burgbacher et al. A behavioral biometric challenge and response approach to user authentication on smartphones
JPWO2021130967A5 (zh)
JP2017142614A (ja) 情報処理装置
CN111582078B (zh) 基于生物信息和姿势的操作方法、终端设备及存储介质
US9674185B2 (en) Authentication using individual's inherent expression as secondary signature
CN110809089B (zh) 处理方法和处理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15912031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15912031

Country of ref document: EP

Kind code of ref document: A1