CN111582078B - Operation method based on biological information and gesture, terminal device and storage medium - Google Patents


Info

Publication number
CN111582078B
CN111582078B (application CN202010328853.3A)
Authority
CN
China
Prior art keywords
information
identified
determining
gesture
target image
Prior art date
Legal status
Active
Application number
CN202010328853.3A
Other languages
Chinese (zh)
Other versions
CN111582078A (en)
Inventor
陈建昌
陈皓麟
周黎
劳鹏飞
房雪雁
苏武龙
钟俊楷
Current Assignee
Guangzhou Wedonetech Technology Co ltd
Original Assignee
Guangzhou Wedonetech Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Wedonetech Technology Co ltd
Priority to CN202010328853.3A
Publication of CN111582078A
Application granted
Publication of CN111582078B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70Multimodal biometrics, e.g. combining information from different biometric modalities

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of biometric recognition, and particularly discloses an operation method, a terminal device and a storage medium based on biological information and gestures. The method comprises the following steps: when the part to be identified of the user is detected, acquiring identity verification information of the part to be identified; determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information; and executing a corresponding target operation according to the identity verification information and the target instruction. By collecting the part to be identified, the embodiment of the application can verify the identity of the user and expand the functional applications of the terminal device, thereby optimizing the user experience.

Description

Operation method based on biological information and gesture, terminal device and storage medium
Technical Field
The present application relates to the field of biometric technologies, and in particular, to an operation method, a terminal device, and a storage medium based on biometric information and gestures.
Background
With the development of identity recognition technology, more and more types of recognition technology are applied to identity authentication, including various human biometric modalities such as face recognition, which authenticates by recognizing a face; voiceprint recognition, which authenticates by recognizing a voice; iris recognition, which authenticates by recognizing an iris; and finger vein recognition, which authenticates by recognizing finger veins.
A biometric recognition module in the prior art can only realize a single verification function. As biometric recognition technology is applied more and more widely, a single verification function cannot meet the expanded applications of terminal products, and the user experience is poor.
Disclosure of Invention
The application provides an operation method based on biological information and gestures, terminal equipment and a storage medium, which can expand the functional application of the terminal equipment and optimize user experience.
In a first aspect, the present application provides a method of operation based on biological information and gestures, the method comprising:
when the part to be identified of the user is detected, acquiring identity verification information of the part to be identified;
determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information;
and executing corresponding target operation according to the identity verification information and the target instruction.
In a second aspect, the present application further provides a terminal device, where the terminal device includes a memory and a processor; the memory is used for storing a computer program; the processor is configured to execute the computer program and implement the operation method based on biological information and gestures as described above when the computer program is executed.
In a third aspect, the present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the method of operation based on biological information and gestures as described above.
The application discloses an operation method based on biological information and gestures, a terminal device and a storage medium, wherein the method comprises the following steps: when the part to be identified of the user is detected, acquiring identity verification information of the part to be identified; determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information; and executing corresponding target operation according to the identity verification information and the target instruction. According to the embodiment of the application, the identity verification information and the gesture information of the part to be identified can be obtained, the identity of the user can be verified according to the identity verification information, different types of target instructions can be determined according to the gesture information of the part to be identified, and the terminal equipment can execute corresponding target operations according to the different types of target instructions; the embodiment of the application can realize verification of the identity of the user and expand the functional application of the terminal equipment by collecting the part to be identified, thereby optimizing the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a method of operation based on biological information and gestures provided by an embodiment of the application;
fig. 2 is a schematic structural diagram of a biological information collecting device according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of another method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 4 is a schematic flow chart of another method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 5 is a schematic flow chart of another method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 6 is a schematic flow chart diagram of another method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 7 is a schematic illustration of a scenario of a method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 8 is another schematic illustration of a scenario of a method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 9 is another schematic illustration of a scenario of a method of operation based on biological information and gestures provided by an embodiment of the present application;
FIG. 10 is another schematic illustration of a scenario of a method of operation based on biological information and gestures provided by an embodiment of the present application;
fig. 11 is a schematic block diagram of a structure of a terminal device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The flow diagrams depicted in the figures are merely illustrative and not necessarily all of the elements and operations/steps are included or performed in the order described. For example, some operations/steps may be further divided, combined, or partially combined, so that the order of actual execution may be changed according to actual situations.
It is to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Embodiments of the present application provide an operation method, a terminal device, and a storage medium based on biological information and gestures. Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic flowchart of an operation method based on biological information and gestures, which is provided in an embodiment of the present application, and the method is applied to a terminal device having a biological information acquisition device, where the terminal device may be a terminal product to which the biological information acquisition device is applied, and for example, the terminal device may be an access control device. As shown in fig. 1, the operation method based on the biological information and the gesture specifically includes steps S101 to S103.
S101, when a part to be identified of a user is detected, acquiring identity verification information of the part to be identified.
The part to be identified is a body part for performing biological identification authentication, and the authentication information is information for the biological identification authentication, and the biological identification authentication can be performed according to the authentication information by collecting the authentication information of the part to be identified so as to determine the identity of the user.
When the user needs to perform biometric authentication, the part to be identified is placed in the acquisition space of the biological information acquisition device, so that the identity verification information of the part to be identified can be collected. A trigger end can be arranged on the terminal device, and it can detect the part to be identified either through infrared detection or through contact triggering. For example, the trigger end may be a key disposed in the acquisition space: when the part to be identified of the user is placed in the acquisition space of the biological information acquisition device, the key is pressed by the part to be identified and triggered, the trigger end sends trigger information to the processor, and the processor can determine that the part to be identified of the user is detected. When the part to be identified is detected, its identity verification information can be acquired through the biological information acquisition device.
In one embodiment, the authentication information includes vein feature information, the portion to be identified includes a finger, palm or wrist-to-fingertip area, and the biological information collection device may include an image collection end. As shown in fig. 2, the terminal device 20 may include an image capturing end 30, a triggering end 40, an infrared unit 50, a processor 60 and an optical filter 70, where the image capturing end 30, the triggering end 40 and the infrared unit 50 are all connected to the processor 60, and the processor 60 is used for inputting, processing and outputting data signals, and may coordinate data communication and data processing between other modules. The processor 60 may control the image acquisition terminal 30 to acquire authentication information for authenticating the identity of the user. The processor 60 may also be connected to an operating end 80, where the operating end 80 may be an external other end, and may specifically be determined according to practical applications, and the operating end 80 may be an electronic lock, for example. Optionally, the terminal device further includes an optical equalizing component and other sensors, which can refer to the structural features of the biological information collecting device in the prior art during application, and are not described herein.
Veins are hidden in the body and blood flows through them, and the hemoglobin in blood absorbs infrared light. Therefore, when the part to be identified of the user is detected, the infrared unit 50 is controlled to emit infrared light onto the finger, and the image acquisition end 30 photographs the part to be identified 10, so that vein feature information showing the shape of the veins can be acquired.
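Since hemoglobin absorbs infrared light, veins show up as darker pixels in the captured frame. The following is a minimal illustrative sketch of turning such a frame into a binary vein pattern, not the patent's actual processing; real systems use adaptive thresholds and filtering, and the frame data here is synthetic:

```python
import numpy as np

# Hypothetical 8-bit infrared frame: bright tissue, dark vein pixels.
frame = np.full((6, 6), 200, dtype=np.uint8)
frame[2, 1:5] = 40          # a dark horizontal "vein" of four pixels

# Veins absorb infrared light, so pixels well below the background
# level are treated as vein; the threshold value is an assumption.
threshold = 100
vein_mask = frame < threshold

print(int(vein_mask.sum()))  # 4 vein pixels found
```

A real pipeline would then extract shape features from `vein_mask` and compare them against the enrolled template.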
S102, determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information.
After the authentication information of the part to be identified is obtained, the gesture information of the part to be identified can be determined. The gesture information is feature information reflecting the gesture form of the part to be identified, and it can include a static gesture and/or a dynamic gesture. The authentication information of each user is generally used only to authenticate the identity of the user, i.e. authentication either passes or fails. However, the user can convey richer information by adjusting the gesture of the part to be identified: different gesture information can correspond to different target instructions, which expands the functional applications of the terminal device.
When acquiring authentication information of a part to be identified, the part to be identified may be required to be at a specific position or to take a specific posture to acquire valid authentication information.
When the corresponding target instruction is determined, the gesture of the part to be identified may differ from the gesture used when the authentication information was collected, so the user may need to change the gesture of the part to be identified after the authentication information is collected.
Therefore, a preset time interval can be set, and timing can start from the moment the identity verification information of the part to be identified is acquired; the user can change the gesture of the part to be identified within the preset time interval, so that the terminal device can determine the gesture information of the part to be identified. Optionally, an image of the part to be identified may be acquired by the image acquisition end in order to determine the gesture information of the part to be identified.
The corresponding relation between the plurality of gesture information and the corresponding target instruction can be stored in advance, and after the gesture information of the part to be recognized is determined, the corresponding target instruction can be determined according to the corresponding relation.
S103, executing corresponding target operation according to the identity verification information and the target instruction.
The target instruction may be an instruction that causes the terminal device to execute a corresponding operation. Optionally, the method of the embodiment of the present application is applied to a processor of the terminal device: the processor may determine the corresponding target instruction according to the gesture information of the part to be identified and then execute the corresponding target operation according to the target instruction, where the target operation may be the processor sending corresponding information to the operation end or to other modules in the terminal device. The other module may be, for example, the infrared unit, whose illumination intensity or other parameters may be adjusted by different target instructions.
The terminal device may be a terminal product to which the biological information acquisition device is applied, and the terminal device generally needs to acquire the operation authority of the user when executing the corresponding target operation, so that the identity information of the user can be verified through the identity verification information. After the operation authority of the user is determined, the corresponding target operation can be executed according to the target instruction.
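As a non-authoritative sketch, the three steps S101 to S103 can be strung together as a single pass; every class, function and mapping name here is a hypothetical placeholder for illustration, not part of the patent:

```python
# Minimal sketch of the S101-S103 flow. All names are illustrative.
GESTURE_TO_COMMAND = {"A": "unlock", "B": "lock"}   # assumed mapping

class StubSensor:
    """Stand-in for the biological information acquisition device."""
    def detect_part(self):
        return True                      # pretend the trigger end fired
    def capture_auth_info(self):
        return "vein-features-user-1"    # S101: authentication information
    def read_gesture(self, window_s):
        return "A"                       # S102: gesture within the interval

def execute(auth_info, command, enrolled="vein-features-user-1"):
    # S103: execute the target operation only if identity verification passes
    return command if auth_info == enrolled else "rejected"

def run_once(sensor, window_s=2.0):
    if not sensor.detect_part():                 # S101: part present?
        return None
    auth_info = sensor.capture_auth_info()       # S101
    gesture = sensor.read_gesture(window_s)      # S102
    command = GESTURE_TO_COMMAND.get(gesture)    # S102: gesture -> instruction
    return execute(auth_info, command) if command else None

print(run_once(StubSensor()))  # unlock
```

The real device would replace `StubSensor` with the trigger end, infrared unit and image acquisition end described above.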
Optionally, as shown in fig. 3, the performing the corresponding target operation according to the authentication information and the target instruction may be implemented as follows:
s1031, determining whether the identity verification information accords with preset identity information.
S1032, if the identity verification information accords with the preset identity information, executing corresponding target operation according to the target instruction.
The identity of the user can be determined according to the identity verification information, and then the authority of the user is determined, namely whether the user has authority to operate the terminal equipment is judged.
The preset identity information is the identity verification information of the part to be identified of the target user, which is acquired in advance, and can be, for example, vein feature information of the finger of the target user acquired in advance, and is used as the preset identity information for verifying whether the user is the target user or not; the target user is a user who has authority to operate the terminal device.
If the identity verification information accords with the preset identity information, the user is indicated to have the authority for operating the terminal equipment, so that the target instruction can be determined to be from the user with the authority, and the corresponding target operation can be executed according to the target instruction.
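A toy sketch of S1031/S1032, assuming a simple feature-similarity check as the matcher; the matcher, tolerance value and scalar feature representation are all assumptions made for illustration:

```python
# Stand-in for enrolled vein features of the target user (assumed values).
PRESET_IDENTITY = {0.12, 0.55, 0.81}

def matches_preset(features, preset=PRESET_IDENTITY, tol=0.05):
    """Toy matcher: every enrolled feature has a close counterpart."""
    return all(any(abs(f - p) <= tol for f in features) for p in preset)

def perform_operation(features, instruction):
    if not matches_preset(features):    # S1031: permission / identity check
        return "denied"
    return instruction                  # S1032: execute the target operation

print(perform_operation([0.11, 0.56, 0.80], "unlock"))  # unlock
print(perform_operation([0.90, 0.10, 0.30], "unlock"))  # denied
```

Production matchers compare high-dimensional vein templates, but the control flow (verify first, then act on the instruction) is the same.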
In one embodiment, as shown in fig. 4, the operation method based on biological information and gestures provided in the embodiment of the present application may be further implemented as follows:
s201, when a part to be identified of a user is detected, acquiring identity verification information of the part to be identified.
S202, determining whether the identity verification information accords with preset identity information.
S203, if the identity verification information accords with preset identity information, determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information.
S204, executing corresponding target operation according to the target instruction.
In this embodiment of the present application, whether the authentication information conforms to the preset identity information may be determined before determining the gesture information of the part to be identified in the preset time interval; only when the authentication information conforms to the preset identity information, that is, when the authenticated user has the operation right, is the operation of determining the gesture information of the part to be identified in the preset time interval performed.
Therefore, the operation of determining the gesture information of the part to be identified in the preset time interval can be avoided when the identity of the user fails to be verified, the unnecessary work of the terminal equipment is avoided, and the power consumption of the terminal equipment is reduced.
Because the authentication information is determined to conform to the preset identity information before the gesture is recognized, the target instruction determined afterwards has already passed the user-authority verification, and the corresponding target operation can be executed directly according to the target instruction.
According to the operation method based on the biological information and the gesture disclosed by the embodiment of the application, the identity verification information and the gesture information of the part to be identified can be obtained, the identity of a user can be verified according to the identity verification information, meanwhile, different types of target instructions can be determined according to the gesture information of the part to be identified, and further, the terminal equipment can execute corresponding target operations according to the different types of target instructions; the embodiment of the application can realize verification of the identity of the user and expand the functional application of the terminal equipment by collecting the part to be identified, thereby optimizing the user experience.
In one embodiment, the acquiring the authentication information of the portion to be identified may be implemented as follows:
and acquiring a first target image of the part to be identified through an image acquisition end, and determining the identity verification information of the part to be identified according to the first target image.
The authentication information may be confirmed through an image acquired by the image acquisition end. Illustratively, the authentication information may be vein feature information: infrared light irradiates the part to be identified, the image acquisition end photographs it to obtain the first target image, and since the first target image shows the vein shape, the vein feature information can be extracted from the first target image.
Accordingly, the determining of the gesture information of the part to be identified in the preset time interval may be implemented as follows:
acquiring a second target image of the part to be identified at a preset moment through an image acquisition end, and determining the gesture information of the part to be identified according to the second target image or determining the gesture information of the part to be identified according to the first target image and the second target image; the interval between the preset time and the acquisition time of the first target image is smaller than or equal to a preset time interval.
After the first target image is acquired by the image acquisition end, the gesture information of the part to be identified can continue to be determined through the image acquisition end: a second target image of the part to be identified is acquired, and the gesture information of the part to be identified can be determined according to the second target image.
The second target image may be acquired at a preset moment, a certain time interval after the first target image is acquired. If whether the identity verification information conforms to the preset identity information is judged before the gesture information of the part to be identified is determined, the processor of the terminal device can perform that judgment before acquiring the second target image. If instead the judgment is performed before executing the corresponding target operation according to the target instruction, a certain time can be reserved before acquiring the second target image for the user to change the gesture of the part to be identified.
The interval between the preset time and the acquisition time of the first target image may be determined according to the actual situation, so long as the interval is smaller than or equal to a preset time interval.
In one embodiment, the interval between the preset time and the acquisition time of the first target image may be determined to be a preset time interval, that is, a second target image acquired after the preset time interval has elapsed after the acquisition of the first target image.
In one embodiment, after the authentication information of the part to be identified is obtained, if the movement of the part to be identified is detected, when the movement of the part to be identified is stopped, a second target image of the part to be identified is acquired through an image acquisition end.
The image acquisition end can continuously acquire the image of the part to be identified so as to detect whether the part to be identified moves, and can determine a preset moment when the part to be identified is detected to stop moving, namely, the image acquisition end acquires a second target image of the part to be identified. In this embodiment, the interval between the preset time and the acquisition time of the first target image is not fixed, and is specifically determined according to the time length of the user moving the part to be identified. The acquisition process changes along with the operation of the user, and no pause feeling exists, so that the use experience of the user can be optimized.
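The "capture when movement stops" behaviour can be approximated with frame differencing: keep sampling frames and take the second target image once consecutive frames barely change. The threshold value and frame format below are assumptions:

```python
import numpy as np

def first_still_frame(frames, motion_threshold=5.0):
    """Return the index of the first frame whose mean absolute
    difference from the previous frame falls below the threshold,
    i.e. the moment the part to be identified stops moving."""
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float))
        if diff.mean() < motion_threshold:
            return i
    return None

# Synthetic sequence: the bright "finger" pixel shifts for two frames,
# then holds still, so the third transition is the still one.
f0 = np.zeros((4, 4), dtype=np.uint8); f0[1, 0] = 255
f1 = np.zeros((4, 4), dtype=np.uint8); f1[1, 2] = 255
f2 = np.zeros((4, 4), dtype=np.uint8); f2[1, 3] = 255
f3 = f2.copy()                          # no further movement

print(first_still_frame([f0, f1, f2, f3]))  # 3
```

The frame selected this way plays the role of the second target image, so the acquisition follows the user's motion without a fixed pause.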
In one embodiment, the portion to be identified may include a finger, palm, or wrist-to-fingertip area, and the determining pose information of the portion to be identified according to the second target image may be implemented by:
and determining the number of fingers included in the second target image, and determining the number of fingers as gesture information.
When the part to be identified is the user's finger, palm or wrist-to-fingertip area, the user can form different gestures by extending different numbers of fingers, and corresponding target instructions can then be determined according to the number of fingers.
For example, when the second target image includes one finger, the gesture information may be an a gesture, and the target instruction corresponding to the corresponding a gesture may be a first instruction, where the first instruction may be an operation for causing the terminal device to perform unlocking; when the second target image comprises two fingers, the gesture information can be a B gesture, the target instruction corresponding to the corresponding B gesture can be a second instruction, and the second instruction can be an operation for enabling the terminal equipment to execute locking; when the second target image comprises three fingers, the gesture information can be a C gesture, the target instruction corresponding to the corresponding C gesture can be a third instruction, and the third instruction can be an operation for enabling the terminal device to execute a new user registration; when the second target image includes four fingers, the gesture information may be a D gesture, and the target instruction corresponding to the corresponding D gesture may be a fourth instruction, where the fourth instruction may be an operation for enabling the terminal device to execute deleting the user account.
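The A to D gesture example above amounts to a small lookup table. The instruction names mirror the example in the text, while the `instruction_from_image` helper is a hypothetical stand-in, with the actual finger counting left to real image analysis:

```python
# Finger count -> target instruction, following the A-D gesture example.
FINGER_COUNT_TO_INSTRUCTION = {
    1: "unlock",           # A gesture: perform unlocking
    2: "lock",             # B gesture: perform locking
    3: "register_user",    # C gesture: register a new user
    4: "delete_account",   # D gesture: delete the user account
}

def instruction_from_image(finger_count):
    """Map the number of fingers in the second target image to a command."""
    return FINGER_COUNT_TO_INSTRUCTION.get(finger_count, "unknown")

print(instruction_from_image(2))  # lock
print(instruction_from_image(5))  # unknown
```

Unrecognized counts fall through to `"unknown"`, which a real device would treat as no valid target instruction.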
For example, if the preset identity information is the vein feature information of the user's right index finger, the user extends the right index finger when the identity verification information is collected. The user can then switch the gesture by also extending the right middle finger; the image acquisition end captures both the right index finger and the right middle finger, and the gesture can be determined as the B gesture.
In this embodiment, the authentication information may be full-hand vein feature information, i.e. vein feature information covering all fingers. Correspondingly, the biological information acquisition device has enough acquisition space for the user to change the number of extended fingers.
In one embodiment, the gesture information may be a dynamic gesture, the gesture information includes a movement track, and as shown in fig. 5, the determining the gesture information of the portion to be identified according to the first target image and the second target image may be implemented as follows:
s301, determining first position information of the part to be identified according to the first target image, and determining second position information of the part to be identified according to the second target image.
S302, determining the moving track of the part to be identified according to the first position information and the second position information.
The gesture information may also be a dynamic gesture, that is, the gesture information includes a movement track of the portion to be recognized, where the movement track includes a position before the portion to be recognized moves and a position after the portion to be recognized moves.
The first target image is the image used to determine the identity verification information, i.e. it is captured before the part to be identified moves; the second target image is acquired a certain time interval after the first target image, i.e. after the part to be identified has moved. Therefore, the first position information of the part to be identified before the movement can be determined from the first target image, the second position information after the movement can be determined from the second target image, and the movement track of the part to be identified can be determined from the first position information and the second position information.
By taking the movement track as gesture information, the gesture information can be further enriched, so that the functional requirements of the terminal product can be met even when the product supports many types of target operations.
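Steps S301 and S302 can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the `Position` type and the distance/heading representation of the movement track are assumptions made for the example.

```python
from dataclasses import dataclass
import math

@dataclass
class Position:
    x: float
    y: float

def movement_track(first: Position, second: Position) -> dict:
    """Derive a simple movement track from the position of the part to be
    identified before it moves (first) and after it moves (second)."""
    dx = second.x - first.x
    dy = second.y - first.y
    return {
        "distance": math.hypot(dx, dy),                      # how far the part moved
        "direction_deg": math.degrees(math.atan2(dy, dx)),   # heading of the move
    }

track = movement_track(Position(0.0, 0.0), Position(3.0, 4.0))
print(track["distance"])  # 5.0
```

In practice the two positions would be extracted from the first and second target images; here they are passed in directly to keep the sketch self-contained.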
In one embodiment, the portion to be identified includes a finger, as shown in fig. 6, the determining the first location information of the portion to be identified according to the first target image and the determining the second location information of the portion to be identified according to the second target image may be implemented as follows:
S401, determining a first placement angle of the finger according to the first target image, and determining a second placement angle of the finger according to the second target image.
S402, determining the moving direction and the moving angle of the finger according to the first placing angle and the second placing angle, and determining the moving direction and the moving angle as moving tracks.
The part to be identified can be a finger. Since a finger is elongated, its movement track can be determined from its placement angle, that is, the angle between the finger and a preset direction. For example, the preset direction may be the direction in which the finger is placed in the biological information acquisition device; when the first target image is acquired, the finger is generally aligned with the preset direction, so the first placement angle may be 0 degrees. After the first target image is acquired, the finger can move, and the angle between the moved finger and the preset direction, determined from the second target image, is the second placement angle.
According to the first placement angle and the second placement angle, the movement direction and the movement angle of the finger can be determined, and the movement direction and the movement angle can be determined as a movement track. More types of gesture information can be determined according to different movement directions and different movement angles of the finger, and further more types of target instructions can be determined.
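Steps S401 and S402 reduce to comparing the two placement angles. The sketch below assumes angles in degrees relative to the preset direction, with a positive difference treated as clockwise; that sign convention is an assumption for illustration, not specified by the embodiment.

```python
def movement_from_placement(first_angle: float, second_angle: float):
    """Determine the finger's movement direction and movement angle from its
    placement angles in the first and second target images. The first
    placement angle is typically 0 degrees (finger aligned with the preset
    direction)."""
    delta = second_angle - first_angle
    direction = "clockwise" if delta > 0 else "counterclockwise"
    return direction, abs(delta)  # (movement direction, movement angle)

print(movement_from_placement(0.0, 45.0))   # ('clockwise', 45.0)
print(movement_from_placement(0.0, -80.0))  # ('counterclockwise', 80.0)
```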
In one embodiment, the determining the target instruction according to the gesture information includes:
if the moving direction belongs to a first direction and the moving angle belongs to a first angle range, determining a target instruction of the gesture information as a first instruction;
if the moving direction belongs to the second direction and the moving angle belongs to the first angle range, determining a target instruction of the gesture information as a second instruction;
if the moving direction belongs to the first direction and the moving angle belongs to the second angle range, determining a target instruction of the gesture information as a third instruction;
and if the moving direction belongs to the second direction and the moving angle belongs to the second angle range, determining the target instruction of the gesture information as a fourth instruction.
The movable range of the user's finger can be divided into several areas; moving the finger into one of these areas determines one piece of gesture information and thus a corresponding target instruction. The first direction and the second direction may be opposite, that is, moving the finger in different directions can represent different gesture information, and the first angle range and the second angle range may be disjoint. Illustratively, the first angle range may be 30 degrees to 70 degrees and the second angle range may be 70 degrees to 110 degrees; the first direction may be clockwise and the second direction may be counterclockwise.
As shown in fig. 7 to 10: in fig. 7, the finger moves from the first placement angle 11 to the second placement angle 12; the movement direction is clockwise and the movement angle is 45 degrees, so the target instruction corresponding to the gesture information in fig. 7 can be determined as the first instruction, which may cause the terminal device to perform an unlocking operation. In fig. 8, the finger moves from the first placement angle 21 to the second placement angle 22; the movement direction is counterclockwise and the movement angle is 45 degrees, so the target instruction can be determined as the second instruction, which may cause the terminal device to perform a locking operation. In fig. 9, the finger moves from the first placement angle 31 to the second placement angle 32; the movement direction is clockwise and the movement angle is 80 degrees, so the target instruction can be determined as the third instruction, which may cause the terminal device to register a new user. In fig. 10, the finger moves from the first placement angle 41 to the second placement angle 42; the movement direction is counterclockwise and the movement angle is 80 degrees, so the target instruction can be determined as the fourth instruction, which may cause the terminal device to delete a user account.
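The direction-and-angle-to-instruction mapping above can be sketched as a small lookup. The ranges and directions follow the embodiment's illustrative values; since 70 degrees is the shared boundary of the two stated ranges, assigning it to the second range is an assumption, and the instruction names mirror figs. 7-10.

```python
from typing import Optional

def target_instruction(direction: str, angle: float) -> Optional[str]:
    """Map a finger's movement direction and movement angle to a target
    instruction: first angle range 30-70 degrees, second angle range
    70-110 degrees; first direction clockwise, second counterclockwise."""
    if 30 <= angle < 70:
        # first instruction (unlock) / second instruction (lock)
        return "unlock" if direction == "clockwise" else "lock"
    if 70 <= angle <= 110:
        # third instruction (register new user) / fourth instruction (delete account)
        return "register_user" if direction == "clockwise" else "delete_account"
    return None  # angle falls outside both ranges: no instruction

print(target_instruction("clockwise", 45))         # unlock
print(target_instruction("counterclockwise", 80))  # delete_account
```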
Referring to fig. 11, fig. 11 is a schematic block diagram of a structure of a terminal device according to an embodiment of the present application. The terminal device may be a terminal product to which the biological information collecting apparatus is applied.
Referring to fig. 11, the terminal device 100 includes a processor 110, a memory 120, and a bio-information collection device 130 connected through a system bus, wherein the memory 120 may include a non-volatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program comprises program instructions that, when executed, cause the processor to perform any of the biological information and gesture based operation methods provided in the embodiments of the present application.
The biological information acquisition device is used for acquiring the identity verification information of the part to be identified.
The processor is used to provide computing and control capabilities to support the operation of the entire terminal device.
The internal memory provides an environment for running the computer program stored in the non-volatile storage medium; when executed by the processor, the computer program causes the processor to perform any of the biological information and gesture based operation methods provided in the embodiments of the present application.
It will be appreciated by those skilled in the art that the structure shown in fig. 11 is merely a block diagram of the part of the structure related to the solution of the present application and does not limit the terminal device to which the solution is applied; a particular terminal device may include more or fewer components than shown, combine some of the components, or arrange the components differently.
It should be appreciated that the processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. Wherein the general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Wherein in one embodiment the processor is configured to run a computer program stored in the memory to implement the steps of:
when the part to be identified of the user is detected, acquiring identity verification information of the part to be identified;
determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information;
and executing corresponding target operation according to the identity verification information and the target instruction.
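The three processor steps above can be sketched end to end. This is an illustrative flow only: `FakeDevice` and all of its method names are hypothetical stand-ins for the biological information acquisition device and the gesture/instruction logic, and the identity check follows the refinement described in the claims.

```python
class FakeDevice:
    """Minimal stand-in used only to exercise the flow; all names here
    are illustrative, not part of the patent."""
    def acquire_authentication_info(self):
        return "user-veins"                 # e.g. vein feature information
    def determine_gesture(self, timeout):
        return ("clockwise", 45)            # gesture within the preset interval
    def instruction_for(self, gesture):
        return "unlock"                     # target instruction for the gesture
    def execute(self, instruction):
        return f"executed:{instruction}"    # corresponding target operation

def operate(device, preset_identity, preset_interval=2.0):
    """Acquire authentication info, check it against the preset identity,
    determine the gesture within the preset time interval, and execute
    the target operation for the resulting target instruction."""
    auth_info = device.acquire_authentication_info()
    if auth_info != preset_identity:        # reject non-conforming identities
        return None
    gesture = device.determine_gesture(timeout=preset_interval)
    return device.execute(device.instruction_for(gesture))

print(operate(FakeDevice(), "user-veins"))  # executed:unlock
```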
In one embodiment, before implementing the determining of the posture information of the portion to be identified in the preset time interval, the processor is further configured to implement:
determining whether the identity verification information accords with preset identity information;
the processor is configured to, when implementing the determining the gesture information of the portion to be identified in the preset time interval, implement:
if the identity verification information accords with the preset identity information, determining the gesture information of the part to be identified in a preset time interval;
in one embodiment, the processor is configured to, when implementing the performing the corresponding target operation according to the authentication information and the target instruction, implement:
and executing corresponding target operation according to the target instruction.
In one embodiment, the processor is configured to, when implementing the acquiring the authentication information of the portion to be identified, implement:
acquiring a first target image of the part to be identified through an image acquisition end, and determining identity verification information of the part to be identified according to the first target image;
in one embodiment, the processor is configured to, when implementing the determining the pose information of the portion to be identified within the preset time interval, implement:
acquiring a second target image of the part to be identified at a preset moment through an image acquisition end, and determining the gesture information of the part to be identified according to the second target image or determining the gesture information of the part to be identified according to the first target image and the second target image;
the interval between the preset time and the acquisition time of the first target image is smaller than or equal to a preset time interval.
In one embodiment, the authentication information includes vein feature information, and the portion to be identified includes a finger, palm, or wrist-to-fingertip area.
In one embodiment, the gesture information includes a movement track, and the processor is configured to, when implementing the determining the gesture information of the portion to be identified according to the first target image and the second target image, implement:
determining first position information of the part to be identified according to the first target image, and determining second position information of the part to be identified according to the second target image;
and determining the moving track of the part to be identified according to the first position information and the second position information.
In one embodiment, the portion to be identified includes a finger, and the processor is configured to, when implementing the determining, according to the first target image, first location information of the portion to be identified, and determining, according to the second target image, second location information of the portion to be identified, implement:
determining a first placement angle of the finger according to the first target image, and determining a second placement angle of the finger according to the second target image;
and determining the moving direction and the moving angle of the finger according to the first placing angle and the second placing angle, and determining the moving direction and the moving angle as moving tracks.
In one embodiment, the processor, when implementing the determining the target instruction according to the gesture information, is configured to implement:
if the moving direction belongs to a first direction and the moving angle belongs to a first angle range, determining a target instruction of the gesture information as a first instruction;
if the moving direction belongs to the second direction and the moving angle belongs to the first angle range, determining a target instruction of the gesture information as a second instruction;
if the moving direction belongs to the first direction and the moving angle belongs to the second angle range, determining a target instruction of the gesture information as a third instruction;
and if the moving direction belongs to the second direction and the moving angle belongs to the second angle range, determining the target instruction of the gesture information as a fourth instruction.
In one embodiment, the processor is configured, when implementing the determining, according to the second target image, pose information of the portion to be identified, to implement:
and determining the number of fingers included in the second target image, and determining the number of fingers as gesture information.
In an embodiment of the present application, a computer readable storage medium is further provided, where the computer readable storage medium stores a computer program, where the computer program includes program instructions, and the processor executes the program instructions to implement any of the methods of operation provided in the embodiments of the present application based on biological information and gestures.
The computer readable storage medium may be an internal storage unit of the terminal device described in the foregoing embodiments, for example, a hard disk or memory of the terminal device. It may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) equipped on the terminal device.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (9)

1. A method of operation based on biological information and gestures, the method comprising:
when a part to be identified of a user is detected, acquiring identity verification information of the part to be identified, wherein the part to be identified is a body part for performing biological identification authentication, and the identity verification information is information for the biological identification authentication;
determining whether the identity verification information accords with preset identity information;
if the identity verification information accords with the preset identity information, determining the gesture information of the part to be identified in a preset time interval, and determining a target instruction according to the gesture information;
and executing corresponding target operation according to the target instruction.
2. The method of claim 1, wherein the acquiring authentication information of the part to be identified comprises:
acquiring a first target image of the part to be identified through an image acquisition end, and determining identity verification information of the part to be identified according to the first target image;
the determining the gesture information of the part to be identified in the preset time interval comprises the following steps:
acquiring a second target image of the part to be identified at a preset moment through an image acquisition end, and determining the gesture information of the part to be identified according to the second target image or determining the gesture information of the part to be identified according to the first target image and the second target image;
the interval between the preset time and the acquisition time of the first target image is smaller than or equal to a preset time interval.
3. The method of claim 2, wherein the authentication information includes vein feature information and the portion to be identified includes a finger, palm, or wrist-to-fingertip area.
4. The method according to claim 2, wherein the gesture information includes a movement trajectory, and the determining the gesture information of the portion to be recognized from the first target image and the second target image includes:
determining first position information of the part to be identified according to the first target image, and determining second position information of the part to be identified according to the second target image;
and determining the moving track of the part to be identified according to the first position information and the second position information.
5. The method of claim 4, wherein the portion to be identified includes a finger, wherein determining first location information of the portion to be identified based on the first target image, and determining second location information of the portion to be identified based on the second target image comprises:
determining a first placement angle of the finger according to the first target image, and determining a second placement angle of the finger according to the second target image;
and determining the moving direction and the moving angle of the finger according to the first placing angle and the second placing angle, and determining the moving direction and the moving angle as moving tracks.
6. The method of claim 5, wherein determining the target instruction from the gesture information comprises:
if the moving direction belongs to a first direction and the moving angle belongs to a first angle range, determining a target instruction of the gesture information as a first instruction;
if the moving direction belongs to the second direction and the moving angle belongs to the first angle range, determining a target instruction of the gesture information as a second instruction;
if the moving direction belongs to the first direction and the moving angle belongs to the second angle range, determining a target instruction of the gesture information as a third instruction;
and if the moving direction belongs to the second direction and the moving angle belongs to the second angle range, determining the target instruction of the gesture information as a fourth instruction.
7. The method according to claim 2, wherein determining the pose information of the part to be recognized from the second target image includes:
and determining the number of fingers included in the second target image, and determining the number of fingers as gesture information.
8. A terminal device, characterized in that the terminal device comprises a memory, a processor and a biological information acquisition device;
the biological information acquisition device is used for acquiring identity verification information of a part to be identified, wherein the part to be identified is a body part for performing biological identification authentication, and the identity verification information is information for the biological identification authentication;
the memory is used for storing a computer program;
the processor for executing the computer program and implementing the biological information and gesture based operation method according to any one of claims 1 to 7 when the computer program is executed.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the biological information and gesture-based operation method according to any one of claims 1 to 7.
CN202010328853.3A 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium Active CN111582078B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010328853.3A CN111582078B (en) 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010328853.3A CN111582078B (en) 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium

Publications (2)

Publication Number Publication Date
CN111582078A CN111582078A (en) 2020-08-25
CN111582078B true CN111582078B (en) 2023-11-07

Family

ID=72113094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010328853.3A Active CN111582078B (en) 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium

Country Status (1)

Country Link
CN (1) CN111582078B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113111844B (en) * 2021-04-28 2022-02-15 中德(珠海)人工智能研究院有限公司 Operation posture evaluation method and device, local terminal and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109656359A (en) * 2018-11-26 2019-04-19 深圳奥比中光科技有限公司 3D body feeling interaction adaptation method, system, terminal device and readable storage medium storing program for executing
CN110248025A (en) * 2019-06-12 2019-09-17 读书郎教育科技有限公司 The personal identification method of more fingerprints and palm print information, device and storage medium
WO2019218880A1 (en) * 2018-05-16 2019-11-21 Oppo广东移动通信有限公司 Interaction recognition method and apparatus, storage medium, and terminal device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019218880A1 (en) * 2018-05-16 2019-11-21 Oppo广东移动通信有限公司 Interaction recognition method and apparatus, storage medium, and terminal device
CN109656359A (en) * 2018-11-26 2019-04-19 深圳奥比中光科技有限公司 3D body feeling interaction adaptation method, system, terminal device and readable storage medium storing program for executing
CN110248025A (en) * 2019-06-12 2019-09-17 读书郎教育科技有限公司 The personal identification method of more fingerprints and palm print information, device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on identity authentication technology based on gravity sensors; Zhang Man; Xian Hequn; Zhang Shuguang; Netinfo Security (09); full text *

Also Published As

Publication number Publication date
CN111582078A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
JP6397036B2 (en) Dynamic keyboard and touchscreen biometrics
Tian et al. KinWrite: Handwriting-Based Authentication Using Kinect.
EP3100152B1 (en) User-authentication gestures
KR101598771B1 (en) Method and apparatus for authenticating biometric by using face recognizing
TWI595374B (en) User identification with biokinematic input
US20170185760A1 (en) Face-Controlled Liveness Verification
Liu et al. Exploiting eye tracking for smartphone authentication
US20140118520A1 (en) Seamless authorized access to an electronic device
JP4410543B2 (en) Personal identification device
CN107533599B (en) Gesture recognition method and device and electronic equipment
US20180173863A1 (en) Biometric authentication of a user
KR20170046448A (en) Method and device for complex authentication
CN105678147B (en) Touch operation method and device
WO2021220423A1 (en) Authentication device, authentication system, authentication method, and authentication program
US20210049392A1 (en) Authentication method for an electronic device
EP3612966A1 (en) Access control for access restricted domains using first and second biometric data
KR102205495B1 (en) Method and apparatus for recognizing finger print
CN111582078B (en) Operation method based on biological information and gesture, terminal device and storage medium
US20230045850A1 (en) Fingerprint Capturing and Matching for Authentication
WO2018161312A1 (en) Fingerprint identification method and apparatus
US20210406353A1 (en) System and method for biometric authentication
US8826392B2 (en) Device and method for authenticating biological information
Chen et al. Modeling interactive sensor-behavior with smartphones for implicit and active user authentication
Kumar et al. Fingerprint based authentication system with keystroke dynamics for realistic user
Ducray et al. Authentication based on a changeable biometric using gesture recognition with the kinect™

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant