CN111582078A - Operation method based on biological information and gesture, terminal device and storage medium - Google Patents


Publication number
CN111582078A
Authority
CN
China
Legal status
Granted
Application number
CN202010328853.3A
Other languages
Chinese (zh)
Other versions
CN111582078B (en)
Inventor
陈建昌
陈皓麟
周黎
劳鹏飞
房雪雁
苏武龙
钟俊楷
Current Assignee
Guangzhou Wedone Technology Co ltd
Original Assignee
Guangzhou Wedone Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Wedone Technology Co ltd filed Critical Guangzhou Wedone Technology Co ltd
Priority to CN202010328853.3A priority Critical patent/CN111582078B/en
Publication of CN111582078A publication Critical patent/CN111582078A/en
Application granted granted Critical
Publication of CN111582078B publication Critical patent/CN111582078B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70Multimodal biometrics, e.g. combining information from different biometric modalities


Abstract

The application relates to the technical field of biometric identification, and in particular discloses an operation method based on biological information and gestures, a terminal device, and a storage medium. The method comprises the following steps: when a part to be identified of a user is detected, acquiring identity authentication information of the part to be identified; determining posture information of the part to be recognized within a preset time interval, and determining a target instruction according to the posture information; and executing a corresponding target operation according to the identity authentication information and the target instruction. According to the embodiments of the application, collecting the part to be recognized both verifies the identity of the user and expands the functional applications of the terminal device, optimizing the user experience.

Description

Operation method based on biological information and gesture, terminal device and storage medium
Technical Field
The present application relates to the field of biometric identification technologies, and in particular to an operation method based on biological information and gestures, a terminal device, and a storage medium.
Background
With the development of identity recognition technology, more and more types of recognition technology are applied to identity authentication, including various kinds of human biometric recognition, such as face recognition, which authenticates by recognizing a face; voiceprint recognition, which authenticates by recognizing a voice; iris recognition, which authenticates by recognizing the iris; finger vein recognition, which authenticates by recognizing finger veins; and the like.
A biometric identification module in the prior art can realize only a single verification function. As biometric identification technology is applied ever more widely, a single verification function cannot meet the demand for expanded applications of terminal products, and the user experience is poor.
Disclosure of Invention
The application provides an operation method based on biological information and gestures, a terminal device and a storage medium, which can expand the functional application of the terminal device and optimize user experience.
In a first aspect, the present application provides a method of operation based on biological information and gestures, the method comprising:
when a part to be identified of a user is detected, acquiring identity authentication information of the part to be identified;
determining posture information of the part to be recognized in a preset time interval, and determining a target instruction according to the posture information;
and executing corresponding target operation according to the identity authentication information and the target instruction.
In a second aspect, the present application further provides a terminal device, where the terminal device includes a memory and a processor; the memory is used for storing a computer program; the processor is configured to execute the computer program and to implement the operation method based on the biological information and the gesture as described above when the computer program is executed.
In a third aspect, the present application also provides a computer-readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the operation method based on biological information and gestures as described above.
The application discloses an operation method based on biological information and gestures, a terminal device, and a storage medium, wherein the method comprises the following steps: when a part to be identified of a user is detected, acquiring identity authentication information of the part to be identified; determining posture information of the part to be recognized within a preset time interval, and determining a target instruction according to the posture information; and executing a corresponding target operation according to the identity authentication information and the target instruction. In the embodiments of the application, both the identity verification information and the posture information of the part to be recognized are obtained: the identity of the user can be verified according to the identity verification information, different types of target instructions can be determined according to the collected posture information of the part to be recognized, and the terminal device can then execute the corresponding target operation for each type of instruction. By collecting the part to be recognized, the embodiments of the application can both verify the identity of the user and expand the functional applications of the terminal device, optimizing the user experience.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart diagram of a method of operation based on biological information and gestures provided by an embodiment of the present application;
fig. 2 is a schematic structural diagram of a biological information acquisition device provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart diagram of another operation method based on biological information and gestures provided by the embodiments of the present application;
FIG. 4 is a schematic flow chart diagram of another operation method based on biological information and gestures provided by the embodiments of the present application;
FIG. 5 is a schematic flow chart diagram of another operation method based on biological information and gestures provided by the embodiments of the present application;
FIG. 6 is a schematic flow chart diagram of another operation method based on biological information and gestures provided by the embodiments of the present application;
FIG. 7 is a schematic diagram of a scenario of an operation method based on biological information and gestures provided by an embodiment of the present application;
FIG. 8 is another scenario schematic diagram of an operation method based on biological information and gestures provided by an embodiment of the present application;
FIG. 9 is another schematic diagram of a scenario of an operation method based on biological information and gestures provided by an embodiment of the present application;
FIG. 10 is another scenario schematic diagram of an operation method based on biological information and gestures provided by an embodiment of the present application;
fig. 11 is a schematic block diagram of a structure of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments that a person skilled in the art can derive from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Embodiments of the present application provide an operation method, a terminal device, and a storage medium based on biological information and gestures. Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic flowchart of an operation method based on biological information and gestures according to an embodiment of the present application, where the method is applied to a terminal device having a biological information collection apparatus, and the terminal device may be a terminal product to which the biological information collection apparatus is applied, for example, the terminal device may be an entrance guard device. As shown in fig. 1, the operation method based on the biological information and the gesture specifically includes steps S101 to S103.
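As a rough sketch, steps S101 to S103 can be expressed as the following Python pseudologic. The `device` object and every method name on it are hypothetical helpers introduced only for illustration; they are not part of the patent.

```python
def operate(device):
    """Sketch of steps S101-S103; `device` and its methods are assumptions."""
    if not device.detect_part():                   # S101: trigger end fires
        return None
    auth = device.acquire_auth_info()              # S101: e.g. vein features
    gesture = device.posture_within_interval()     # S102: posture in preset interval
    instruction = device.instruction_for(gesture)  # S102: posture -> target instruction
    return device.execute(auth, instruction)       # S103: target operation
```

The individual steps are detailed in the sections that follow.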
S101, when a part to be identified of a user is detected, acquiring identity verification information of the part to be identified.
The part to be identified is a body part used for biometric authentication, and the identity verification information is the information on which that authentication is based; by acquiring the identity verification information of the part to be identified, biometric authentication can be performed according to the identity verification information to determine the identity of the user.
When a user wants to perform biometric authentication, the part to be identified is placed in the acquisition space of the biological information acquisition device, so that the identity verification information of the part to be identified can be collected. A trigger end can be arranged on the terminal device; the trigger end can detect the part to be identified through infrared detection, or through contact triggering. For example, the trigger end may be a key disposed in the acquisition space: when the part to be identified is placed in the acquisition space of the biological information acquisition device, the part presses the key, the trigger end sends trigger information to the processor, and the processor determines that the part to be identified has been detected. When the part to be identified of the user is detected, the identity verification information of the part to be identified can be acquired through the biological information acquisition device.
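The trigger condition above (either infrared detection or contact triggering) reduces to a simple predicate; the parameter names below are illustrative assumptions:

```python
def part_detected(key_pressed: bool, ir_beam_broken: bool) -> bool:
    """A part to be identified counts as detected when either the contact
    key in the acquisition space is pressed or the infrared detector
    fires -- the two trigger modes named in the text."""
    return key_pressed or ir_beam_broken
```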
In one embodiment, the identity verification information includes vein feature information; the part to be identified includes a finger, a palm, or the region from the wrist to the fingertips; and the biological information acquisition device may include an image acquisition end. As shown in fig. 2, the terminal device 20 may include an image acquisition end 30, a trigger end 40, an infrared unit 50, a processor 60, and an optical filter 70, wherein the image acquisition end 30, the trigger end 40, and the infrared unit 50 are all connected to the processor 60. The processor 60 inputs, processes, and outputs data signals, and coordinates data communication and data processing among the other modules. The processor 60 may control the image acquisition end 30 to collect the identity verification information used to authenticate the user. The processor 60 may also be connected to an operation end 80; the operation end 80 may be another external component, determined according to the practical application, and may, for example, be an electronic lock. Optionally, the terminal device further includes a light-equalizing component, other sensors, and the like, which may follow the structural features of prior-art biological information acquisition devices and are not described again here.
The veins are hidden inside the body, blood flows through them, and the hemoglobin in the blood absorbs infrared light. Therefore, when the part to be identified of the user is detected, the infrared unit 50 is controlled to emit infrared light onto the part to be identified 10, and the image acquisition end 30 captures an image of it, so that vein feature information showing the shape of the veins can be acquired.
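Because hemoglobin absorbs infrared light, veins show up as darker pixels in the captured image. A minimal sketch of extracting a vein-shape mask is a per-pixel intensity threshold; the threshold value and the plain-list image representation are illustrative assumptions, not the patent's actual extraction method:

```python
def extract_vein_mask(ir_image, threshold=80):
    """Binary vein mask from an infrared image (2-D list of pixel
    intensities): pixels darker than `threshold` are marked 1 (vein),
    others 0. Threshold value is illustrative."""
    return [[1 if px < threshold else 0 for px in row] for row in ir_image]
```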
S102, determining posture information of the part to be recognized in a preset time interval, and determining a target instruction according to the posture information.
After the identity verification information of the part to be recognized is acquired, the posture information of the part to be recognized can be determined again, the posture information is characteristic information used for representing the posture form of the part to be recognized, and the posture information can comprise a static posture and/or a dynamic posture. The authentication information of each user is generally used only to authenticate the user's identity, i.e., either authenticated or not authenticated. However, the user can transmit richer information by adjusting the posture information of the part to be recognized, different posture information can be corresponding to different target instructions, and the functional application of the terminal equipment can be expanded.
When acquiring the authentication information of the part to be recognized, the part to be recognized may need to be in a specific position or adopt a specific gesture to acquire effective authentication information.
The gesture used to indicate the corresponding target instruction may differ from the gesture adopted when the identity verification information is acquired; therefore, after the identity verification information is acquired, the user may need to change the posture of the part to be recognized.
Therefore, a preset time interval may be set, timing may be started when the authentication information of the part to be recognized is acquired, and the user may switch the posture of the part to be recognized within the preset time interval, so that the terminal device may determine the posture information of the part to be recognized. Alternatively, an image of the part to be recognized may be acquired by the image acquisition terminal to determine the posture information of the part to be recognized.
The corresponding relationship between the plurality of posture information and the corresponding target instructions may be stored in advance, and after the posture information of the part to be recognized is determined, the corresponding target instructions may be determined according to the corresponding relationship.
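The pre-stored correspondence can be as simple as a lookup table; the Python representation below is an illustrative assumption:

```python
from typing import Dict, Optional

def target_instruction(posture: str,
                       correspondence: Dict[str, str]) -> Optional[str]:
    """Look up the target instruction for the determined posture
    information in a pre-stored correspondence table; returns None
    when the posture has no stored instruction."""
    return correspondence.get(posture)
```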
S103, executing corresponding target operation according to the identity authentication information and the target instruction.
The target instruction may be an instruction that causes the terminal device to execute a corresponding operation. Optionally, the method of the embodiments of the present application runs on a processor of the terminal device: the processor determines the corresponding target instruction according to the posture information of the part to be recognized, and then executes a corresponding target operation according to that instruction. For example, the target operation may be that the processor sends corresponding information, according to the target instruction, to the operation end or to other modules in the terminal device. The other modules may be, for example, the infrared unit, whose light intensity or other parameters may be adjusted by different target instructions.
The terminal device may be a terminal product applied to the biological information acquisition device, and the terminal device generally needs to acquire the operation authority of the user when executing the corresponding target operation, so that the identity information of the user can be verified through the identity verification information. After the operation authority of the user is determined, the corresponding target operation can be executed according to the target instruction.
Optionally, as shown in fig. 3, the executing the corresponding target operation according to the identity verification information and the target instruction may be implemented by:
and S1031, determining whether the identity authentication information accords with preset identity information.
S1032, if the identity authentication information accords with the preset identity information, executing corresponding target operation according to the target instruction.
The identity of the user can be determined according to the identity authentication information, so that the user authority is determined, namely whether the user has the authority to operate the terminal equipment is judged.
The preset identity information is identity verification information of a part to be identified of a target user, which is acquired in advance, and illustratively, vein feature information of a finger of the target user, which is acquired in advance, and is used as the preset identity information for verifying whether the user is the target user later; the target user is a user having the authority to operate the terminal device.
If the identity authentication information conforms to the preset identity information, the user is indicated to have the authority to operate the terminal equipment, so that the target instruction can be determined to be from the authorized user, and further the corresponding target operation can be executed according to the target instruction.
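Steps S1031 and S1032 can be sketched as a guard followed by a dispatch; the `dispatch` table (instruction name to operation callable) and the failure return value are illustrative assumptions:

```python
from typing import Callable, Dict

def execute_target_operation(auth_info: str,
                             preset_identity: str,
                             instruction: str,
                             dispatch: Dict[str, Callable[[], str]]) -> str:
    """The target operation runs only if the identity verification
    information conforms to the preset identity information (S1031);
    otherwise the instruction is ignored (the user lacks authority)."""
    if auth_info != preset_identity:          # S1031: authority check
        return "authentication_failed"
    return dispatch[instruction]()            # S1032: execute target operation
```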
In one embodiment, as shown in fig. 4, the operation method based on the biological information and the gesture provided in the embodiment of the present application may be further implemented by:
s201, when a part to be identified of a user is detected, acquiring identity verification information of the part to be identified.
S202, determining whether the identity authentication information accords with preset identity information.
S203, if the identity verification information accords with preset identity information, determining posture information of the part to be recognized in a preset time interval, and determining a target instruction according to the posture information.
And S204, executing corresponding target operation according to the target instruction.
In this embodiment of the present application, it may be determined whether the identity verification information conforms to the preset identity information before determining the posture information of the to-be-recognized part within the preset time interval, and the operation of determining the posture information of the to-be-recognized part within the preset time interval is performed only when the identity verification information conforms to the preset identity information, that is, when the verified user has the operation right.
Therefore, the operation of determining the posture information of the part to be recognized in the preset time interval can be avoided when the identity of the user fails to be verified, the terminal equipment is prevented from doing redundant work, and the power consumption of the terminal equipment is reduced.
And the authentication information of the determined user is executed in advance to accord with the preset identity information, so that the determined target instruction is the target instruction subjected to the user authority authentication, and the corresponding target operation can be directly executed according to the target instruction.
According to the operation method based on the biological information and the gesture, the identity verification information and the gesture information of the part to be recognized can be obtained, the identity of the user can be verified according to the identity verification information, meanwhile, different types of target instructions can be determined according to the collected gesture information of the part to be recognized, and then the terminal equipment can execute corresponding target operation according to the different types of target instructions; according to the embodiment of the application, the identity of the user can be verified and the functional application of the terminal equipment can be expanded by collecting the part to be recognized, and the user experience is optimized.
In an embodiment, the obtaining of the authentication information of the to-be-identified portion may be implemented by:
and acquiring a first target image of the part to be identified through an image acquisition end, and determining the identity verification information of the part to be identified according to the first target image.
The identity verification information may be vein feature information. Infrared rays are directed at the part to be identified, and the image acquisition end captures a first target image of it; since the first target image shows the vein shape, the vein feature information can be obtained from the first target image.
Accordingly, the determination of the posture information of the part to be recognized within the preset time interval may be implemented as follows:
acquiring a second target image of the part to be recognized through an image acquisition end at a preset moment, and determining the posture information of the part to be recognized according to the second target image or determining the posture information of the part to be recognized according to the first target image and the second target image; and the interval between the preset moment and the acquisition moment of the first target image is less than or equal to a preset time interval.
After the first target image is acquired by the image acquisition end, the posture information of the part to be recognized can be determined continuously by the image acquisition end. And acquiring a second target image of the part to be recognized through the image acquisition terminal, and determining the posture information of the part to be recognized according to the second target image.
The second target image can be acquired at a preset moment, a certain time interval after the first target image is acquired. If, as in the implementation above, whether the identity verification information conforms to the preset identity information is judged before determining the posture information of the part to be recognized within the preset time interval, then the processor of the terminal device can perform that judgment before the second target image is acquired. If, instead, the judgment is performed only before the corresponding target operation is executed according to the target instruction, a certain time can still be reserved before the second target image is collected, giving the user time to change the posture of the part to be recognized.
The interval between the preset time and the acquisition time of the first target image can be determined according to actual conditions, and the interval is only less than or equal to a preset time interval.
In one embodiment, the interval between the preset time and the acquisition time of the first target image may be determined as a preset time interval, that is, a second target image acquired after the preset time interval elapses after the first target image is acquired.
In one embodiment, after the identity verification information of the part to be recognized is acquired, if the part to be recognized is detected to move, when the part to be recognized stops moving, a second target image of the part to be recognized is acquired through an image acquisition end.
The image acquisition end can continuously acquire images of the part to be identified to detect whether it is moving, and can take the moment at which the part stops moving as the preset moment, at which the second target image of the part to be recognized is acquired. In this embodiment, the interval between the preset moment and the acquisition moment of the first target image is not fixed; it is determined by how long the user takes to move the part to be recognized. The acquisition process thus follows the user's actions without any sense of pause, which optimizes the user experience.
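One common way to detect that the part has stopped moving is frame differencing over consecutive images; the patent does not specify the method, so the sketch below, including the tolerance value and the plain-list frame representation, is an assumption:

```python
def has_stopped(prev_frame, cur_frame, tolerance=5.0):
    """The image acquisition end keeps sampling frames; when two
    consecutive frames (2-D lists of pixel intensities) differ by
    less than `tolerance` per pixel on average, the part to be
    recognized is taken to have stopped moving, and the second
    target image can be captured."""
    diffs = [abs(a - b)
             for row_a, row_b in zip(prev_frame, cur_frame)
             for a, b in zip(row_a, row_b)]
    return sum(diffs) / len(diffs) < tolerance
```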
In one embodiment, the part to be recognized may include a finger, a palm or a region from a wrist to a fingertip, and the determining of the gesture information of the part to be recognized according to the second target image may be implemented by:
determining the number of fingers included in the second target image, and determining the number of fingers as gesture information.
When the part to be recognized is the region from the finger, the palm or the wrist of the user to the fingertip, the user can determine different postures according to different numbers of the extending fingers, and then determine the corresponding target instruction according to different numbers of the fingers.
Illustratively, when one finger is included in the second target image, the gesture information may be an a gesture, and the target instruction corresponding to the corresponding a gesture may be a first instruction, and the first instruction may be an operation for causing the terminal device to perform unlocking; when the second target image includes two fingers, the posture information may be a B gesture, the corresponding target instruction corresponding to the B gesture may be a second instruction, and the second instruction may be an operation for causing the terminal device to perform locking; when the second target image includes three fingers, the gesture information may be a C gesture, the target instruction corresponding to the corresponding C gesture may be a third instruction, and the third instruction may be an operation for causing the terminal device to perform registration of a new user; when four fingers are included in the second target image, the gesture information may be a D-gesture, and the target instruction corresponding to the corresponding D-gesture may be a fourth instruction, which may be an operation of causing the terminal device to delete the user account.
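Chaining the two mappings above (finger count to gesture, gesture to instruction) gives a direct lookup; the instruction names follow the unlock/lock/register/delete operations named in the text, while the Python representation itself is an illustrative assumption:

```python
FINGERS_TO_GESTURE = {1: "A", 2: "B", 3: "C", 4: "D"}
GESTURE_TO_INSTRUCTION = {"A": "unlock",        # first instruction
                          "B": "lock",          # second instruction
                          "C": "register_user", # third instruction
                          "D": "delete_user"}   # fourth instruction

def instruction_from_finger_count(n_fingers: int):
    """Map the number of extended fingers in the second target image
    to a target instruction; returns None for counts with no stored
    gesture."""
    gesture = FINGERS_TO_GESTURE.get(n_fingers)
    return GESTURE_TO_INSTRUCTION.get(gesture) if gesture else None
```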
For example, suppose the preset identity information is vein feature information of the user's right index finger, so that when the identity verification information is collected, the user extends the right index finger and its vein feature information is collected. If the user then needs to perform the locking operation, the user can change the gesture by also extending the right middle finger; the image acquisition end captures both the right index finger and the right middle finger, and the gesture can be determined to be the B gesture.
In this embodiment, the identity verification information may be full-hand vein feature information, which may include the vein feature information of all fingers and may further include palm vein feature information. Correspondingly, the acquisition space in the biological information acquisition device has enough room for the user to change the number of extended fingers.
In one embodiment, the gesture information may be a dynamic gesture, the gesture information includes a movement track, and the determining the gesture information of the part to be recognized according to the first target image and the second target image, as shown in fig. 5, may be implemented by:
s301, determining first position information of the part to be recognized according to the first target image, and determining second position information of the part to be recognized according to the second target image.
S302, determining the moving track of the part to be identified according to the first position information and the second position information.
The gesture information may also be a dynamic gesture, that is, the gesture information includes a movement track of the part to be recognized, and the movement track includes a position before the part to be recognized moves and a position after the part to be recognized moves.
The first target image is the image used to determine the identity verification information, i.e., captured before the part to be identified moves, while the second target image is acquired a certain time interval after the first, i.e., after the part has moved. Therefore, the first position information of the part to be recognized before it moves can be determined from the first target image, the second position information after it moves can be determined from the second target image, and the moving track of the part to be recognized can be determined from the first position information and the second position information.
Taking the movement track as the gesture information further enriches the gesture information, so that the functional requirements of a terminal product with many types of target operations can still be met.
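The movement-track computation in steps S301 and S302 can be sketched as follows. This is a minimal illustration, not from the patent; it assumes the two positions are (x, y) coordinates extracted from the first and second target images:

```python
import math

def movement_track(first_pos, second_pos):
    """Derive a movement track from the position of the part to be
    recognized before moving (first target image) and after moving
    (second target image). Positions are assumed (x, y) coordinates."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    distance = math.hypot(dx, dy)               # how far the part moved
    heading = math.degrees(math.atan2(dy, dx))  # in which direction it moved
    return {"distance": distance, "heading_deg": heading}
```

The returned distance/heading pair is one possible encoding of the track; the patent leaves the concrete representation open.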
In one embodiment, the portion to be recognized includes a finger, and as shown in fig. 6, the determining the first position information of the portion to be recognized according to the first target image and the determining the second position information of the portion to be recognized according to the second target image may be implemented by:
S401, determining a first placement angle of the finger according to the first target image, and determining a second placement angle of the finger according to the second target image.
S402, determining the moving direction and the moving angle of the finger according to the first placing angle and the second placing angle, and determining the moving direction and the moving angle as a moving track.
The part to be recognized can be a finger; since a finger is elongated in shape, its movement track can be determined from its placement angle. The placement angle may be the included angle between the finger and a preset direction. For example, the preset direction may be the direction in which the finger is inserted into the biological information collecting device; when the first target image is collected, the finger is generally aligned with the preset direction, so the first placement angle may be 0 degrees. After the first target image is collected, the finger can move, and the included angle between the moved finger and the preset direction, determined from the second target image, is taken as the second placement angle.
The moving direction and moving angle of the finger can then be determined from the first placement angle and the second placement angle, and this pair of moving direction and moving angle is taken as the movement track. Different moving directions and moving angles of the finger yield more types of gesture information, and thus more types of target instructions.
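Steps S401 and S402 reduce to a simple angle difference. A hedged sketch follows; treating a positive difference as clockwise is an illustrative convention, not something the patent specifies:

```python
def movement_from_angles(first_angle_deg, second_angle_deg):
    """Derive the finger's moving direction and moving angle from its first
    and second placement angles (degrees measured from the preset insertion
    direction). Positive difference = clockwise is an assumed convention."""
    delta = second_angle_deg - first_angle_deg
    direction = "clockwise" if delta >= 0 else "counterclockwise"
    return direction, abs(delta)  # the (direction, angle) pair forms the track
```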
In one embodiment, the determining a target instruction from the gesture information comprises:
if the moving direction belongs to a first direction and the moving angle belongs to a first angle range, determining that the target instruction of the gesture information is a first instruction;
if the moving direction belongs to a second direction and the moving angle belongs to the first angle range, determining that the target instruction of the gesture information is a second instruction;
if the moving direction belongs to the first direction and the moving angle belongs to a second angle range, determining that the target instruction of the gesture information is a third instruction;
and if the moving direction belongs to the second direction and the moving angle belongs to the second angle range, determining that the target instruction of the gesture information is a fourth instruction.
Several regions can be defined within the movable range of the user's finger; when the finger moves into one of the regions, one piece of gesture information, and hence the corresponding target instruction, is determined. The first and second directions may be opposite, that is, movements of the finger in different directions represent different gesture information, and the first and second angle ranges may be disjoint. Illustratively, the first angle range may be 30 to 70 degrees and the second angle range 70 to 110 degrees; the first direction may be clockwise and the second direction counterclockwise.
As shown in fig. 7 to 10: in fig. 7 the finger moves from the first placement angle 11 to the second placement angle 12, the moving direction is clockwise and the moving angle is 45 degrees, so the target instruction corresponding to the gesture information may be determined to be the first instruction, which may cause the terminal device to perform unlocking. In fig. 8 the finger moves from the first placement angle 21 to the second placement angle 22, the moving direction is counterclockwise and the moving angle is 45 degrees, so the target instruction may be determined to be the second instruction, which may cause the terminal device to perform locking. In fig. 9 the finger moves from the first placement angle 31 to the second placement angle 32, the moving direction is clockwise and the moving angle is 80 degrees, so the target instruction may be determined to be the third instruction, which may cause the terminal device to register a new user. In fig. 10 the finger moves from the first placement angle 41 to the second placement angle 42, the moving direction is counterclockwise and the moving angle is 80 degrees, so the target instruction may be determined to be the fourth instruction, which may cause the terminal device to delete a user account.
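Using the illustrative values above (first angle range 30 to 70 degrees, second range 70 to 110 degrees, clockwise vs. counterclockwise), the four-way instruction mapping could look like the sketch below. The instruction names and the half-open range convention are assumptions for illustration:

```python
FIRST_RANGE = (30, 70)    # example values from the description
SECOND_RANGE = (70, 110)

def target_instruction(direction, angle_deg):
    """Map the finger's (moving direction, moving angle) to one of the four
    target instructions: unlock, lock, register a user, delete a user."""
    def within(bounds):
        lo, hi = bounds
        return lo <= angle_deg < hi  # half-open bound keeps the ranges disjoint
    if within(FIRST_RANGE):
        return "unlock" if direction == "clockwise" else "lock"
    if within(SECOND_RANGE):
        return "register_user" if direction == "clockwise" else "delete_user"
    return None  # angle outside both ranges: no instruction recognized
```

With these tables, the 45-degree clockwise movement of fig. 7 maps to the unlock instruction and the 80-degree counterclockwise movement of fig. 10 maps to deleting a user account.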
Referring to fig. 11, fig. 11 is a schematic block diagram of a structure of a terminal device according to an embodiment of the present application. The terminal device may be a terminal product to which the biological information collecting apparatus is applied.
Referring to fig. 11, the terminal device 100 includes a processor 110, a memory 120, and a biological information collecting apparatus 130 connected by a system bus, wherein the memory 120 may include a nonvolatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program includes program instructions that, when executed, cause a processor to perform any of the operation methods based on biological information and gestures.
The biological information acquisition device is used for acquiring the identity verification information of the part to be identified.
The processor is used for providing calculation and control capability and supporting the operation of the whole terminal equipment.
The internal memory provides an environment for running the computer program stored in the non-volatile storage medium; when executed by the processor, the computer program causes the processor to perform any of the operation methods based on biological information and gestures.
Those skilled in the art will appreciate that the structure shown in fig. 11 is a block diagram of only the portion of the structure relevant to the present application and does not limit the terminal device to which the present application is applied; a particular terminal device may include more or fewer components than shown in the drawings, combine some components, or arrange the components differently.
It should be understood that the processor may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or any conventional processor.
In one embodiment, the processor is configured to execute a computer program stored in the memory to implement the following steps:
when a part to be identified of a user is detected, acquiring identity authentication information of the part to be identified;
determining gesture information of the part to be recognized within a preset time interval, and determining a target instruction according to the gesture information;
and executing corresponding target operation according to the identity authentication information and the target instruction.
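The three steps above amount to an authenticate-then-dispatch pipeline. A minimal sketch follows; the dictionary-based dispatch and all names are illustrative choices, not from the patent:

```python
def run_pipeline(auth_info, preset_identity, gesture_info, instruction_table):
    """(1) compare the acquired authentication information against the preset
    identity information; (2) map the gesture information to a target
    instruction; (3) execute the corresponding target operation."""
    if auth_info != preset_identity:
        return "rejected"  # identity verification failed: do nothing further
    operation = instruction_table.get(gesture_info)
    return operation() if operation else "unknown_gesture"
```

A caller would supply, for example, enrolled vein features as `preset_identity` and a table mapping each recognized gesture to an unlock or lock routine.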
In one embodiment, before implementing the determining of the gesture information of the part to be recognized within the preset time interval, the processor is further configured to implement:
determining whether the identity authentication information conforms to preset identity information;
when implementing the determining of the gesture information of the part to be recognized within the preset time interval, the processor is configured to implement:
if the identity authentication information conforms to the preset identity information, determining the gesture information of the part to be recognized within the preset time interval;
in one embodiment, when implementing the corresponding target operation executed according to the authentication information and the target instruction, the processor is configured to implement:
and executing corresponding target operation according to the target instruction.
In one embodiment, when implementing the acquiring of the authentication information of the portion to be identified, the processor is configured to implement:
acquiring a first target image of the part to be identified through an image acquisition end, and determining the identity verification information of the part to be identified according to the first target image;
in one embodiment, when implementing the determining of the gesture information of the part to be recognized within a preset time interval, the processor is configured to implement:
acquiring a second target image of the part to be recognized through the image acquisition end at a preset moment, and determining the gesture information of the part to be recognized according to the second target image, or determining the gesture information of the part to be recognized according to the first target image and the second target image;
and the interval between the preset moment and the acquisition moment of the first target image is less than or equal to a preset time interval.
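The timing constraint — the preset moment for the second image must lie no later than the first image's acquisition moment plus the preset time interval — can be checked as follows (a sketch; representing times in seconds is an assumption):

```python
def second_capture_valid(first_capture_t, preset_t, preset_interval):
    """Return True when the preset moment for capturing the second target
    image falls within the preset time interval after the first target
    image's capture time, and not before it (times in seconds)."""
    return 0 <= preset_t - first_capture_t <= preset_interval
```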
In one embodiment, the authentication information includes vein feature information, and the part to be recognized includes a finger, a palm, or the region from the wrist to the fingertips.
In one embodiment, the gesture information includes a movement trajectory, and the processor, when implementing the determining of the gesture information of the part to be recognized from the first target image and the second target image, is configured to implement:
determining first position information of the part to be recognized according to the first target image, and determining second position information of the part to be recognized according to the second target image;
and determining the moving track of the part to be identified according to the first position information and the second position information.
In one embodiment, the part to be recognized comprises a finger, and the processor, when implementing the determining of the first position information of the part to be recognized according to the first target image and the determining of the second position information of the part to be recognized according to the second target image, is configured to implement:
determining a first placement angle of the finger according to the first target image, and determining a second placement angle of the finger according to the second target image;
and determining the moving direction and the moving angle of the finger according to the first placing angle and the second placing angle, and determining the moving direction and the moving angle as a moving track.
In one embodiment, when implementing the determining of a target instruction according to the gesture information, the processor is configured to implement:
if the moving direction belongs to a first direction and the moving angle belongs to a first angle range, determining that the target instruction of the gesture information is a first instruction;
if the moving direction belongs to a second direction and the moving angle belongs to the first angle range, determining that the target instruction of the gesture information is a second instruction;
if the moving direction belongs to the first direction and the moving angle belongs to a second angle range, determining that the target instruction of the gesture information is a third instruction;
and if the moving direction belongs to the second direction and the moving angle belongs to the second angle range, determining that the target instruction of the gesture information is a fourth instruction.
In one embodiment, when implementing the determining of the gesture information of the part to be recognized according to the second target image, the processor is configured to implement:
determining the number of fingers included in the second target image, and determining the number of fingers as gesture information.
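When the finger count itself serves as the gesture information, instruction lookup becomes a direct mapping. The table below is an assumed example; the patent only states that the number of fingers is the gesture information, not which instructions the counts map to:

```python
FINGER_COUNT_INSTRUCTIONS = {1: "unlock", 2: "lock", 3: "register_user"}  # illustrative mapping

def instruction_from_finger_count(finger_count):
    """Treat the number of fingers detected in the second target image as
    the gesture information and look up the corresponding instruction."""
    return FINGER_COUNT_INSTRUCTIONS.get(finger_count, "no_instruction")
```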
Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, where the computer program includes program instructions, and the processor executes the program instructions to implement any one of the operation methods based on biological information and gestures provided in the embodiments of the present application.
The computer-readable storage medium may be an internal storage unit of the terminal device described in the foregoing embodiment, for example, a hard disk or a memory of the terminal device. The computer readable storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the terminal device.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method of operation based on biological information and gestures, the method comprising:
when a part to be identified of a user is detected, acquiring identity authentication information of the part to be identified;
determining gesture information of the part to be recognized within a preset time interval, and determining a target instruction according to the gesture information;
and executing corresponding target operation according to the identity authentication information and the target instruction.
2. The method according to claim 1, wherein before the determining of the gesture information of the part to be recognized within a preset time interval, the method further comprises:
determining whether the identity authentication information conforms to preset identity information;
the determining of the gesture information of the part to be recognized within the preset time interval comprises:
if the identity authentication information conforms to the preset identity information, determining the gesture information of the part to be recognized within the preset time interval;
the executing the corresponding target operation according to the identity authentication information and the target instruction comprises:
and executing corresponding target operation according to the target instruction.
3. The operation method based on biological information and gesture according to claim 1 or 2, wherein the obtaining of the authentication information of the part to be recognized comprises:
acquiring a first target image of the part to be identified through an image acquisition end, and determining the identity verification information of the part to be identified according to the first target image;
the determining the posture information of the part to be recognized in the preset time interval comprises the following steps:
acquiring a second target image of the part to be recognized through an image acquisition end at a preset moment, and determining the posture information of the part to be recognized according to the second target image or determining the posture information of the part to be recognized according to the first target image and the second target image;
and the interval between the preset moment and the acquisition moment of the first target image is less than or equal to a preset time interval.
4. The biological information and gesture based operation method according to claim 3, wherein the authentication information includes vein feature information, and the portion to be recognized includes a finger, a palm, or a wrist-to-fingertip region.
5. The biological information and gesture based operation method according to claim 3, wherein the gesture information includes a movement trajectory, and the determining the gesture information of the part to be recognized from the first target image and the second target image includes:
determining first position information of the part to be recognized according to the first target image, and determining second position information of the part to be recognized according to the second target image;
and determining the moving track of the part to be identified according to the first position information and the second position information.
6. The biological information and gesture based operation method according to claim 5, wherein the part to be recognized comprises a finger, the determining first position information of the part to be recognized according to the first target image and the determining second position information of the part to be recognized according to the second target image comprise:
determining a first placement angle of the finger according to the first target image, and determining a second placement angle of the finger according to the second target image;
and determining the moving direction and the moving angle of the finger according to the first placing angle and the second placing angle, and determining the moving direction and the moving angle as a moving track.
7. The method of claim 6, wherein the determining a target instruction according to the gesture information comprises:
if the moving direction belongs to a first direction and the moving angle belongs to a first angle range, determining that the target instruction of the gesture information is a first instruction;
if the moving direction belongs to a second direction and the moving angle belongs to the first angle range, determining that the target instruction of the gesture information is a second instruction;
if the moving direction belongs to the first direction and the moving angle belongs to a second angle range, determining that the target instruction of the gesture information is a third instruction;
and if the moving direction belongs to the second direction and the moving angle belongs to the second angle range, determining that the target instruction of the gesture information is a fourth instruction.
8. The method according to claim 3, wherein the determining the gesture information of the part to be recognized according to the second target image comprises:
determining the number of fingers included in the second target image, and determining the number of fingers as gesture information.
9. A terminal device, characterized in that the terminal device comprises a memory, a processor, and a biological information acquisition device;
the biological information acquisition device is used for acquiring the identity verification information of the part to be identified;
the memory is used for storing a computer program;
the processor for executing the computer program and implementing the method of operation based on biological information and gestures according to any one of claims 1 to 8 when executing the computer program.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the biological information and gesture based operation method according to any one of claims 1 to 8.
CN202010328853.3A 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium Active CN111582078B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010328853.3A CN111582078B (en) 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium


Publications (2)

Publication Number Publication Date
CN111582078A true CN111582078A (en) 2020-08-25
CN111582078B CN111582078B (en) 2023-11-07

Family

ID=72113094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010328853.3A Active CN111582078B (en) 2020-04-23 2020-04-23 Operation method based on biological information and gesture, terminal device and storage medium

Country Status (1)

Country Link
CN (1) CN111582078B (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019218880A1 (en) * 2018-05-16 2019-11-21 Oppo广东移动通信有限公司 Interaction recognition method and apparatus, storage medium, and terminal device
CN109656359A (en) * 2018-11-26 2019-04-19 深圳奥比中光科技有限公司 3D body feeling interaction adaptation method, system, terminal device and readable storage medium storing program for executing
CN110248025A (en) * 2019-06-12 2019-09-17 读书郎教育科技有限公司 The personal identification method of more fingerprints and palm print information, device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张曼;咸鹤群;张曙光;: "基于重力传感器的身份认证技术研究", 信息网络安全 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113111844A (en) * 2021-04-28 2021-07-13 中德(珠海)人工智能研究院有限公司 Operation posture evaluation method and device, local terminal and readable storage medium
CN113111844B (en) * 2021-04-28 2022-02-15 中德(珠海)人工智能研究院有限公司 Operation posture evaluation method and device, local terminal and readable storage medium

Also Published As

Publication number Publication date
CN111582078B (en) 2023-11-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant