CN117111727A - Hand direction detection method, electronic device and readable medium - Google Patents

Hand direction detection method, electronic device and readable medium

Info

Publication number
CN117111727A
CN117111727A (application CN202310189865.6A)
Authority
CN
China
Prior art keywords
hand
finger
image
coordinates
fingers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310189865.6A
Other languages
Chinese (zh)
Inventor
田宇
杜远超
朱世宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310189865.6A priority Critical patent/CN117111727A/en
Publication of CN117111727A publication Critical patent/CN117111727A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107 - Static hand or arm
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a hand direction detection method, an electronic device, and a readable medium. While the display screen of the electronic device is lit, the camera is triggered to capture images, and the electronic device obtains the images captured by the camera. If an image contains a human hand, the angle between the finger direction of the hand in the image and the horizontal direction can be calculated, and the start gesture of human-computer interaction input by the user can thereby be determined. Then, if the electronic device determines that the angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the fingers point in a first direction, it displays on the display screen a hand-shaped icon with its fingers pointing in the first direction; if it determines that the angle satisfies the condition that the fingers point in a second direction, it displays on the display screen a hand-shaped icon with its fingers pointing in the second direction. By displaying the hand-shaped icon on the display screen, the user is reminded that the start gesture has been entered and that an air gesture can now be input for human-computer interaction.

Description

Hand direction detection method, electronic device and readable medium
Technical Field
The present application relates to the field of electronic devices, and in particular, to a method for detecting a hand direction, an electronic device, a computer program product, and a computer readable storage medium.
Background
Currently, electronic devices may support contactless air gesture interaction. Air gesture interaction enables human-computer interaction in scenarios where it is inconvenient for the user to touch the electronic device, which improves the user experience.
A solution is therefore needed that enables an electronic device to respond to a start gesture of human-computer interaction input by the user and to output feedback information reminding the user that an air gesture can now be input for human-computer interaction.
Disclosure of Invention
The present application provides a hand direction detection method, an electronic device, a computer program product, and a computer-readable storage medium, with the aim that, after the electronic device detects the start gesture of human-computer interaction input by the user, it outputs feedback information to remind the user that an air gesture can now be input for human-computer interaction.
In order to achieve the above object, the present application provides the following technical solutions:
In a first aspect, the present application provides a hand direction detection method applied to an electronic device, where the electronic device includes a display screen and a camera. The method includes: in an application scenario where the display screen of the electronic device is in a lit state, obtaining an image captured by the camera of the electronic device, and calculating the angle between the finger direction of the hand in the image and the horizontal direction; if it is determined that the angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the fingers point in a first direction, displaying on the display screen a hand-shaped icon with its fingers pointing in the first direction; and if it is determined that the angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the fingers point in a second direction, displaying on the display screen a hand-shaped icon with its fingers pointing in the second direction. The second direction is generally opposite to the first direction; for example, the first direction is upward and the second direction is downward.
It can be seen from the above that: while the display screen of the electronic device is on, the camera is triggered to capture images; the electronic device obtains an image captured by the camera and, if the image contains a human hand, calculates the angle between the finger direction of the hand in the image and the horizontal direction, thereby determining the start gesture of human-computer interaction input by the user. Then, if the electronic device determines that this angle satisfies the condition that the fingers point in the first direction, it displays on the display screen a hand-shaped icon with its fingers pointing in the first direction; if it determines that the angle satisfies the condition that the fingers point in the second direction, it displays a hand-shaped icon with its fingers pointing in the second direction. By displaying the hand-shaped icon on the display screen, the user is reminded that the start gesture has been entered and that an air gesture can now be input for human-computer interaction.
Further, if the first direction is upward and the second direction is downward, displaying a hand-shaped icon with upward fingers reminds the user that the start gesture of an air gesture has been entered and that an air gesture such as a downward swipe or a grab can now be input. Displaying a hand-shaped icon with downward fingers reminds the user that the start gesture has been entered and that an air gesture such as an upward swipe can now be input. The user can also check, from the hand-shaped icon displayed on the display screen, whether the start gesture he or she entered is correct.
Displaying a hand-shaped icon with upward fingers on the display screen means that the electronic device predicts that the user is about to input an air gesture such as a downward swipe or a grab; the electronic device can therefore invoke, while the hand-shaped icon with upward fingers is displayed, the software flow for responding to air gestures such as a downward swipe or a grab, which improves its response speed. Similarly, displaying a hand-shaped icon with downward fingers means that the electronic device predicts that the user is about to input an air gesture such as an upward swipe; the electronic device can invoke, while the hand-shaped icon with downward fingers is displayed, the software flow for responding to air gestures such as an upward swipe, again improving its response speed.
In one possible implementation, after obtaining the image captured by the camera of the electronic device, the method further includes: detecting that the hand in the image is of the palm/back-of-hand type.
In this possible implementation, after the electronic device obtains the image captured by the camera, it first detects that the hand in the image is of the palm/back-of-hand type, which suggests that the user intends to operate the electronic device with an air gesture. The electronic device then calculates the angle between the finger direction of the hand in the image and the horizontal direction, which serves to verify the prediction that the user intends to operate the electronic device with an air gesture.
In one possible implementation, after obtaining the image captured by the camera of the electronic device, the method further includes: cropping a hand region image from the image; and detecting the coordinates of the hand key points in the hand region image. Calculating the angle between the finger direction of the hand in the image and the horizontal direction then includes: calculating the angle between the finger direction vector of the hand and the horizontal direction using the coordinates of the hand key points.
In one possible implementation, calculating the angle between the finger direction vector of the hand and the horizontal direction using the coordinates of the hand key points includes: calculating the vector from the finger-root coordinates to the fingertip coordinates among the coordinates of the hand key points, to obtain the finger direction vector of the hand in the image; and calculating the angle between the finger direction vector of the hand in the image and the horizontal direction.
In this possible implementation, the hand key points include the finger roots and the fingertips of the fingers. The vector from a finger root to the corresponding fingertip reflects the direction of that finger, so the finger direction vector of the hand in the image can be calculated from the finger-root and fingertip coordinates among the coordinates of the hand key points.
In one possible implementation, calculating the vector from the finger-root coordinates to the fingertip coordinates among the coordinates of the hand key points to obtain the finger direction vector of the hand in the image includes: calculating the average finger-root coordinate of a plurality of fingers in the image from the finger-root coordinates of those fingers among the coordinates of the hand key points, and calculating the average fingertip coordinate of the plurality of fingers from their fingertip coordinates, where the plurality of fingers includes at least the middle finger; and calculating the vector from the average finger-root coordinate of the plurality of fingers to their average fingertip coordinate, to obtain the finger direction vector of the hand in the image.
In this possible implementation, the direction of the middle finger is generally the same as the overall direction of the fingers. In some specific hand shapes, however, the fingers as a whole point upward while the middle finger points in another direction. To avoid the inaccuracy of a finger direction vector computed only from the middle finger's root and fingertip coordinates, the average finger-root coordinate of a plurality of fingers in the image is calculated from the finger-root coordinates among the coordinates of the hand key points, the average fingertip coordinate of the plurality of fingers is calculated from the fingertip coordinates, and the vector from the average finger-root coordinate to the average fingertip coordinate is then taken as the finger direction vector of the hand in the image.
In one possible implementation, calculating the angle between the finger direction vector of the hand and the horizontal direction using the coordinates of the hand key points includes: calculating the finger direction vectors of a plurality of fingers of the hand in the image from the finger-root and fingertip coordinates of those fingers among the coordinates of the hand key points, where the finger direction vectors include at least the finger direction vector of the middle finger; calculating the angles between the finger direction vectors of the plurality of fingers and the horizontal direction, to obtain a plurality of angles; and averaging the plurality of angles to obtain the angle between the finger direction of the hand in the image and the horizontal direction.
In this possible implementation, to avoid the inaccuracy of a finger direction vector computed only from the middle finger's root and fingertip coordinates, the finger direction vectors of a plurality of fingers are calculated from the finger-root and fingertip coordinates of those fingers among the coordinates of the hand key points, the angles between these vectors and the horizontal direction are calculated to obtain a plurality of angles, and the average of these angles is taken as the angle between the finger direction of the hand in the image and the horizontal direction.
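For illustration only, the following Python sketch shows one way to implement this averaging of per-finger angles. The function name, the use of (finger root, fingertip) coordinate pairs as input, and the example coordinates are assumptions of this sketch and are not specified by the present application.

```python
import math
from typing import Sequence, Tuple

Point = Tuple[float, float]

def average_finger_angle(fingers: Sequence[Tuple[Point, Point]]) -> float:
    """Average angle (degrees) between several finger direction vectors and the horizontal x-axis.

    Each entry of `fingers` is a (finger_root, fingertip) coordinate pair; the set should
    include at least the middle finger, as described above. A plain mean of the angles is
    used here, which is adequate away from the +/-180 degree wrap-around.
    """
    angles = []
    for (rx, ry), (tx, ty) in fingers:
        vx, vy = tx - rx, ty - ry                      # finger direction vector: root -> fingertip
        angles.append(math.degrees(math.atan2(vy, vx)))
    return sum(angles) / len(angles)

# Example: index, middle and ring fingers pointing roughly upward. The example assumes
# y increases upward; with y-down image coordinates the sign of the angle is flipped.
fingers = [((2.0, 1.0), (2.1, 4.0)),   # index finger (root, tip)
           ((3.0, 1.0), (3.0, 4.2)),   # middle finger
           ((4.0, 1.0), (3.9, 3.8))]   # ring finger
print(round(average_finger_angle(fingers), 1))         # roughly 90 degrees
```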
In one possible implementation, determining that the angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the fingers point in the first direction includes: determining that the angle between the finger direction of the hand in the image and the horizontal direction lies within the angle range corresponding to upward-pointing fingers.
In one possible implementation, before determining that the angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the fingers point in the first direction, the method further includes: determining that the angle between the finger direction of the hand in the image and the horizontal direction does not satisfy the condition that the fingers point in a third direction.
In this possible implementation, the third direction is another direction; for example, if the first direction is upward and the second direction is downward, the third direction may refer to the left or the right, or to both the left and the right. On this basis, before determining that the angle satisfies the condition that the fingers point in the first direction, it is first determined that the angle does not satisfy the condition that the fingers point in the third direction. This avoids a misjudgment in which an angle that actually satisfies the third-direction condition is taken to satisfy the first-direction condition and a hand-shaped icon is consequently displayed on the display screen.
In one possible implementation, determining that the angle between the finger direction of the hand in the image and the horizontal direction does not satisfy the condition that the fingers point in the third direction includes: determining that the angle does not lie within the angle ranges corresponding to left-pointing or right-pointing fingers.
In one possible implementation, determining that the angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the fingers point in the second direction and displaying on the display screen a hand-shaped icon with its fingers pointing in the second direction includes: if it is determined that the angle does not satisfy the condition that the fingers point in the first direction, displaying on the display screen a hand-shaped icon with its fingers pointing in the second direction.
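As a concrete illustration of these angle-range conditions, the sketch below classifies a computed angle as pointing up, down, or left/right (the third direction). The specific threshold values are hypothetical; the present application does not fix the angle ranges here.

```python
from typing import Optional

# Hypothetical angle ranges in degrees, measured from the horizontal x-axis.
# The actual ranges corresponding to "up", "left", and "right" are an implementation choice.
UP_RANGE = (45.0, 135.0)
LEFT_RIGHT_RANGES = ((-35.0, 35.0), (145.0, 180.0), (-180.0, -145.0))

def classify_finger_direction(angle_deg: float) -> Optional[str]:
    """Return 'up', 'down', or None when the fingers point left or right (third direction)."""
    # First rule out the third direction (left/right), as described above.
    if any(lo <= angle_deg <= hi for lo, hi in LEFT_RIGHT_RANGES):
        return None
    # First direction: the angle lies in the range corresponding to upward fingers.
    if UP_RANGE[0] <= angle_deg <= UP_RANGE[1]:
        return "up"
    # Otherwise, the second direction: display the icon with fingers pointing down.
    return "down"

print(classify_finger_direction(90.0))    # up
print(classify_finger_direction(-80.0))   # down
print(classify_finger_direction(10.0))    # None: fingers point roughly to the right
```

Note that, as in the last possible implementation above, "down" is reached simply by excluding the left/right and upward ranges, rather than by testing a separate downward range.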
In a second aspect, the present application provides an electronic device, including: one or more processors, a memory, and a display screen. The memory and the display screen are coupled to the one or more processors, and the memory is used to store a computer program comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the hand direction detection method according to any one of the first aspect.
In a third aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed, carries out the hand direction detection method according to any one of the first aspect.
In a fourth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the hand direction detection method according to any one of the first aspect.
Drawings
Fig. 1 is an application scenario diagram of a method for detecting a hand direction according to an embodiment of the present application;
Fig. 2 is an application scenario diagram of another method for detecting a hand direction according to an embodiment of the present application;
Fig. 3 is a hardware structure diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a software framework diagram of an electronic device according to an embodiment of the present application;
Fig. 5 is a signaling diagram of a method for detecting a hand direction according to an embodiment of the present application;
Fig. 6 is an illustration of the 21 hand key points of a human hand according to an embodiment of the present application;
Fig. 7 is a diagram showing the angle between the vector from the average finger-root coordinate to the average fingertip coordinate and the horizontal direction according to an embodiment of the present application;
Fig. 8 is a signaling diagram of another method for detecting a hand direction according to an embodiment of the present application;
Fig. 9 is a diagram showing the angles between the direction vectors of a plurality of fingers and the horizontal direction according to an embodiment of the present application;
Fig. 10 is a flowchart of a method for detecting a hand direction according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings of the embodiments. The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include expressions such as "one or more", unless the context clearly indicates otherwise. It should also be understood that, in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may each be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In the embodiments of the present application, "a plurality of" means two or more. It should be noted that, in the description of the embodiments of the present application, terms such as "first" and "second" are used only to distinguish between descriptions and are not to be understood as indicating or implying relative importance or a sequential order.
Currently, electronic devices may support contactless air gesture interaction. Air gesture interaction enables human-computer interaction in scenarios where it is inconvenient for the user to touch the electronic device, which improves the user experience. Before performing human-computer interaction with the electronic device by means of an air gesture, the user can input a start gesture to trigger the electronic device to prepare for the interaction. Correspondingly, after the user inputs the start gesture, the electronic device can output feedback information to prompt the user that the start gesture has been completed and that an air gesture can now be made to perform human-computer interaction.
Taking a mobile phone as an example of the electronic device, fig. 1 and fig. 2 respectively show two ways in which the mobile phone prompts the user to make an air gesture. After the display screen of the mobile phone is lit and unlocked, and the user's hand is within the field of view (FOV) of the front camera of the mobile phone (the FOV refers to the coverage of the front camera's lens), the display screen of the mobile phone can display a hand icon to prompt the user that human-computer interaction with the mobile phone can be performed by means of an air gesture.
In the example shown in (a) of fig. 1, the display screen of the mobile phone displays a settings interface. As shown in (b) of fig. 1, the user holds the hand with the fingers pointing upward and the palm facing the display screen, the hand is 20 cm to 40 cm from the mobile phone and within the FOV of the front camera of the mobile phone, the mobile phone displays a small hand icon 101 on the settings interface, and the fingers of the small hand icon 101 also point upward.
In the example shown in (a) of fig. 2, the display screen of the mobile phone also displays a settings interface. As shown in (b) of fig. 2, the user holds the hand with the fingers pointing downward and the back of the hand facing the display screen, the hand is 20 cm to 40 cm from the mobile phone and within the FOV of the front camera of the mobile phone, the mobile phone displays a small hand icon 102 on the settings interface, and the fingers of the small hand icon 102 point downward.
After the user sees that the display screen of the mobile phone displays the small hand icon 101 shown in (b) of fig. 1 or the small hand icon 102 shown in (b) of fig. 2, the user can perform human-computer interaction with the mobile phone using air gestures to control the mobile phone.
It should be noted that, in the present application, fingers pointing upward means that the fingers point toward the top of the electronic device such as the mobile phone, with the fingertips close to the top of the device (the top is usually where the camera is located); fingers pointing downward means that the fingers point toward the bottom of the electronic device such as the mobile phone, with the fingertips close to the bottom of the device.
The small hand icons shown in (b) of fig. 1 and (b) of fig. 2, which may also be referred to as hand-shaped icons, are exemplary patterns and do not limit how the hand icon is displayed on the display screen. It can be understood that when the fingers of the user's hand point upward, the display screen of the mobile phone displays a hand icon with upward fingers, and when the fingers point downward, it displays a hand icon with downward fingers.
The hand shape with which the user triggers the electronic device such as the mobile phone to display the small hand icon 101 shown in (b) of fig. 1 is not limited to the hand shape shown in (b) of fig. 1. The user's hand may have the fingers pointing upward with the fingers partially or fully spread, or with the five fingers together; the palm of the user's hand may face the display screen, or the back of the hand may face the display screen. It is only required that the hand faces the display screen with most of the fingers pointing upward and that the hand is within the FOV of the front camera of the mobile phone.
Likewise, the hand shape with which the user triggers the electronic device such as the mobile phone to display the small hand icon 102 shown in (b) of fig. 2 is not limited. The user's hand may have the fingers pointing downward with the fingers partially or fully spread, or with the five fingers together; the palm of the user's hand may face the display screen, or the back of the hand may face the display screen. It is only required that the hand faces the display screen with most of the fingers pointing downward and that the hand is within the FOV of the front camera of the mobile phone.
To display the small hand icons shown in fig. 1 and fig. 2 on the display screen and thereby prompt the user to make an air gesture, an embodiment of the present application provides a hand direction detection method. The method can be applied to electronic devices such as mobile phones, tablet computers, personal digital assistants (PDA), desktop computers, laptop computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, and wearable devices; the specific form of the electronic device is not particularly limited.
Taking a mobile phone as an example, fig. 3 is a composition example of an electronic device according to an embodiment of the present application. As shown in fig. 3, the electronic device 300 may include a processor 310, an internal memory 320, a sensor module 330, a mobile communication module 340, a wireless communication module 350, a display 360, an audio module 370, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 300. In other embodiments, electronic device 300 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 310 may include one or more processing units, such as: the processor 310 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a smart sensor hub (sensor hub) and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 310 for storing instructions and data. In some embodiments, the memory in the processor 310 is a cache memory. The memory may hold instructions or data that the processor 310 has just used or uses repeatedly. If the processor 310 needs to use the instructions or data again, they can be called directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 310, and thereby improves system efficiency.
The internal memory 320 may be used to store computer-executable program code that includes instructions. The processor 310 executes various functional applications of the electronic device 300 and data processing by executing instructions stored in the internal memory 320. The internal memory 320 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 300 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 320 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 310 performs various functional applications of the electronic device 300 and data processing by executing instructions stored in the internal memory 320 and/or instructions stored in a memory provided in the processor.
In some embodiments, the internal memory 320 stores instructions for detecting the hand direction. By executing the instructions stored in the internal memory 320, the processor 310 can control the electronic device to display a small hand icon when it detects that a hand is within the field of view (FOV) of the front camera.
The electronic device implements display functions through the GPU, the display screen 360, and the application processor, etc. The GPU is a microprocessor for image processing, connected to the display screen 360 and the application processor. GPUs are used for image rendering by performing mathematical and geometric calculations. Processor 310 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 360 is used to display images, videos, and the like. The display screen 360 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 360, where N is a positive integer greater than 1.
The wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 340, the wireless communication module 350, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 300 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 340 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied on the electronic device 300. The mobile communication module 340 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 340 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 340 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 340 may be disposed in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 340 may be disposed in the same device as at least some of the modules of the processor 310.
The wireless communication module 350 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 300. The wireless communication module 350 may be one or more devices that integrate at least one communication processing module. The wireless communication module 350 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 310. The wireless communication module 350 may also receive a signal to be transmitted from the processor 310, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In the sensor module 330, the pressure sensor 330A is configured to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 330A may be disposed on the display screen 360. There are many kinds of pressure sensors 330A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material; when a force is applied to the pressure sensor 330A, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 360, the electronic device detects the intensity of the touch operation through the pressure sensor 330A. The electronic device may also calculate the position of the touch based on the detection signal of the pressure sensor 330A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions.
The touch sensor 330B, also referred to as a "touch device". The touch sensor 330B may be disposed on the display screen 360, and the touch sensor 330B and the display screen 360 form a touch screen, which is also referred to as a "touch screen". The touch sensor 330B is used to detect a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display screen 360. In other embodiments, the touch sensor 330B may also be disposed on the surface of the electronic device at a different location than the display 360.
The electronic device 300 may implement audio functionality through an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an ear-headphone interface 370D, and an application processor, among others. Such as music playing, recording, etc.
The audio module 370 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be disposed in the processor 310, or some of the functional modules of the audio module 370 may be disposed in the processor 310.
Speaker 370A, also known as a "horn," is used to convert audio electrical signals into sound signals. The electronic device may listen to music, or to hands-free conversations, through speaker 370A.
A receiver 370B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device answers a call or receives a voice message, the voice can be heard by placing the receiver 370B close to the ear.
Microphone 370C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can sound near the microphone 370C through the mouth, inputting a sound signal to the microphone 370C. The electronic device may be provided with at least one microphone 370C. In other embodiments, the electronic device may be provided with two microphones 370C, which may also perform a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may also be provided with three, four, or more microphones 370C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 370D is used to connect a wired earphone. The earphone interface 370D may be a USB interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
In addition, on top of the above components, the electronic device runs an operating system, on which applications can be installed and run.
Fig. 4 is a schematic software structure of an electronic device according to an embodiment of the present application.
The layered architecture divides the operating system of the electronic device into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the operating system of the electronic device is the Android system. From top to bottom, the Android system is divided into an application (APP) layer, an application framework layer (FWK), a system library, a kernel layer, and a hardware layer.
The application layer may include a series of application packages. As shown in fig. 4, the application packages may include applications such as gallery, map, calendar, call, camera, and system user interface (SystemUI).
The SystemUI is a system level application. In some embodiments, the SystemUI may be used to draw the hand icon 101 shown in (b) of fig. 1 or the hand icon 102 shown in (b) of fig. 2, and control the display of the hand icon 101 or the hand icon 102 on the display screen.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 4, the application framework layers may include a window management service (WindowManagerService, WMS), an activity management service (ActivityManagerService, AMS), a content provider, a telephony manager, and a resource manager, among others.
The window management service is used to manage window programs. The window management service can realize the addition, deletion, display, hiding control and the like of the window.
The activity management service is used to manage Activity behavior, control the Activity lifecycle, dispatch messages and events, and manage memory, among other functions.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The telephony manager is for providing communication functions of the electronic device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system. In some embodiments of the present application, a cold-started application may run in the Android runtime; from startup, the Android runtime obtains the optimized-file state parameters of the application, uses these parameters to determine whether the optimized file is out of date due to a system upgrade, and returns the determination result to the application management and control module.
The core library consists of two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer (HAL) is an interface layer between the operating system kernel and the hardware circuitry, and its purpose is to abstract the hardware. It hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, so that the operating system is hardware-independent and can be ported to various platforms.
In some embodiments, the hardware abstraction layer may include an image processing module, where the image processing module is configured to perform type recognition and hand direction recognition on an image captured by the camera, and obtain a recognition result. The specific operation of the image processing module can be seen in the following.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a sensor driver, a camera driver and the like. In some embodiments, the display driver is used to control the display screen to display an image; the sensor drive is used to control operation of a plurality of sensors, such as control operation of a pressure sensor and a touch sensor. The camera drive is used for driving the camera to run.
Under the above five-layer architecture, the electronic device is further provided with a hardware layer, and the hardware layer may include the aforementioned hardware components of the electronic device. By way of example, fig. 4 shows a display screen and a camera.
It should be noted that although the embodiments of the present application are described by taking the Android system as an example, their basic principles are equally applicable to electronic devices running other operating systems.
Taking a mobile phone as an example, the following describes how, when an electronic device such as a mobile phone detects that a hand is within the FOV of its front camera, the display screen of the mobile phone displays a hand icon to prompt the user that human-computer interaction with the mobile phone can be performed by means of an air gesture.
Fig. 5 shows a signaling diagram of a method for detecting a hand direction according to an embodiment of the present application.
As shown in fig. 5, the method for detecting a hand direction provided by the embodiment of the application includes:
s501, a camera shoots a first image.
After the display screen of the mobile phone is on and unlocked, the camera of the mobile phone is triggered to start shooting images. In some embodiments, the cameras of the mobile phone may capture images at intervals of a certain duration, and the duration of the intervals may be set.
One image taken by the camera of the cell phone may be referred to as a first image. The camera of the mobile phone generally refers to a front camera of the mobile phone, and of course, the scheme of the application is not limited to the front camera.
In some embodiments, the camera of the mobile phone may also be triggered to start capturing images while the display screen of the mobile phone shows the lock-screen interface.
S502, an image processing module acquires a first image shot by a camera.
The image processing module is located at the hardware abstraction layer, and a data transmission channel is provided between the image processing module and the camera. After the camera captures the first image, the image processing module obtains the first image captured by the camera through the data transmission channel.
S503, the image processing module detects whether the hand in the first image is of the palm/back-of-hand type.
After obtaining the first image, the image processing module performs hand detection on it to detect the type of the hand in the first image. Hand types may generally include: the palm/back-of-hand type, the fist type, other gesture types, and the background type. The palm/back-of-hand type means that the first image includes part or all of the back of the hand or the palm; the fist type means that the first image includes a hand making a fist; the background type means that the first image does not include a hand; and the other gesture types mean that the first image includes a hand, but the hand is neither making a fist nor showing the palm or the back of the hand.
In some embodiments, the image processing module invokes a hand-type detection model to detect the hand in the first image and obtain a confidence for each hand type; the hand type with the highest confidence is taken as the type of the hand in the first image.
The hand-type detection model can be obtained by training a convolutional neural network on sample images that include hands. The training process of the convolutional neural network is not described in detail in the present application; reference may be made to conventional techniques.
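The following is a minimal sketch of how the output of such a hand-type detection model could be turned into per-type confidences. The class names, the softmax over four types, and the example logit values are illustrative assumptions; they do not describe the actual model used in the embodiments.

```python
import math
from typing import Dict, Sequence

HAND_TYPES = ("palm_or_back", "fist", "other_gesture", "background")

def hand_type_confidences(logits: Sequence[float]) -> Dict[str, float]:
    """Convert raw classifier outputs into a confidence for each hand type (softmax)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return {name: e / total for name, e in zip(HAND_TYPES, exps)}

# Example: made-up logits for a frame showing an open palm; the hand type with the
# highest confidence is taken as the type of the hand in the first image.
confidences = hand_type_confidences([4.2, 0.5, 1.1, -2.0])
best_type = max(confidences, key=confidences.get)
print(best_type, round(confidences[best_type], 3))   # palm_or_back, confidence about 0.93
```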
If the image processing module detects that the hand in the first image is not of the palm/back-of-hand type, step S504 below may be performed. If it detects that the hand in the first image is of the palm/back-of-hand type, steps S505 to S513 below may be performed.
In some embodiments, in the process of performing hand detection on the first image, the image processing module may further obtain the position information of the hand in the first image and its confidence. The hand position information indicates the bounding box of the hand in the first image, which may be referred to as the hand detection box; the confidence indicates how reliable the hand position information is, and the higher the confidence, the more reliable the hand position information. In some embodiments, the hand position information may be the coordinate information of the four vertices of the rectangular hand detection box.
S504, the image processing module sends a first message to the SystemUI, wherein the first message indicates that the direction of the hand in the first image is ambiguous.
The SystemUI is a system application at the application layer. When the image processing module detects in step S503 that the hand in the first image is not of the palm/back-of-hand type, it may send the first message to a module of the application framework layer by pass-through, and that module sends the first message to the SystemUI by interface callback. The first message may indicate that the direction of the hand in the first image is ambiguous, that is, that the orientation of the hand in the first image is unknown. On receiving the first message, the SystemUI generally performs no operation.
When the image processing module detects in step S503 that the hand in the first image is not of the palm/back-of-hand type, this indicates that the first image captured by the camera does not include a hand, or that the hand in the first image belongs to a fist gesture or another gesture. It can therefore be presumed that the user does not intend to operate the mobile phone with an air gesture, and the mobile phone can avoid performing steps S505 to S513 below. This prevents the display screen from showing a small hand icon that reminds the user to input an air gesture when the user does not need to operate the mobile phone with an air gesture, which would disturb the user, and it also saves power.
In some embodiments, the first message sent by the image processing module to the SystemUI may also indicate that the hand in the first image faces left or right. In that case, the SystemUI displays on the display screen a small hand icon with the fingers pointing left or right.
S505, the image processing module crops the hand region image from the first image.
When the image processing module detects in step S503 that the hand in the first image is of the palm/back-of-hand type, this indicates that the user intends to operate the mobile phone with an air gesture. To further verify this predicted intention, the image processing module can first identify the region image of the hand.
In some embodiments, the image processing module performs hand detection on the first image to obtain the hand detection box, and can then crop the hand detection box out of the first image to obtain the hand region image.
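For illustration, the cropping in S505 could look like the sketch below, assuming the first image is available as a numpy array in (height, width, channels) layout and the hand detection box is given by the pixel coordinates of its rectangle; both conventions are assumptions of this sketch.

```python
import numpy as np

def crop_hand_region(first_image: np.ndarray, box: tuple) -> np.ndarray:
    """Crop the hand detection box out of the first image to obtain the hand region image.

    `box` is assumed to be (x_min, y_min, x_max, y_max) in pixel coordinates, which can be
    derived from the four vertices of the rectangular hand detection box.
    """
    h, w = first_image.shape[:2]
    x_min, y_min, x_max, y_max = box
    # Clamp the box to the image bounds before slicing.
    x_min, x_max = max(0, int(x_min)), min(w, int(x_max))
    y_min, y_max = max(0, int(y_min)), min(h, int(y_max))
    return first_image[y_min:y_max, x_min:x_max]

# Usage with a dummy 480x640 frame and a hypothetical detection box.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
hand_region = crop_hand_region(frame, (200, 120, 420, 400))
print(hand_region.shape)   # (280, 220, 3)
```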
S506, the image processing module detects the coordinates of the hand key points in the hand region image.
The hand key points may be located at one or more finger joints of different fingers, at the fingertips, at the junctions between the fingers and the palm, and at the wrist. They generally include the fingertips, knuckles, and finger roots of multiple fingers, and may also include the wrist center point.
Illustratively, a human hand may have 21 hand key points, which are shown in fig. 6.
As shown in fig. 6, the 21 hand key points of a human hand include: the wrist center point 0; the finger roots 1, 5, 9, 13, and 17 of the five fingers; the fingertips 4, 8, 12, 16, and 20 of the five fingers; and the joint points 2, 3, 6, 7, 10, 11, 14, 15, 18, and 19 of the five fingers. Detecting the coordinates of the hand key points in the hand region image in this step can be understood as follows: the image processing module obtains the coordinate values of each hand key point in the xy coordinate system shown in fig. 6.
The embodiment of the application is described by taking the coordinates of 21 hand key points detected by the image processing module as an example, but the coordinates of 21 hand key points can only be detected by the image processing module without limitation. In some embodiments, the image processing module may obtain a finger tip including a plurality of fingers and finger root hand keypoints, including, illustratively, finger roots 1, 5, 9, 13, and 17, and finger tips 4, 8, 12, 16, and 19.
In some embodiments, the image processing module invokes a pre-trained keypoint detection model to identify the hand region image and obtain the coordinates of the hand keypoints in the hand region image. In some embodiments, the keypoint detection model may employ a deep neural network model, for example one using a ResNet, MobileNet or NASNet network structure.
S507, the image processing module calculates a first vector by utilizing the coordinates of the keypoints, wherein the first vector points from the average coordinate of the finger roots to the average coordinate of the fingertips.
The directions of the index finger, the middle finger and the ring finger of a human hand generally reflect the overall trend of the fingers of the human hand, so the image processing module can use the directions of the index finger, the middle finger and the ring finger to calculate the overall trend of the fingers.
In some embodiments, the image processing module derives the coordinates of the finger roots and fingertips of the index finger, the middle finger and the ring finger from the coordinates of the hand keypoints. As shown in fig. 7, the image processing module calculates the average coordinate P1 of the finger root 5 of the index finger, the finger root 9 of the middle finger and the finger root 13 of the ring finger; P1 is the average coordinate of the finger roots. The image processing module calculates the average coordinate P2 of the fingertip 8 of the index finger, the fingertip 12 of the middle finger and the fingertip 16 of the ring finger; P2 is the average coordinate of the fingertips. Using the average coordinates P1 and P2, the image processing module calculates the vector pointing from P1 to P2, referred to as the first vector; the direction of the first vector can then be understood as the overall trend of the fingers.
After the image processing module calculates the first vector, an included angle θ between the first vector and the x-axis in the horizontal direction may also be calculated.
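As a rough sketch of step S507 and the angle calculation (assuming the keypoints are given as a (21, 2) array of (x, y) coordinates, that the y-axis of the fig. 6 coordinate system points upward, and that arctan2 is an acceptable way to obtain the included angle; all three are assumptions, not details given in the embodiment):

```python
import numpy as np

FINGER_ROOT_IDS = (5, 9, 13)   # index, middle, ring finger roots
FINGERTIP_IDS = (8, 12, 16)    # index, middle, ring fingertips

def first_vector_and_angle(keypoints: np.ndarray) -> tuple[np.ndarray, float]:
    """Return the first vector P1->P2 and its angle theta (degrees, in (-180, 180])
    with the horizontal x-axis, from a (21, 2) array of hand keypoint coordinates."""
    p1 = keypoints[list(FINGER_ROOT_IDS)].mean(axis=0)   # average finger-root coordinate P1
    p2 = keypoints[list(FINGERTIP_IDS)].mean(axis=0)     # average fingertip coordinate P2
    v = p2 - p1                                          # first vector
    theta = float(np.degrees(np.arctan2(v[1], v[0])))    # included angle with the x-axis
    return v, theta
```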
In some cases, the direction of the middle finger is also the same as the overall trend of the fingers of the human hand, so the image processing module can calculate the direction of the middle finger. In some embodiments, the image processing module calculates, using the coordinates of the finger root 9 and the fingertip 12 of the middle finger, the vector of the finger root 9 pointing to the fingertip 12, which is also referred to as the first vector.
In some embodiments, the image processing module may also calculate the average coordinate of the finger root 5 of the index finger and the finger root 9 of the middle finger, and the average coordinate of the fingertip 8 of the index finger and the fingertip 12 of the middle finger, and then calculate the vector pointing from the average coordinate of the finger roots to the average coordinate of the fingertips as the first vector.
In other embodiments, the image processing module may also calculate the average coordinate of the finger root 9 of the middle finger and the finger root 13 of the ring finger, and the average coordinate of the fingertip 12 of the middle finger and the fingertip 16 of the ring finger, and then calculate the vector pointing from the average coordinate of the finger roots to the average coordinate of the fingertips as the first vector.
When the user inputs the initial gesture, the hand shape may have the middle finger facing down while all or some of the other fingers face up. If the image processing module calculated only the vector of the finger root 9 of the middle finger pointing to the fingertip 12 as the first vector, the mobile phone would perform the following steps S508 to S513 and obtain an erroneous detection result that the fingers in the first image face downward. By calculating, as the first vector, a vector in which the finger roots of at least two of the index finger, the middle finger and the ring finger point to the fingertips, and then performing steps S508 to S513 described below, the accuracy of the direction detection of the hand in the first image can be ensured.
It should be noted that, the coordinates of the fingertips of the index finger, the middle finger and the ring finger proposed in the foregoing embodiments may be replaced by the coordinates of the first joint point of the index finger, the middle finger and the ring finger, or replaced by the coordinates of the second joint point of the index finger, the middle finger and the ring finger.
S508, the image processing module judges whether the included angle between the first vector and the horizontal direction is within the range of (-30°, 30°), (-180°, -150°) or (150°, 180°).
The image processing module calculates the included angle θ between the first vector and the horizontal x-axis, compares the included angle θ with the three intervals (-30°, 30°), (-180°, -150°) and (150°, 180°), and judges whether the included angle θ is within the range of (-30°, 30°), (-180°, -150°) or (150°, 180°).
The three intervals (-30°, 30°), (-180°, -150°) and (150°, 180°) are exemplary; the endpoints of the three intervals can be adjusted to other values. By judging whether the included angle θ between the first vector and the horizontal x-axis is within these three intervals, the image processing module can infer whether the fingers of the human hand face left or right.
Thus, the image processing module judging whether the included angle between the first vector and the horizontal direction is within (-30°, 30°), (-180°, -150°) or (150°, 180°) can be understood as: judging whether the included angle θ between the first vector and the horizontal x-axis is within the angle intervals indicating that the fingers of the human hand face left or right.
If the image processing module determines that the included angle between the first vector and the horizontal direction is within (-30°, 30°), (-180°, -150°) or (150°, 180°), it may perform step S504. If the image processing module determines that the included angle between the first vector and the horizontal direction is not within (-30°, 30°), (-180°, -150°) or (150°, 180°), it may perform steps S509 to S513 described below.
If the image processing module determines that the included angle between the first vector and the horizontal direction is within (-30°, 30°), (-180°, -150°) or (150°, 180°), it can be presumed that the fingers of the human hand face left or right, and it can further be presumed that the user does not intend to operate the mobile phone through an air gesture. The mobile phone can therefore refrain from executing the following steps S509 to S513, which avoids displaying the small hand icon on the display screen to prompt the user to input an air gesture when the user does not need one; this prevents trouble to the user and saves power consumption.
S509, the image processing module judges whether the included angle between the first vector and the horizontal direction is within the range of [30°, 150°].
If the image processing module determines that the included angle between the first vector and the horizontal direction is not within (-30°, 30°), (-180°, -150°) or (150°, 180°), the included angle is within [30°, 150°] or [-150°, -30°]. If the included angle is within [30°, 150°], it can be presumed that the fingers of the human hand face upward; if the included angle is within [-150°, -30°], it can be presumed that the fingers of the human hand face downward.
In this embodiment, the image processing module needs to determine whether the included angle between the first vector and the horizontal direction is within [30°, 150°] or within [-150°, -30°], so as to determine the finger orientation of the human hand. This step is described by taking as an example the image processing module judging whether the included angle is within [30°, 150°]. In some embodiments, the image processing module may instead determine whether the included angle is within [-150°, -30°]; in other embodiments, the image processing module may determine both whether the included angle is within [30°, 150°] and whether it is within [-150°, -30°].
If the image processing module determines that the included angle between the first vector and the horizontal direction is within [30°, 150°], it may perform step S510 and step S511; if the image processing module determines that the included angle is not within [30°, 150°], it may perform step S512 and step S513. A compact sketch of these decisions is given below.
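The sketch below illustrates the decisions of steps S508 and S509, using the exemplary interval endpoints above (which, as noted, can be adjusted); the function and label names are illustrative assumptions.

```python
def classify_hand_direction(theta: float) -> str:
    """Map the included angle theta (degrees, in (-180, 180]) to a hand direction.
    'left_or_right' corresponds to the first message (S504); 'up' to S510/S511;
    'down' to S512/S513. Endpoints follow the exemplary intervals in the text."""
    if -30 < theta < 30 or theta > 150 or theta < -150:
        return "left_or_right"
    if 30 <= theta <= 150:
        return "up"
    return "down"
```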
S510, the image processing module sends a second message to the SystemUI, wherein the second message indicates that the hand in the first image is in an upward direction.
The image processing module judges that the included angle between the first vector and the horizontal direction is within the range of [30°, 150°], the fingers of the human hand can be presumed to face upward, and the image processing module can send a second message to the SystemUI in a transparent transmission manner.
S511, displaying a small-hand icon with the finger facing upwards on a display screen by the SystemUI.
When the SystemUI receives the second message and determines that the fingers of the hand face upward, it controls the display screen of the mobile phone to display a small hand icon with the finger facing upward; the finger-upward small hand icon may be, for example, the small hand icon 101 in (b) of fig. 1.
And S512, the image processing module sends a third message to the SystemUI, wherein the third message indicates that the hand in the first image is in the downward direction.
The image processing module judges that the included angle between the first vector and the horizontal direction is not within the range of [30°, 150°], the fingers of the human hand can be presumed to face downward, and the image processing module can send a third message to the SystemUI in a transparent transmission manner.
S513, displaying a small hand icon with the downward finger on the display screen by the SystemUI.
When the SystemUI receives the third message and determines that the fingers of the hand face downward, it controls the display screen of the mobile phone to display a small hand icon with the finger facing downward; the finger-downward small hand icon may be, for example, the small hand icon 102 in (b) of fig. 2.
In step S511 and step S513, the SystemUI displays the small hand icon on the interface currently displayed on the display screen. For the implementation in which the SystemUI displays the finger-up and finger-down small hand icons, reference may be made to the conventional technology, and details are not described herein.
The SystemUI receives the second message or the third message, determines that the hand in the first image faces upward or downward, and displays the small hand icon on the display screen. When the SystemUI receives the first message, it determines that the direction of the hand in the first image is ambiguous and does not display the small hand icon on the display screen. This avoids the problem that, after the mobile phone misdetects in step S503 that the hand in the first image is of the palm or back-of-hand type, the mobile phone displays the small hand icon on the display screen and prompts the user to input an air gesture, which would bother the user; it also saves power consumption.
Further, the SystemUI displays a small hand icon with the finger facing upward on the display screen through step S511, which reminds the user that the initial gesture of the air gesture has been input and that an air gesture such as a slide-down or grab gesture can now be input. The SystemUI displays a small hand icon with the finger facing downward on the display screen through step S513, which reminds the user that the initial gesture of the air gesture has been input and that an air gesture such as a slide-up gesture can now be input. The user can also check, according to the small hand icon displayed on the display screen, whether the input initial gesture is correct.
The SystemUI displaying the small hand icon with the finger facing upward on the display screen through step S511 means that the mobile phone predicts that the user wants to input an air gesture such as a slide-down or grab gesture, so the mobile phone can invoke the software flow for responding to such gestures while the finger-upward small hand icon is displayed, thereby improving the response speed of the mobile phone. Similarly, the SystemUI displaying the small hand icon with the finger facing downward on the display screen through step S513 means that the mobile phone predicts that the user wants to input an air gesture such as a slide-up gesture, so the mobile phone can invoke the software flow for responding to such a gesture while the finger-downward small hand icon is displayed, thereby improving the response speed of the mobile phone.
The foregoing embodiment provides one implementation of detecting the hand direction using the coordinates of the hand keypoints. Other implementations also exist; the following embodiment of the present application discloses another implementation of detecting the hand direction using the coordinates of the hand keypoints.
As shown in fig. 8, the method for detecting a hand direction provided in this embodiment includes:
S801, a camera shoots a first image.
S802, an image processing module acquires a first image shot by a camera.
S803, the image processing module detects whether the hand in the first image is of a palm and back type.
The image processing module detects that the hand in the first image is not of the palm and back type, and may execute step S804 described below. The image processing module detects that the hand in the first image is of palm and back type, and may perform steps S805 to S814 described below.
S804, the image processing module sends a first message to the SystemUI, wherein the first message indicates that the direction of the hand in the first image is ambiguous.
In some embodiments, the first message may also be used to indicate that the hand in the first image faces left or right.
S805, the image processing module cuts out the hand area image from the first image.
S806, the image processing module detects coordinates of hand key points in the hand area image.
The specific implementation of step S801 to step S806 can be referred to the content of step S501 to step S506 in the foregoing embodiment, and will not be described herein.
S807, the image processing module calculates the direction vectors of the plurality of fingers by using the coordinates of the key points of the hand.
In some embodiments, the plurality of fingers may refer to all of the fingers of a human hand. The image processing module calculates the direction vectors of the plurality of fingers by using the coordinates of the keypoints. As shown in fig. 9, the image processing module calculates the vector v1 of the finger root 1 of the thumb pointing to the fingertip 4, the vector v2 of the finger root 5 of the index finger pointing to the fingertip 8, the vector v3 of the finger root 9 of the middle finger pointing to the fingertip 12, the vector v4 of the finger root 13 of the ring finger pointing to the fingertip 16, and the vector v5 of the finger root 17 of the little finger pointing to the fingertip 20.
In other embodiments, the plurality of fingers may refer to the index finger, middle finger, and ring finger of a human hand. The image processing module calculates the direction vectors of a plurality of fingers by using the coordinates of the key points as follows: the image processing module calculates a vector v2 with the root 5 of the index finger pointing at the fingertip 8, a vector v3 with the root 9 of the middle finger pointing at the fingertip 12, and a vector v4 with the root 13 of the ring finger pointing at the fingertip 16.
In other embodiments, the plurality of fingers may refer to middle and ring fingers of a human hand. The image processing module calculates the direction vectors of a plurality of fingers by using the coordinates of the key points as follows: the image processing module calculates a vector v3 with the root 9 of the middle finger pointing at the fingertip 12 and a vector v4 with the root 13 of the ring finger pointing at the fingertip 16.
In other embodiments, the plurality of fingers may refer to the index finger and middle finger of a human hand. The image processing module calculates the direction vectors of a plurality of fingers by using the coordinates of the key points as follows: the image processing module calculates a vector v2 with the root 5 of the index finger pointing at the fingertip 8 and a vector v3 with the root 9 of the middle finger pointing at the fingertip 12.
In other embodiments, only the middle finger of a human hand may be used. In this case, the image processing module calculates the direction vector as follows: the image processing module calculates the vector v3 of the finger root 9 of the middle finger pointing to the fingertip 12.
In some scenes, the middle finger direction of the human hand is the same as the overall trend of the hand; the image processing module calculates the finger direction vector of the middle finger, and the overall direction of the hand can be determined using the finger direction vector of the middle finger alone. In other scenes, the middle finger direction of the human hand differs from the overall direction of the hand, so the image processing module may calculate the finger direction vectors of at least two of the five fingers. Calculating the direction vectors of the index finger, the middle finger and the ring finger is preferred.
S808, the image processing module calculates the average value of the included angles between the direction vectors of the fingers and the horizontal direction.
In some embodiments, as shown in fig. 9, the image processing module calculates an angle θ1 between the vector v1 of the root 1 of the thumb pointing to the fingertip 4 and the horizontal direction, an angle θ2 between the vector v2 of the root 5 of the index finger pointing to the fingertip 8 and the horizontal direction, an angle θ3 between the vector v3 of the root 9 of the middle finger pointing to the fingertip 12 and the horizontal direction, an angle θ4 between the vector v4 of the root 13 of the ring finger pointing to the fingertip 16 and the horizontal direction, and an angle θ5 between the vector v5 of the root 17 of the little finger pointing to the fingertip 20 and the horizontal direction.
The image processing module calculates the average value of the included angles theta 1 to theta 5 between the vector of each finger and the horizontal direction, and the average value is used as the average value theta of the included angles between the direction vectors of the five fingers and the horizontal direction.
In other embodiments, the image processing module calculates the angle θ2 between the vector v2 of the finger root 5 of the index finger pointing to the fingertip 8 and the horizontal direction, the angle θ3 between the vector v3 of the finger root 9 of the middle finger pointing to the fingertip 12 and the horizontal direction, and the angle θ4 between the vector v4 of the finger root 13 of the ring finger pointing to the fingertip 16 and the horizontal direction.
The image processing module calculates the average value of the angles theta 2 to theta 4 between the vectors of the index finger, the middle finger and the ring finger and the horizontal direction, and the average value is used as the average value theta of the angles between the direction vectors of the fingers and the horizontal direction.
In other embodiments, the image processing module calculates the angle θ2 between the vector v2 of the finger root 5 of the index finger pointing to the fingertip 8 and the horizontal direction, and the angle θ3 between the vector v3 of the finger root 9 of the middle finger pointing to the fingertip 12 and the horizontal direction.
The image processing module calculates the average value of the angles theta 2 and theta 3 between the vectors of the index finger and the middle finger and the horizontal direction as the average value theta of the angles between the direction vectors of the fingers and the horizontal direction.
In other embodiments, the image processing module calculates the angle θ3 between the vector v3 of the middle finger root 9 pointing to the fingertip 12 and the horizontal direction, and the angle θ4 between the vector v4 of the ring finger root 13 pointing to the fingertip 16 and the horizontal direction.
The image processing module calculates the average value of the included angles θ3 and θ4 between the vectors of the middle finger and the ring finger and the horizontal direction, and the average value is used as the average value θ of the included angles between the direction vectors of the plurality of fingers and the horizontal direction.
In some scenes, the middle finger of the human hand can alone indicate the overall trend of the human hand. The image processing module may then calculate only the included angle θ3 between the vector v3 of the finger root 9 of the middle finger pointing to the fingertip 12 and the horizontal direction, and take θ3 as the average value θ of the included angles between the direction vectors of the fingers and the horizontal direction.
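A sketch of steps S807 and S808 under the same assumptions as the earlier sketch (2D keypoints, y-axis up, arctan2); the fingers parameter selects which of the embodiments above is used (all five fingers, index/middle/ring, middle/ring, index/middle, or the middle finger alone). Note that naively averaging angles, as the text describes, wraps around near ±180°; this is tolerable here because the up/down decision uses angles away from that boundary.

```python
import numpy as np

ROOTS = {"thumb": 1, "index": 5, "middle": 9, "ring": 13, "little": 17}
TIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "little": 20}

def mean_finger_angle(keypoints: np.ndarray,
                      fingers=("index", "middle", "ring")) -> float:
    """Average included angle (degrees) between the selected finger direction
    vectors (finger root -> fingertip) and the horizontal x-axis."""
    angles = []
    for f in fingers:
        v = keypoints[TIPS[f]] - keypoints[ROOTS[f]]          # finger root -> fingertip
        angles.append(np.degrees(np.arctan2(v[1], v[0])))     # per-finger angle
    return float(np.mean(angles))
```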
S809, the image processing module judges whether the average value of the included angles is within the range of (-30°, 30°), (-180°, -150°) or (150°, 180°).
The average value θ of the included angles between the direction vectors of the plurality of fingers and the horizontal direction can indicate the overall trend of the human hand, namely the direction of the human hand. By judging whether the average value θ is within (-30°, 30°), (-180°, -150°) or (150°, 180°), the image processing module can infer whether the human hand faces left or right.
If the image processing module determines that the average value θ is within (-30°, 30°), (-180°, -150°) or (150°, 180°), the human hand faces left or right; if the image processing module determines that the average value θ is not within (-30°, 30°), (-180°, -150°) or (150°, 180°), the human hand faces upward or downward.
If the image processing module determines that the average value of the included angles between the direction vectors of the plurality of fingers and the horizontal direction is within (-30°, 30°), (-180°, -150°) or (150°, 180°), it may perform step S804. If the image processing module determines that the average value is not within (-30°, 30°), (-180°, -150°) or (150°, 180°), it may perform the following steps S810 to S814.
The three intervals (-30°, 30°), (-180°, -150°) and (150°, 180°) are exemplary; the endpoints of the three intervals can be adjusted to other values. By judging whether the average value of the included angles between the direction vectors of the plurality of fingers and the horizontal direction is within these three intervals, the image processing module can infer whether the fingers of the human hand face left or right.
Thus, the image processing module judging whether the average value of the included angles between the direction vectors of the plurality of fingers and the horizontal direction is within (-30°, 30°), (-180°, -150°) or (150°, 180°) can be understood as: judging whether the average value of the included angles is within the angle intervals indicating that the fingers of the human hand face left or right.
S810, the image processing module judges whether the average value of the included angles is within the range of [30°, 150°].
The image processing module judges whether the average value θ of the included angles between the direction vectors of the plurality of fingers and the horizontal direction is within [30°, 150°]. The average value θ can indicate the overall trend of the human hand, namely the direction of the human hand. If the image processing module determines that the average value θ is within [30°, 150°], it can be inferred that the fingers of the human hand in the first image face upward; if the image processing module determines that the average value θ is not within [30°, 150°], it can be inferred that the fingers of the human hand in the first image face downward.
The image processing module determines that the average value of the included angles between the direction vectors of the plurality of fingers and the horizontal direction is within [30°, 150°], and may perform step S811 and step S812; the image processing module determines that the average value is not within [30°, 150°], and may perform step S813 and step S814.
In some embodiments, the image processing module may instead determine whether the average value of the included angles is within [-150°, -30°]; in other embodiments, the image processing module may determine both whether the average value of the included angles is within [30°, 150°] and whether it is within [-150°, -30°].
S811, the image processing module sends a second message to the SystemUI, wherein the second message indicates that the hand in the first image is in an upward direction.
S812, displaying a small-hand icon with the finger facing upwards on a display screen by the SystemUI.
And S813, the image processing module sends a third message to the SystemUI, wherein the third message indicates that the hand in the first image is in the downward direction.
S814, displaying a small hand icon with the downward finger on the display screen by the SystemUI.
The specific implementation of step S811 to step S814 can be referred to the content of step S510 to step S513 in the foregoing embodiment, and will not be described herein.
As can be seen from the two embodiments described above, in order to realize the man-machine interaction in which the user inputs an initial gesture to trigger the electronic device to prepare for an air gesture, a method for detecting a hand direction according to an embodiment of the present application may, as shown in fig. 10, include:
s1001, the electronic equipment acquires an image shot by the camera.
After the display screen of the electronic device is lit, the camera is started to capture images, and the electronic device acquires an image captured by the camera.
S1002, the electronic device detects whether the hand in the image is of the palm and back type.
The electronic device detects that the hand in the image is of the palm and back type, and executes step S1003; the electronic device detects that the hand in the image is not of the palm and back type, and then steps S1004 to S1010 are performed.
The electronic device adopts a hand detection algorithm to detect whether the hand in the image shot by the camera is of the palm and back of hand type, and the implementation of the electronic device to detect whether the hand in the image shot by the camera is of the palm and back of hand type can be referred to the content of step S503 in the foregoing embodiment, which is not described herein again.
S1003, the electronic equipment determines the direction of the human hand in the image to be ambiguous.
The electronic device detects that the hand in the image is not of the palm or back-of-hand type, and can infer that the user does not intend to interact with the electronic device through an air gesture, so the electronic device does not further recognize the finger direction of the hand in the image.
S1004, the electronic equipment cuts out the hand region image of the human hand from the image.
The specific implementation manner of this step may be referred to the content of step S505 in the foregoing embodiment, and will not be described herein.
S1005, the electronic equipment detects coordinates of the hand key points in the hand area image.
The specific implementation manner of this step may be referred to the content of step S506 in the foregoing embodiment, and will not be described herein.
S1006, the electronic equipment calculates the included angle between the finger direction vector of the human hand and the horizontal direction by using the coordinates of the hand keypoints.
Step S507 in the foregoing embodiment, and step S807 and step S808 in the foregoing embodiment are two implementation manners of the electronic device to perform this step, and the specific implementation manner of this step may refer to the content of the foregoing embodiment, which is not repeated herein.
S1007, the electronic equipment judges whether the included angle between the finger direction vector of the human hand and the horizontal direction is within the range of (-30°, 30°), (-180°, -150°) or (150°, 180°).
If the electronic device determines that the included angle between the finger direction vector of the human hand and the horizontal direction is within (-30°, 30°), (-180°, -150°) or (150°, 180°), step S1003 is executed; if the electronic device determines that the included angle is not within (-30°, 30°), (-180°, -150°) or (150°, 180°), steps S1008 to S1010 are executed.
For the specific implementation of this step, reference may be made to the content of step S508 and step S809 in the foregoing embodiments, which are not described herein.
S1008, the electronic equipment judges whether the included angle between the finger direction vector of the human hand and the horizontal direction is within the range of [30°, 150°].
The electronic device judges that the included angle between the finger direction vector of the human hand and the horizontal direction is within the range of [30°, 150°], and the electronic device executes step S1009; the electronic device determines that the included angle is not within the range of [30°, 150°], and the electronic device executes step S1010.
For the specific implementation of this step, reference may be made to the content of step S509 and step S810 in the foregoing embodiments, which are not described herein again.
S1009, the electronic device displays a small-hand icon with the finger facing upward on the display screen.
The electronic device judges that the included angle between the finger direction vector of the human hand and the horizontal direction is within the range of [30°, 150°], and can presume that the fingers of the human hand in the image captured by the camera face upward. Based on this, the electronic device displays a small hand icon with the finger facing upward on the display screen, as shown in (b) of fig. 1.
S1010, the electronic equipment displays a small hand icon with the downward finger on a display screen.
The electronic device judges that the included angle between the finger direction vector of the human hand and the horizontal direction is not within the range of [30°, 150°], and can presume that the fingers of the human hand in the image captured by the camera face downward. Based on this, the electronic device displays a small hand icon with the finger facing downward on the display screen, as shown in (b) of fig. 2.
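Putting the fig. 10 flow together, a high-level sketch might look as follows. The four callables are hypothetical hooks for the hand-type detector, the crop step, the keypoint model and the SystemUI icon display, and the helpers first_vector_and_angle and classify_hand_direction are the illustrative sketches given earlier; none of these names come from the embodiments themselves.

```python
def detect_hand_direction_and_prompt(image, is_palm_or_back, crop_hand,
                                     detect_keypoints, show_icon) -> str:
    """End-to-end sketch of the fig. 10 flow (steps S1001 to S1010)."""
    if not is_palm_or_back(image):                    # S1002: not palm/back-of-hand type
        return "ambiguous"                            # S1003: direction is ambiguous
    keypoints = detect_keypoints(crop_hand(image))    # S1004, S1005
    _, theta = first_vector_and_angle(keypoints)      # S1006 (first embodiment)
    direction = classify_hand_direction(theta)        # S1007, S1008
    if direction == "left_or_right":                  # angle in the left/right intervals
        return "ambiguous"                            # back to S1003
    show_icon("finger_up" if direction == "up" else "finger_down")  # S1009 / S1010
    return direction
```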
Another embodiment of the application also provides a computer-readable storage medium having instructions stored therein, which when run on a computer or processor, cause the computer or processor to perform one or more steps of any of the methods described above.
The computer readable storage medium may be a non-transitory computer readable storage medium, for example, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Another embodiment of the application also provides a computer program product containing instructions. The computer program product, when run on a computer or processor, causes the computer or processor to perform one or more steps of any of the methods described above.

Claims (12)

1. A method for detecting a hand direction, comprising:
the method comprises the steps that a display screen of electronic equipment is in a bright screen state, and an image shot by a camera of the electronic equipment is obtained;
calculating an included angle between the finger direction and the horizontal direction of the hand in the image;
if it is determined that the included angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the finger faces the first direction, displaying a hand icon with the finger facing the first direction on the display screen; and if it is determined that the included angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the finger faces the second direction, displaying a hand icon with the finger facing the second direction on the display screen.
2. The method for detecting a hand direction according to claim 1, wherein after the capturing of the image captured by the camera of the electronic device, further comprising: detecting that the hand in the image is of the palm and back type.
3. The method for detecting a hand direction according to claim 1 or 2, wherein after the capturing of the image captured by the camera of the electronic device, further comprises:
clipping a hand region image from the image;
detecting coordinates of a hand key point of the hand region image;
The calculating the included angle between the finger direction of the hand in the image and the horizontal direction comprises the following steps: calculating the included angle between the finger direction vector of the hand and the horizontal direction by using the coordinates of the hand keypoints.
4. The method for detecting a hand direction according to claim 3, wherein the calculating an angle between a finger direction vector of the hand and a horizontal direction by using coordinates of the hand key points comprises:
calculating a vector of finger root coordinates pointing to fingertip coordinates in coordinates of the hand key points to obtain a finger direction vector of the hand in the image;
and calculating the included angle between the finger direction vector of the hand in the image and the horizontal direction.
5. The method according to claim 4, wherein calculating the vector of the finger-root coordinates of the hand among the coordinates of the hand keypoint to the fingertip coordinates, to obtain the finger-direction vector of the hand in the image, comprises:
calculating to obtain the average coordinates of the finger roots of the plurality of fingers in the image by using the coordinates of the finger roots of the plurality of fingers in the coordinates of the hand key points, and calculating to obtain the average coordinates of the finger tips of the plurality of fingers in the image by using the coordinates of the finger tips of the plurality of fingers in the coordinates of the hand key points, wherein the plurality of fingers at least comprise middle fingers;
And calculating the vector of the average coordinates of the finger roots of the plurality of fingers pointing to the average coordinates of the finger tips of the plurality of fingers, and obtaining the finger direction vector of the hand in the image.
6. The method for detecting a hand direction according to claim 3, wherein the calculating an angle between a finger direction vector of the hand and a horizontal direction by using coordinates of the hand key points comprises:
calculating finger direction vectors of a plurality of fingers of the hand in the image by utilizing the finger root coordinates and the fingertip coordinates of the plurality of fingers in the coordinates of the hand key points, wherein the finger direction vectors of the plurality of fingers at least comprise finger direction vectors of middle fingers;
calculating the included angles between the finger direction vectors and the horizontal directions of a plurality of fingers to obtain a plurality of included angles;
and calculating the average of the included angles to obtain the included angles of the finger direction and the horizontal direction of the hand in the image.
7. The method for detecting a hand direction according to any one of claims 1 to 6, wherein the determining that the included angle between the finger direction of the hand in the image and the horizontal direction satisfies the condition that the finger faces the first direction comprises:
judging that the included angle between the finger direction of the hand in the image and the horizontal direction is within the angle range corresponding to the finger facing upward.
8. The method according to any one of claims 1 to 7, further comprising, before determining that an angle between a finger direction of a hand in the image and a horizontal direction satisfies a condition that the finger is oriented in the first direction:
and judging the included angle between the finger direction and the horizontal direction of the hand in the image, wherein the condition that the finger faces the third direction is not satisfied.
9. The method for detecting a hand direction according to claim 8, wherein the determining that the angle between the finger direction and the horizontal direction of the hand in the image does not satisfy the condition that the finger is oriented toward the third direction includes:
and judging the included angle between the finger direction and the horizontal direction of the hand in the image, wherein the included angle is not in the range of the included angle corresponding to the left and right directions of the finger.
10. The method for detecting a hand direction according to claim 9, wherein if it is determined that the angle between the finger direction and the horizontal direction of the hand in the image satisfies the condition that the finger is oriented in the second direction, displaying a hand icon with the finger oriented in the second direction on the display screen includes:
and if the included angle between the finger direction and the horizontal direction of the hand in the image is judged to not meet the condition that the finger faces the first direction, displaying a hand-shaped icon with the finger facing the second direction on the display screen.
11. An electronic device, comprising:
one or more processors, memory, and a display screen;
the memory and the display screen being coupled to the one or more processors, the memory being for storing computer program code comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the hand direction detection method of any one of claims 1 to 10.
12. A computer-readable storage medium for storing a computer program, which, when executed, is adapted to carry out a method of detecting a hand direction according to any one of claims 1 to 10.
CN202310189865.6A 2023-02-22 2023-02-22 Hand direction detection method, electronic device and readable medium Pending CN117111727A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310189865.6A CN117111727A (en) 2023-02-22 2023-02-22 Hand direction detection method, electronic device and readable medium

Publications (1)

Publication Number Publication Date
CN117111727A true CN117111727A (en) 2023-11-24

Family

ID=88797184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310189865.6A Pending CN117111727A (en) 2023-02-22 2023-02-22 Hand direction detection method, electronic device and readable medium

Country Status (1)

Country Link
CN (1) CN117111727A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090217211A1 (en) * 2008-02-27 2009-08-27 Gesturetek, Inc. Enhanced input using recognized gestures
CN111062360A (en) * 2019-12-27 2020-04-24 恒信东方文化股份有限公司 Hand tracking system and tracking method thereof
US20200346546A1 (en) * 2017-12-26 2020-11-05 Lg Electronics Inc. In-vehicle display device
CN112394811A (en) * 2019-08-19 2021-02-23 华为技术有限公司 Interaction method for air-separating gesture and electronic equipment
CN112527093A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Gesture input method and electronic equipment
CN112799574A (en) * 2021-02-23 2021-05-14 京东方科技集团股份有限公司 Display control method and display device
CN114942721A (en) * 2021-02-10 2022-08-26 夏普株式会社 Display device, display method, and recording medium having display program recorded thereon


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination