CN110908514A - Palm posture recognition method and device - Google Patents


Info

Publication number
CN110908514A
CN110908514A (application CN201911141999.0A)
Authority
CN
China
Prior art keywords
mobile terminal
finger
screen image
touch
palm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911141999.0A
Other languages
Chinese (zh)
Inventor
赵梓宏
周荣刚
谭北平
Current Assignee
Beihang University
Beijing Mininglamp Software System Co ltd
Original Assignee
Beihang University
Beijing Mininglamp Software System Co ltd
Priority date
Filing date
Publication date
Application filed by Beihang University, Beijing Mininglamp Software System Co ltd filed Critical Beihang University
Priority: CN201911141999.0A
Publication: CN110908514A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a palm posture recognition method and device applied to a mobile terminal whose frame is provided with a touch sensor. The touch sensor identifies the touch information of each finger that contacts the frame when a user holds the mobile terminal. The method comprises: acquiring the touch information of each finger output by the touch sensor, and determining the pattern shape formed by that touch information; and obtaining the palm posture mapped by the determined pattern shape according to a pre-stored correspondence between sample pattern shapes and palm postures of holding the mobile terminal. The palm postures include holding the mobile terminal horizontally and holding the mobile terminal vertically.

Description

Palm posture recognition method and device
Technical Field
The application relates to the field of mobile terminals, in particular to a palm posture recognition method and device.
Background
With the development of mobile terminal devices, people are no longer limited to working or being entertained on fixed, hard-to-carry devices. Mobile terminals, and smartphones in particular, have become an indispensable tool in people's lives.
Some existing basic functions of the smartphone (such as screen rotation, assistive buttons, and the on-screen keyboard) are switched through gravity sensing or manual settings. In some specific scenarios, however, these operations are cumbersome for the user. For example, a user lying on his side may want to watch a video in landscape mode, but because screen rotation is controlled by gravity sensing, the phone remains in portrait mode; the user must first rotate the smartphone to change the screen image direction to landscape, lock the screen image direction, and only then watch the video. This happens because gravity sensing uses only gravity as its reference, and it is troublesome for the user.
Disclosure of Invention
In view of the above, an object of the present application is to provide a palm posture recognition method that solves the prior-art problem of determining the palm posture with which a user holds a mobile terminal.
In a first aspect, an embodiment of the present application provides a palm gesture recognition method, which is applied to a mobile terminal, where a frame of the mobile terminal is provided with a touch sensor, and the touch sensor is used to recognize touch information of each finger that is in contact with the frame of the mobile terminal when a user holds the mobile terminal, and the method includes:
acquiring touch information of each finger output by a touch sensor, and determining a pattern shape formed by the touch information of each finger according to the touch information of each finger;
obtaining the palm posture of holding the mobile terminal mapped by the determined pattern shape according to a pre-stored correspondence between sample pattern shapes and palm postures of holding the mobile terminal; the palm postures include holding the mobile terminal horizontally and holding the mobile terminal vertically.
According to a first aspect, embodiments of the present application provide a first possible implementation manner of the first aspect, where the touch information includes: touch area information;
determining a pattern shape formed by the touch information of each finger according to the touch information of each finger, wherein the pattern shape comprises the following steps:
and determining touch areas corresponding to the touch area information of the fingers on the mobile terminal frame, and forming the pattern shape according to the touch areas corresponding to the fingers.
According to the first aspect, an embodiment of the present application provides a second possible implementation manner of the first aspect, where the touch information includes: fingerprint information and touch area information;
determining a pattern shape formed by the touch information of each finger according to the touch information of each finger, wherein the pattern shape comprises the following steps:
calling a fingerprint library of the user;
determining the finger type corresponding to each finger according to the fingerprint information of each finger and the fingerprint database;
and determining the pattern shape formed by each finger according to the finger type and the touch area information of each finger.
According to the first aspect, an embodiment of the present application provides a third possible implementation manner of the first aspect, where the mobile terminal is further provided with a touch sensor on the back surface, and the touch sensor is configured to identify touch information of each finger that is in contact with the back surface of the mobile terminal when a user holds the mobile terminal;
determining a pattern shape formed by the touch information of each finger according to the touch information of each finger, wherein the pattern shape comprises the following steps:
and determining a touch area corresponding to the touch area information of each finger on the frame of the mobile terminal and a touch area corresponding to the touch area information of each finger on the back of the mobile terminal, and forming the pattern shape according to the touch area corresponding to each finger.
According to the first aspect, embodiments of the present application provide a fourth possible implementation manner of the first aspect, where after obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between the sample pattern shape and the palm posture of the handheld mobile terminal, the method further includes:
obtaining the direction of the screen image mapped by the current palm posture according to the corresponding relation between the pre-stored sample palm posture and the direction of the screen image;
judging whether the direction of the screen image mapped by the current palm posture is consistent with the direction of the current screen image of the mobile terminal;
if not, changing the screen image direction of the mobile terminal to be the screen image direction mapped by the current palm posture;
and if the screen image direction is consistent with the current screen image direction, the mobile terminal keeps the current screen image direction.
According to the first aspect, this application provides a fifth possible implementation manner of the first aspect, where after obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between the sample pattern shape and the palm posture of the handheld mobile terminal, the method further includes:
obtaining an input panel display scheme mapped by the current palm posture according to a pre-stored corresponding relation between the sample palm posture and the input panel display scheme;
judging whether the display scheme of the input panel mapped by the current palm posture is consistent with the display scheme of the current input panel of the mobile terminal;
if not, changing the input panel display scheme of the mobile terminal to the input panel display scheme mapped by the current palm posture;
and if the input panel display scheme is consistent with the input panel display scheme, the mobile terminal keeps the current input panel display scheme.
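The compare-and-switch logic of this fifth implementation can be sketched as follows. This is an illustration only: the posture names, the scheme names, and the correspondence table are hypothetical stand-ins for whatever the terminal actually stores, not part of the application.

```python
# Hypothetical pre-stored correspondence between sample palm postures
# and input panel display schemes.
SAMPLE_POSTURE_TO_PANEL = {
    "vertical_left_hand": "keyboard_shifted_left",
    "vertical_right_hand": "keyboard_shifted_right",
    "horizontal": "full_width_keyboard",
}

def update_input_panel(current_scheme, posture):
    """Look up the panel scheme mapped by the current palm posture and
    switch only if it differs from the scheme currently displayed."""
    mapped = SAMPLE_POSTURE_TO_PANEL[posture]
    if mapped == current_scheme:
        return current_scheme, False   # keep the current display scheme
    return mapped, True                # switch to the mapped scheme
```

The boolean flag lets the caller skip a redraw when nothing changed, mirroring the "if consistent, keep" branch of the method.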
In a second aspect, an embodiment of the present application provides a palm posture recognition device applied to a mobile terminal, where a touch sensor is disposed on the frame of the mobile terminal and is used for recognizing the touch information of each finger that contacts the frame when a user holds the mobile terminal, and the device includes:
the processing module is used for acquiring touch information of each finger output by the touch sensor and determining a pattern shape formed by the touch information of each finger according to the touch information of each finger;
the first matching module is used for obtaining the palm posture of holding the mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between sample pattern shapes and palm postures of holding the mobile terminal; the palm postures include holding the mobile terminal horizontally and holding the mobile terminal vertically.
According to the second aspect, an embodiment of the present application provides a first possible implementation manner of the second aspect, where the apparatus further includes:
the second matching module is used for obtaining the screen image direction mapped by the current palm posture according to the corresponding relation between the pre-stored sample palm posture and the screen image direction;
the judging module is used for judging whether the direction of the screen image mapped by the current palm posture is consistent with the direction of the current screen image of the mobile terminal; if not, changing the screen image direction of the mobile terminal to be the screen image direction mapped by the current palm posture; and if the screen image direction is consistent with the current screen image direction, the mobile terminal keeps the current screen image direction.
In a third aspect, an embodiment of the present application provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor implements the steps of the method according to any one of the first aspect and possible implementation manners when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, performs the steps of the method of any one of the above first aspect and possible implementations thereof.
According to the palm posture recognition method and device provided by the embodiments of the application, the touch sensor arranged on the frame of the mobile terminal obtains the touch information of each finger, from which the pattern shape formed by the fingers on the frame is determined; the palm posture with which the user holds the mobile terminal is then determined according to the pre-stored correspondence between sample pattern shapes and palm postures. The method and device can quickly identify the user's palm posture and apply it to the functions of the mobile terminal, improving the user's convenience and efficiency of use.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic flowchart of a palm posture recognition method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a palm posture recognition method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a palm pose provided by an embodiment of the present application;
fig. 4 is a schematic structural diagram of a palm posture identifying device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a palm gesture recognition method, which is applied to a mobile terminal, wherein a touch sensor is arranged on a frame of the mobile terminal, and the touch sensor is used for recognizing touch information of fingers which are in contact with the frame of the mobile terminal when a user holds the mobile terminal, and as shown in fig. 1, the palm gesture recognition method comprises the following steps:
step S101, acquiring touch information of each finger output by a touch sensor, and determining a pattern shape formed by the touch information of each finger according to the touch information of each finger;
step S102, obtaining the palm posture of holding the mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between sample pattern shapes and palm postures of holding the mobile terminal; the palm postures include holding the mobile terminal horizontally and holding the mobile terminal vertically.
When a user holds the mobile terminal, the touch sensor arranged on its frame identifies the touch information of each finger the user places on the frame. By analyzing this touch information, the mobile terminal determines the pattern shape that the fingers form on the frame, matches it against the pre-stored sample pattern shapes, and takes the palm posture corresponding to the sample pattern shape with the highest matching degree as the palm posture mapped by the pattern shape.
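The matching flow above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the `TouchRegion` fields, the sample library contents, and the dissimilarity metric are all hypothetical, and regions are assumed to arrive in a canonical order.

```python
from dataclasses import dataclass

@dataclass
class TouchRegion:
    edge: str      # frame edge the finger touches: "left", "right", "top", "bottom"
    center: float  # normalized position of the contact along that edge (0.0-1.0)
    area: float    # normalized contact area

# Hypothetical pre-stored sample library: pattern shapes mapped to palm postures.
SAMPLE_PATTERNS = {
    "vertical_left_hand": [TouchRegion("left", 0.5, 0.2),
                           TouchRegion("right", 0.4, 0.1),
                           TouchRegion("right", 0.6, 0.1)],
    "horizontal_right_hand": [TouchRegion("left", 0.5, 0.1),
                              TouchRegion("right", 0.2, 0.1),
                              TouchRegion("right", 0.5, 0.1),
                              TouchRegion("right", 0.8, 0.1)],
}

def pattern_distance(a, b):
    """Crude dissimilarity: mismatched region counts dominate,
    then wrong edges, then position offsets."""
    if len(a) != len(b):
        return 10.0 * abs(len(a) - len(b))
    return sum(abs(x.center - y.center) + (0.0 if x.edge == y.edge else 5.0)
               for x, y in zip(a, b))

def recognize_posture(observed):
    """Return the posture whose sample pattern shape best matches the observed one."""
    return min(SAMPLE_PATTERNS, key=lambda k: pattern_distance(observed, SAMPLE_PATTERNS[k]))
```

A production matcher would use a tolerant similarity measure rather than this ordered comparison, but the structure — observed shape, stored samples, nearest match — follows steps S101 and S102.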
Specifically, the palm postures include holding the mobile terminal horizontally and holding it vertically. Holding horizontally further includes holding it horizontally to the left (the left frame of the mobile terminal at the bottom of the user's field of view) and horizontally to the right (the right frame at the bottom of the field of view); holding vertically includes holding it upright (the lower frame at the bottom of the field of view) and upside down (the upper frame at the bottom of the field of view).
In an optional embodiment, the touch information includes: touch area information;
in the step S101, determining a pattern shape composed of the touch information of each finger according to the touch information of each finger includes:
step 1011, determining touch areas corresponding to the touch area information of each finger on the frame of the mobile terminal, and forming the pattern shape according to the touch areas corresponding to each finger.
Specifically, the pattern shape formed on the frame when the user holds the mobile terminal is obtained by analyzing the size, shape, and position of each finger's touch area on the frame of the mobile terminal, for subsequent matching of the palm posture.
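As a sketch of this step, contiguous contact samples reported along one frame edge can be clustered into per-finger touch areas; the sample format, the `gap` threshold, and the summary returned are assumptions for illustration, not details from the application.

```python
def group_touch_regions(samples, gap=0.03):
    """Cluster raw contact samples on one frame edge into touch regions.
    Each sample is (position, pressure), position normalized to 0.0-1.0;
    consecutive samples closer than `gap` are treated as one finger's patch."""
    regions = []
    for pos, pressure in sorted(samples):
        if regions and pos - regions[-1][-1][0] <= gap:
            regions[-1].append((pos, pressure))
        else:
            regions.append([(pos, pressure)])
    # Summarize each patch as (center, extent) - the size/position inputs
    # used later when matching the pattern shape against samples.
    return [(sum(p for p, _ in r) / len(r), r[-1][0] - r[0][0]) for r in regions]
```

Two nearby samples collapse into one region; an isolated sample becomes its own region with zero extent.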
In an optional embodiment, the touch information includes: fingerprint information and touch area information;
in step S101, determining a pattern shape formed by the touch information of each finger according to the touch information of each finger includes:
step 1012, calling a fingerprint database of the user;
step 1013, determining the finger type corresponding to each finger according to the fingerprint information of each finger and the fingerprint database;
step 1014, determining the pattern shape formed by the fingers according to the finger type and the touch area information of each finger.
Specifically, the fingers touching the frame of the mobile terminal are matched against the fingerprints of the fingers of both of the user's hands, entered by the user in advance; the pattern shape formed by the fingers holding the mobile terminal is then confirmed from the positional relationship between the matched fingers, for subsequent matching of the palm posture.
If only the number and positions of the touch areas on each frame of the mobile terminal are identified, interference is more likely to occur. Confirming, through the fingerprint information, which of the user's fingers corresponds to each touch area therefore effectively eliminates this unstable factor in palm posture recognition.
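The finger-labeling step can be sketched as below. The database layout, finger names, threshold, and the injected `match_score` function are all hypothetical; a real fingerprint matcher would come from the sensor stack.

```python
def identify_fingers(touches, fingerprint_db, match_score, threshold=0.8):
    """Label each touch region with the finger whose enrolled template best
    matches its sensed print, provided the score clears a threshold.
    `touches` is a list of (region, sensed_print); `fingerprint_db` maps
    finger names ("left_thumb", ...) to enrolled templates."""
    labeled = []
    for region, sensed in touches:
        finger, score = max(
            ((name, match_score(sensed, tmpl)) for name, tmpl in fingerprint_db.items()),
            key=lambda pair: pair[1])
        # Below-threshold matches are left unlabeled rather than guessed.
        labeled.append((region, finger if score >= threshold else None))
    return labeled
```

Regions that no enrolled fingerprint explains (e.g. a second person's finger) stay `None`, which is exactly the interference this embodiment filters out.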
In an optional embodiment, the mobile terminal is further provided with a touch sensor on the back surface, and the touch sensor is used for identifying touch information of each finger which is in contact with the back surface of the mobile terminal when a user holds the mobile terminal;
in the step S101, determining a pattern shape formed by the touch information of each finger according to the touch information of each finger includes:
step 1014, determining a touch area corresponding to the touch area information of each finger on the frame of the mobile terminal and a touch area corresponding to the touch area information of each finger on the back of the mobile terminal, and forming the pattern shape according to the touch area corresponding to each finger.
Specifically, not all fingers necessarily touch the frame when the user holds the mobile terminal. To determine more accurately the pattern shape formed by the user's grip, and so improve the accuracy of matching the palm posture from that shape, the touch information of fingers that may rest on the back of the mobile terminal must also be identified.
The pattern shape is formed from the touch information of each of the user's fingers on both the frame and the back of the mobile terminal, so the result of the mobile terminal's analysis is closer to reality.
In an optional embodiment, after obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between the sample pattern shape and the palm posture of the handheld mobile terminal in step S102, as shown in fig. 2, the method further includes:
step S1031, obtaining a screen image direction mapped by the current palm posture according to a pre-stored corresponding relation between the sample palm posture and the screen image direction;
step S1032, judging whether the direction of the screen image mapped by the current palm posture is consistent with the direction of the current screen image of the mobile terminal;
step S1033, if they are not consistent, changing the screen image direction of the mobile terminal to the screen image direction mapped by the current palm posture;
and S1034, if the image direction is consistent with the current screen image direction, the mobile terminal keeps the current screen image direction.
Specifically, after analyzing the palm posture with which the user holds the mobile terminal, the mobile terminal applies the palm posture to screen image rotation. The mobile terminal must store in advance a correspondence between sample palm postures and screen image directions, that is, for each palm posture, which frame of the mobile terminal lies at the bottom of the user's field of view; the bottom of the screen image should face that frame, so that the user sees an image that is upright from his point of view.
The sample palm postures are generalized postures and may differ from the posture with which the user actually holds the mobile terminal: the user may use fewer fingers, the spacing between some fingers may be larger, smaller, or overlapping, the user may hold the terminal with both hands, or two users may hold it at the same time. The palm posture obtained in step S102 therefore needs to be matched against the sample palm postures by similarity.
And matching and comparing the analyzed palm posture with a pre-stored sample palm posture to obtain a sample palm posture with the highest similarity, and taking the screen image direction corresponding to the sample palm posture as the screen image direction mapped by the current palm posture.
The screen image direction mapped by the current palm posture is compared with the current screen image direction of the mobile terminal. If they are the same, the screen image direction is left unchanged; if not, the screen image direction of the mobile terminal is adjusted to the direction mapped by the current palm posture, so as to present an upright image to the user.
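Steps S1031 to S1034 reduce to a small lookup-and-compare, sketched below; the posture and direction names and the correspondence table are hypothetical placeholders for the terminal's stored data.

```python
# Hypothetical pre-stored correspondence between sample palm postures
# and screen image directions (steps S1031-S1034).
SAMPLE_POSTURE_TO_DIRECTION = {
    "vertical_left_hand": "portrait",
    "horizontal_right_hand": "landscape",
}

def update_screen_direction(current_direction, recognized_posture):
    """Look up the direction mapped by the recognized posture and rotate
    only when it differs from the current screen image direction."""
    mapped = SAMPLE_POSTURE_TO_DIRECTION[recognized_posture]
    if mapped != current_direction:
        return mapped          # S1033: change the screen image direction
    return current_direction   # S1034: keep the current direction
```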
As shown in FIG. 3, panel groups a1, a2, b1, b2, c1, c2 are several basic sample palm poses.
For diagram group a1, the palm posture in the left diagram is the left hand holding the mobile terminal vertically, and in the right diagram the right hand holding it vertically. In both postures there are 1 and 2 touch areas on the two side frames, respectively, and 1 touch area on the upper frame of the mobile terminal (the horizontal frame on the receiver side). The touch sensor identifies each finger's touch area on the frame, and the corresponding palm posture is obtained from the pattern shape those touch areas form. When this posture is most similar to the posture of group a1, the screen image direction is determined, from the correspondence between sample palm postures and screen image directions, to be a vertical screen image with its upper end facing the lower frame of the mobile terminal. For example, if the current screen image direction is a horizontal screen image, it is rotated into a vertical screen image with its upper end facing the lower frame.
For diagram group a2, the palm posture in the left diagram is the left hand holding the mobile terminal vertically, and in the right diagram the right hand holding it vertically. In both postures there are 1 and 2 touch areas on the two side frames, respectively, and 1 touch area on the lower frame of the mobile terminal (the horizontal frame on the side without the receiver). The touch sensor identifies each finger's touch area on the frame, and the corresponding palm posture is obtained from the pattern shape those touch areas form. When this posture is most similar to the posture of group a2, the screen image direction is determined to be a vertical screen image with its upper end facing the upper frame of the mobile terminal. For example, if the current screen image direction is a horizontal screen image, it is rotated into a vertical screen image with its upper end facing the upper frame.
For diagram group b1, the palm posture in the upper diagram is the left hand holding the mobile terminal horizontally from the screen side, and in the lower diagram the right hand holding it horizontally from the screen side. In both postures there is 1 touch area on the left side frame of the mobile terminal, 3 touch areas on the right side frame, and no touch areas on the upper and lower frames. The touch sensor identifies each finger's touch area on the frame, and the corresponding palm posture is obtained from the pattern shape those touch areas form. When this posture is most similar to the posture of group b1, the screen image direction is determined to be a horizontal screen image with its upper end facing the right side frame. For example, if the current screen image direction is a vertical screen image, it is rotated into a horizontal screen image with its upper end facing the right side frame of the mobile terminal.
For diagram group b2, the palm posture in the upper diagram is the left hand holding the mobile terminal horizontally from the back side, and in the lower diagram the right hand holding it horizontally from the back side. In both postures there is 1 touch area on the left side frame of the mobile terminal and 1 touch area on the right side frame, the area of the touch area on the left side frame being smaller than that on the right side frame, with no touch areas on the upper and lower frames. The touch sensor identifies each finger's touch area on the frame, and the corresponding palm posture is obtained from the pattern shape those touch areas form. When this posture is most similar to the posture of group b2, the screen image direction is determined to be a horizontal screen image with its upper end facing the left side frame. For example, if the current screen image direction is a vertical screen image, it is rotated into a horizontal screen image with its upper end facing the left side frame of the mobile terminal.
For diagram group c1, the palm gesture of the upper diagram is that the mobile terminal is held transversely on the back side of the left hand, the palm gesture of the lower diagram is that the mobile terminal is held transversely on the back side of the right hand, both the palm gestures have 1 touch region on the left side frame of the mobile terminal, 1 touch region is on the right side frame of the mobile terminal, the area of the touch region on the left side frame is larger than that of the touch region on the right side frame, and no touch regions are on the upper and lower frames of the mobile terminal. The touch sensor identifies the touch area of each finger identified on the frame of the mobile terminal held by the user, the corresponding palm gesture is obtained according to the pattern shape formed by the touch area of each finger, when the similarity between the palm gesture and the palm gesture corresponding to the group c1 is the highest, the direction of the screen image is determined to be a horizontal screen image with the upper end facing the right side frame of the mobile terminal according to the corresponding relation between the sample palm gesture and the direction of the screen image, for example, the direction of the screen image of the current mobile terminal is a vertical screen image, and the direction of the screen image is rotated to be a horizontal screen image with the upper end facing the right side frame of the mobile terminal.
For diagram group c2, the palm posture of the upper diagram is the left hand holding the mobile terminal horizontally from the screen side, and that of the lower diagram is the right hand holding it horizontally from the screen side. Both postures produce 3 touch regions on the left side frame of the mobile terminal, 1 touch region on the right side frame, and no touch regions on the upper and lower frames. When the recognized palm posture is most similar to the sample posture of group c2, the screen image direction is determined to be a landscape screen image with the upper end facing the left side frame of the mobile terminal (the mirror image of group b1, since here the thumb lies on the right side frame); for example, a current portrait screen image is rotated to a landscape image with the upper end facing the left side frame.
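The four group-to-direction mappings above amount to a small lookup table keyed by the touch-region pattern on each side frame. The following Python sketch illustrates this under stated assumptions: the `EdgePattern` type, its field names, and the direction labels are illustrative choices, not interfaces from the patent, and for group c2 the sketch follows the finger-level description given further below (thumb on the right side frame, upper end facing the left side frame).

```python
# Hypothetical sketch of the touch-pattern -> screen-orientation lookup
# for diagram groups b1/b2/c1/c2. All names are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class EdgePattern:
    left: int          # number of touch regions on the left side frame
    right: int         # number of touch regions on the right side frame
    left_bigger: bool  # left-frame contact area larger than right-frame area

# sample palm postures -> which side frame the upper end of the image faces
SAMPLES = {
    EdgePattern(1, 3, False): "right",  # group b1
    EdgePattern(1, 1, False): "left",   # group b2 (left contact smaller)
    EdgePattern(1, 1, True):  "right",  # group c1 (left contact larger)
    EdgePattern(3, 1, True):  "left",   # group c2
}

def orientation_for(pattern: EdgePattern, current: str) -> str:
    """Return the screen image direction mapped by the palm posture;
    keep the current direction when no sample pattern matches."""
    return SAMPLES.get(pattern, current)
```

A real implementation would score similarity over contact areas rather than require exact matches; exact keys are used here only to keep the table readable.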
Compared with the prior-art approach of changing the screen image direction through the gravity sensor of the mobile terminal, the method provided by the embodiment of the application is more convenient. When a user operates the mobile terminal in a special posture, such as lying on one side, the prior art requires the user to first adjust the screen image to the desired direction with the gravity-sensing method, lock that direction, and only then use the mobile terminal normally. With the method of this embodiment, the user only needs to hold the mobile terminal normally, and the terminal automatically adjusts the screen image direction according to the palm posture with which it is held. This improves both the efficiency and the convenience of using the mobile terminal.
In combination with the fingerprint information in steps 1012 to 1014 above, it can further be determined which fingers of which hand lie on each frame of the mobile terminal. In group a1, the touch regions on the side frame with 2 touch regions correspond to the middle finger and the ring finger, the touch region on the side frame with 1 touch region corresponds to the thumb, and the touch region on the upper frame corresponds to the little finger. When the recognized current palm posture of the user is most similar to this sample, the correct screen image direction of the mobile terminal can be judged to be a portrait screen image with the upper end facing the lower frame of the mobile terminal.
In group a2, the touch regions on the side frame with 2 touch regions correspond to the middle finger and the ring finger, the touch region on the side frame with 1 touch region corresponds to the thumb, and the touch region on the lower frame corresponds to the little finger. When the recognized current palm posture of the user is most similar to this sample, the correct screen image direction of the mobile terminal can be judged to be a portrait screen image with the upper end facing the upper frame of the mobile terminal.
For group b1, the touch region on the left side frame of the mobile terminal corresponds to the thumb, and the touch regions on the right side frame correspond to the index, middle and ring fingers. When the recognized current palm posture is most similar to this sample, the correct screen image direction can be judged to be a landscape screen image with the upper end facing the right side frame of the mobile terminal.
For group b2, the touch region on the left side frame corresponds to the index finger, and the touch region on the right side frame corresponds to the thumb. When the recognized current palm posture is most similar to this sample, the correct screen image direction can be judged to be a landscape screen image with the upper end facing the left side frame of the mobile terminal.
For group c1, the touch region on the left side frame corresponds to the thumb, and the touch region on the right side frame corresponds to the index finger. When the recognized current palm posture is most similar to this sample, the correct screen image direction can be judged to be a landscape screen image with the upper end facing the right side frame of the mobile terminal.
For group c2, the touch regions on the left side frame correspond to the index, middle and ring fingers, and the touch region on the right side frame corresponds to the thumb. When the recognized current palm posture is most similar to this sample, the correct screen image direction can be judged to be a landscape screen image with the upper end facing the left side frame of the mobile terminal.
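Assuming fingerprint matching has already labeled each touch region with a finger type, the four landscape cases above reduce to simple set comparisons. This is a minimal sketch; the function name, finger labels, and return values are illustrative assumptions rather than interfaces from the patent.

```python
# Hedged sketch: combining fingerprint-identified finger types (steps
# 1012 to 1014) with the side frame each touch lies on, per groups b1-c2.
LANDSCAPE_RULES = [
    # (fingers on left frame, fingers on right frame, upper end faces)
    ({"thumb"}, {"index", "middle", "ring"}, "right"),  # group b1
    ({"index"}, {"thumb"}, "left"),                     # group b2
    ({"thumb"}, {"index"}, "right"),                    # group c1
    ({"index", "middle", "ring"}, {"thumb"}, "left"),   # group c2
]

def landscape_direction(left_fingers, right_fingers):
    """Return which side frame the upper end of the screen image faces,
    or None when the finger layout matches no stored sample posture."""
    for left, right, faces in LANDSCAPE_RULES:
        if set(left_fingers) == left and set(right_fingers) == right:
            return faces
    return None
```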
In an optional embodiment, after the palm posture of the handheld mobile terminal mapped by the determined pattern shape is obtained in step S102 according to the pre-stored correspondence between sample pattern shapes and palm postures, the method further includes:
Step 1041: obtain the input panel display scheme mapped by the current palm posture according to the pre-stored correspondence between sample palm postures and input panel display schemes;
Step 1042: judge whether the input panel display scheme mapped by the current palm posture is consistent with the current input panel display scheme of the mobile terminal;
Step 1043: if not, change the input panel display scheme of the mobile terminal to the input panel display scheme mapped by the current palm posture;
Step 1044: if so, keep the current input panel display scheme of the mobile terminal.
Specifically, as the screens of mobile terminals grow ever larger, most input methods for mobile terminals now provide a one-handed keyboard. However, the user can only fix the one-handed keyboard on the side of the hand he or she usually types with; when the user needs to switch the input hand, the input panel display scheme must be changed manually by tapping a switch button beside the one-handed keyboard.
After the palm posture with which the user holds the mobile terminal has been analyzed, it can be applied to switching the input panel display scheme. The mobile terminal pre-stores the correspondence between sample palm postures and input panel display schemes, matches the user's current palm posture against the sample postures, and takes the sample with the highest similarity. The input panel display scheme corresponding to that sample is then taken as the scheme mapped by the current palm posture and compared with the current input panel display scheme of the mobile terminal: if they are the same, no switch is needed; if they differ, the input panel display scheme of the mobile terminal is switched to the one mapped by the current palm posture.
Further, the method can also switch between a one-handed keyboard and a two-handed keyboard, because the current palm posture of the user also falls into the two categories of holding the mobile terminal with one hand and holding it with both hands. If the current palm posture is one-handed, the one-handed-keyboard display scheme is switched according to steps 1041 to 1044; if the current palm posture is two-handed, a full keyboard (two-handed keyboard) is displayed.
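Steps 1041 to 1044 can be sketched as a dictionary lookup followed by a comparison. The posture labels and scheme names below are illustrative assumptions, not identifiers from the patent.

```python
# Minimal sketch of steps 1041-1044: look up the input panel display
# scheme mapped by the current palm posture and switch only on mismatch.
POSTURE_TO_SCHEME = {
    "left_hand":  "one_handed_left",
    "right_hand": "one_handed_right",
    "both_hands": "full_keyboard",
}

def update_input_panel(current_posture: str, current_scheme: str) -> str:
    # Step 1041: scheme mapped by the best-matching sample palm posture;
    # keep the current scheme when the posture maps to nothing.
    mapped = POSTURE_TO_SCHEME.get(current_posture, current_scheme)
    # Steps 1042-1044: switch only when the schemes differ.
    return mapped if mapped != current_scheme else current_scheme
```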
This function, when implemented, may be combined with the fingerprint information in steps 1012 through 1014 above to confirm which hand the user is using.
If it is recognized from the fingerprint information in the contact information of the user holding the mobile terminal that the user uses only the left hand, the display scheme of the input panel corresponding to the palm posture should be a one-handed keyboard with the input panel on the left side of the current screen image.
Conversely, if it is recognized from the fingerprint information in the contact information of the user holding the mobile terminal that the user uses only the right hand, the input panel display scheme corresponding to the palm posture should be a one-handed keyboard with the input panel on the right side of the current screen image.
If it is recognized from the fingerprint information in the contact information of the user holding the mobile terminal that the user uses both hands, the display scheme of the input panel corresponding to the palm posture should be a two-handed keyboard.
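The three fingerprint-based cases above can be sketched as follows; the set-of-hands input and the scheme names are illustrative assumptions about how a fingerprint library might report its matches.

```python
# Hedged sketch: choose the keyboard display scheme from which hand(s)
# were recognized in the fingerprint contact information.
def keyboard_scheme(matched_hands: set) -> str:
    """matched_hands: hands ('left'/'right') whose fingerprints were
    recognized in the contact information of the held terminal."""
    if matched_hands == {"left"}:
        return "one_handed_left"   # panel on the left of the screen image
    if matched_hands == {"right"}:
        return "one_handed_right"  # panel on the right of the screen image
    return "full_keyboard"         # both hands: two-handed keyboard
```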
An embodiment of the present application further provides a palm posture recognition device, applied to a mobile terminal. A touch sensor is arranged on the frame of the mobile terminal and is used to recognize the touch information of each finger in contact with the frame when the user holds the mobile terminal. As shown in fig. 4, the device includes:
the processing module 30 is configured to obtain touch information of each finger output by the touch sensor, and determine a pattern shape formed by the touch information of each finger according to the touch information of each finger;
the first matching module 31 is configured to obtain the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between sample pattern shapes and palm postures of the handheld mobile terminal; the palm postures include holding the mobile terminal horizontally and holding the mobile terminal vertically.
In an alternative embodiment, after the first matching module 31, the apparatus further comprises:
the second matching module 32 is configured to obtain a screen image direction mapped by the current palm posture according to a pre-stored correspondence between the sample palm posture and the screen image direction;
a judging module 33, configured to judge whether a screen image direction mapped by the current palm posture is consistent with a current screen image direction of the mobile terminal; if not, changing the screen image direction of the mobile terminal to the screen image direction mapped by the current palm posture; and if the screen image direction is consistent with the current screen image direction, the mobile terminal keeps the current screen image direction.
Corresponding to the palm posture identifying method in fig. 1, an embodiment of the present application further provides a computer device 400, as shown in fig. 5, the device includes a memory 401, a processor 402, and a computer program stored on the memory 401 and operable on the processor 402, wherein the processor 402 implements the palm posture identifying method when executing the computer program.
Specifically, the memory 401 and the processor 402 may be a general-purpose memory and processor, which are not particularly limited here; when the processor 402 runs the computer program stored in the memory 401, the palm posture recognition method is executed, thereby solving the prior-art problem of determining the palm posture with which a user holds the mobile terminal.
Corresponding to a palm posture identifying method in fig. 1, the present application further provides a computer readable storage medium, on which a computer program is stored, and the computer program is executed by a processor to perform the steps of the palm posture identifying method.
Specifically, the storage medium may be a general-purpose storage medium, such as a removable disk or a hard disk; when the computer program on the storage medium is run, the palm posture recognition method is executed, likewise solving the prior-art problem of determining the palm posture with which a user holds the mobile terminal. The palm posture recognition method and device provided by the embodiments of the application can quickly recognize the palm posture of a user, apply it to functions of the mobile terminal, and improve the convenience and efficiency of use.
In the embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided in the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The above functions, if implemented in the form of software functional units and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above-described method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present application, used to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions of some technical features within the technical scope disclosed herein; such modifications, changes or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments and are intended to be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A palm gesture recognition method is applied to a mobile terminal, a touch sensor is arranged on a frame of the mobile terminal and used for recognizing touch information of fingers which are in contact with the frame of the mobile terminal when a user holds the mobile terminal, and the method comprises the following steps:
acquiring touch information of each finger output by a touch sensor, and determining a pattern shape formed by the touch information of each finger according to the touch information of each finger;
obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between sample pattern shapes and palm postures of the handheld mobile terminal; the palm postures comprise holding the mobile terminal horizontally and holding the mobile terminal vertically.
2. The method of claim 1, wherein the touch information comprises: touch area information;
determining a pattern shape formed by the touch information of each finger according to the touch information of each finger, wherein the pattern shape comprises the following steps:
and determining touch areas corresponding to the touch area information of the fingers on the mobile terminal frame, and forming the pattern shape according to the touch areas corresponding to the fingers.
3. The method of claim 1, wherein the touch information comprises: fingerprint information and touch area information;
determining a pattern shape formed by the touch information of each finger according to the touch information of each finger, wherein the pattern shape comprises the following steps:
calling a fingerprint library of the user;
determining the finger type corresponding to each finger according to the fingerprint information of each finger and the fingerprint database;
and determining the pattern shape formed by each finger according to the finger type and the touch area information of each finger.
4. The method according to claim 1, wherein a touch sensor is further provided on the back surface of the mobile terminal and is used for identifying the touch information of each finger in contact with the back surface of the mobile terminal when the user holds the mobile terminal;
determining a pattern shape formed by the touch information of each finger according to the touch information of each finger, wherein the pattern shape comprises the following steps:
and determining a touch area corresponding to the touch area information of each finger on the frame of the mobile terminal and a touch area corresponding to the touch area information of each finger on the back of the mobile terminal, and forming the pattern shape according to the touch area corresponding to each finger.
5. The method according to claim 1, wherein after obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between the sample pattern shape and the palm posture of the handheld mobile terminal, the method further comprises:
obtaining the direction of the screen image mapped by the current palm posture according to the corresponding relation between the pre-stored sample palm posture and the direction of the screen image;
judging whether the direction of the screen image mapped by the current palm posture is consistent with the direction of the current screen image of the mobile terminal;
if not, changing the screen image direction of the mobile terminal to be the screen image direction mapped by the current palm posture;
and if the screen image direction is consistent with the current screen image direction, the mobile terminal keeps the current screen image direction.
6. The method according to claim 1, wherein after obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between the sample pattern shape and the palm posture of the handheld mobile terminal, the method further comprises:
obtaining an input panel display scheme mapped by the current palm posture according to a pre-stored corresponding relation between the sample palm posture and the input panel display scheme;
judging whether the display scheme of the input panel mapped by the current palm posture is consistent with the display scheme of the current input panel of the mobile terminal;
if not, changing the input panel display scheme of the mobile terminal to the input panel display scheme mapped by the current palm posture;
and if it is consistent, the mobile terminal keeps the current input panel display scheme.
7. The palm gesture recognition device is applied to a mobile terminal, a touch sensor is arranged on a frame of the mobile terminal, and the touch sensor is used for recognizing touch information of fingers which are in contact with the frame of the mobile terminal when a user holds the mobile terminal, and the device comprises:
the processing module is used for acquiring touch information of each finger output by the touch sensor and determining a pattern shape formed by the touch information of each finger according to the touch information of each finger;
the first matching module is used for obtaining the palm posture of the handheld mobile terminal mapped by the determined pattern shape according to the pre-stored correspondence between sample pattern shapes and palm postures of the handheld mobile terminal; the palm postures comprise holding the mobile terminal horizontally and holding the mobile terminal vertically.
8. The apparatus of claim 7, wherein after the first matching module, the apparatus further comprises:
the second matching module is used for obtaining the screen image direction mapped by the current palm posture according to the corresponding relation between the pre-stored sample palm posture and the screen image direction;
the judging module is used for judging whether the direction of the screen image mapped by the current palm posture is consistent with the direction of the current screen image of the mobile terminal; if not, changing the screen image direction of the mobile terminal to be the screen image direction mapped by the current palm posture; and if the screen image direction is consistent with the current screen image direction, the mobile terminal keeps the current screen image direction.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of the preceding claims 1-6 are implemented by the processor when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, is adapted to carry out the steps of the method of any one of the preceding claims 1 to 6.
CN201911141999.0A 2019-11-20 2019-11-20 Palm posture recognition method and device Pending CN110908514A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911141999.0A CN110908514A (en) 2019-11-20 2019-11-20 Palm posture recognition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911141999.0A CN110908514A (en) 2019-11-20 2019-11-20 Palm posture recognition method and device

Publications (1)

Publication Number Publication Date
CN110908514A true CN110908514A (en) 2020-03-24

Family

ID=69818283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911141999.0A Pending CN110908514A (en) 2019-11-20 2019-11-20 Palm posture recognition method and device

Country Status (1)

Country Link
CN (1) CN110908514A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100093293A (en) * 2009-02-16 2010-08-25 주식회사 팬택 Mobile terminal with touch function and method for touch recognition using the same
CN103140822A (en) * 2010-10-13 2013-06-05 Nec卡西欧移动通信株式会社 Mobile terminal device and display method for touch panel in mobile terminal device
CN103946788A (en) * 2011-10-27 2014-07-23 夏普株式会社 Portable information terminal
CN105955632A (en) * 2016-04-19 2016-09-21 上海卓易科技股份有限公司 Horizontal and vertical screen rotation control method and device for handheld equipment


Similar Documents

Publication Publication Date Title
CN109428969B (en) Edge touch method and device of double-screen terminal and computer readable storage medium
CN104932809B (en) Apparatus and method for controlling display panel
CN109583320A (en) Fingerprint identification method and relevant apparatus
WO2011142317A1 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
CN106941780B (en) Human-computer interaction method and device of user terminal and user terminal
EP2859471A2 (en) Text recognition driven functionality
CN103473012A (en) Screen capturing method, device and terminal equipment
US20140362002A1 (en) Display control device, display control method, and computer program product
WO2014127697A1 (en) Method and terminal for triggering application programs and application program functions
CN111142674B (en) Control method and electronic equipment
US9846529B2 (en) Method for processing information and electronic device
KR20100062899A (en) Inputting method and device using touch pattern
EP3113047A1 (en) Search system, server system, and method for controlling search system and server system
US20170285904A1 (en) Direct data transfer electronic device and method
WO2013021879A1 (en) Information processing device, screen display method, control program and recording medium
CN112486346A (en) Key mode setting method and device and storage medium
EP3293605B1 (en) Widget displaying method and apparatus for use in flexible display device, computer program and recording medium
CN106843672A (en) A kind of terminal screen locking operation device and method
CN103927114A (en) Display method and electronic equipment
CN110658976B (en) Touch track display method and electronic equipment
CN112346597A (en) Touch processing method and device and electronic equipment
CN107422854A (en) Action identification method and terminal applied to virtual reality
WO2021128414A1 (en) Wearable device and input method thereof
EP4290338A1 (en) Method and apparatus for inputting information, and storage medium
CN110908514A (en) Palm posture recognition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200324