CN112799582A - Terminal, holding posture recognition method and device of terminal, and storage medium - Google Patents

Terminal, holding posture recognition method and device of terminal, and storage medium

Info

Publication number
CN112799582A
Authority
CN
China
Prior art keywords: terminal, detection information, back shell, screen, force acting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110204737.5A
Other languages
Chinese (zh)
Inventor
姜皓
钟桂林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110204737.5A
Publication of CN112799582A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a terminal, a holding posture recognition method and device of the terminal, and a storage medium. The terminal includes: a back shell; a display screen mounted in the back shell, with its display surface exposed outside the back shell; a touch panel stacked with the display screen and configured to detect a force acting on the display surface to obtain first detection information; a sensor located between the display screen and the back shell and configured to detect a force acting on the back shell to obtain second detection information; and a processing module connected with the touch panel and the sensor, configured to receive the first detection information and the second detection information and to determine the holding posture of the terminal from them; the holding posture includes: landscape holding and/or portrait holding. In this way, the accuracy of the determination and the user experience are improved.

Description

Terminal, holding posture recognition method and device of terminal, and storage medium
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a terminal, a method and an apparatus for recognizing a holding gesture of the terminal, and a storage medium.
Background
With the development of communication technology and terminal devices, more and more users use terminal devices to view text, video and other content. To give the user a better experience, a terminal device can switch the display mode of its screen according to the state of the device while it is being used, that is, switch between landscape display and portrait display.
In the related art, the terminal device determines its holding state, and switches the display mode accordingly, based on information provided by a gravity sensor. The accuracy of this detection is not high, however, which degrades the user experience.
Disclosure of Invention
The disclosure provides a terminal, a holding gesture recognition method and device of the terminal, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a terminal, including:
a back shell;
a display screen mounted in the back shell, with its display surface exposed outside the back shell;
a touch panel stacked with the display screen and configured to detect a force acting on the display surface to obtain first detection information;
a sensor located between the display screen and the back shell and configured to detect a force acting on the back shell to obtain second detection information;
a processing module connected with the touch panel and the sensor, configured to receive the first detection information and the second detection information and to determine the holding posture of the terminal according to the first detection information and the second detection information; the holding posture includes: landscape holding and/or portrait holding.
Optionally, the first detection information includes: information of a first position of a force acting on the display surface; the second detection information includes: information characterizing a second position of a force acting on the back shell;
the processing module is configured to determine an action region on the display surface according to the information of the first position, determine whether the force acting on the back shell is located in a preset target region according to the information characterizing the second position, and determine the holding posture of the terminal according to the action region and the preset target region.
Optionally, the processing module is configured to determine that the terminal is held in portrait when the region length of the action region is greater than a length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is not located in the preset target region; and/or to determine that the terminal is held in landscape when the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is located in the preset target region.
Optionally, the preset target area is distributed on the back shell, or on a side edge connecting the back shell and the display screen.
Optionally, the terminal further includes:
the gravity sensor is positioned below the display screen and used for detecting the gravity data of the terminal;
the processing module is connected with the gravity sensor and used for receiving the gravity data and determining the holding posture of the terminal according to the gravity data when the holding posture of the terminal cannot be determined according to the first detection information and the second detection information.
Optionally, the sensor is a Sar sensor, and the terminal further includes:
the antenna is positioned between the display screen and the back shell and covers the back shell;
and the Sar sensor is used for obtaining the second detection information through the antenna.
According to a second aspect of the embodiments of the present disclosure, there is provided a holding gesture recognition method for a terminal, which is applied to a terminal, where the terminal includes a touch screen and a back shell, the touch screen is installed in the back shell and exposes a display surface of the touch screen outside the back shell, and the method includes:
acquiring first detection information, detected by the touch screen, of a force acting on the display surface;
acquiring second detection information, detected by the sensor, of a force acting on the back shell;
determining the holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture includes: landscape holding and/or portrait holding.
Optionally, the first detection information includes: information of a first position of a force acting on the display surface; the second detection information includes: information characterizing a second position of a force acting on the back shell;
determining the holding posture of the terminal according to the first detection information and the second detection information, including:
determining an acting area acting on the display surface according to the information of the first position, and determining whether the acting force acting on the back shell is positioned in a preset target area according to the information representing the second position; and determining the holding posture of the terminal according to the action area and the preset target area.
Optionally, the determining the holding posture of the terminal according to the action region and the preset target region includes:
determining that the terminal is held in portrait if the region length of the action region is greater than a length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is not located in the preset target region;
determining that the terminal is held in landscape if the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is located in the preset target region.
Optionally, the acquiring the first detection information, detected by the touch screen, of the force acting on the display surface and the acquiring the second detection information, detected by the sensor, of the force acting on the back shell include:
acquiring the first detection information and the second detection information periodically at a time interval.
Optionally, the terminal further includes a gravity sensor, and the method further includes:
acquiring gravity data detected by the gravity sensor;
and if the holding posture of the terminal cannot be determined according to the first detection information and the second detection information, determining the holding posture of the terminal according to the gravity data.
According to a third aspect of the embodiments of the present disclosure, there is provided a holding gesture recognition apparatus for a terminal, which is applied to a terminal, the terminal includes a touch screen and a back shell, the touch screen is installed in the back shell and exposes a display surface of the touch screen outside the back shell, the apparatus includes:
a first acquisition module configured to acquire first detection information, detected by the touch screen, of a force acting on the display surface;
a second acquisition module configured to acquire second detection information, detected by the sensor, of a force acting on the back shell;
a determining module configured to determine the holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture includes: landscape holding and/or portrait holding.
Optionally, the first detection information includes: information of a first position of a force acting on the display surface; the second detection information includes: information characterizing a second position of a force acting on the back shell;
the determining module is specifically configured to determine an action region on the display surface according to the information of the first position, determine whether the force acting on the back shell is located in a preset target region according to the information characterizing the second position, and determine the holding posture of the terminal according to the action region and the preset target region.
Optionally, the determining module is specifically configured to determine that the terminal is held in portrait if the region length of the action region is greater than a length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is not located in the preset target region; and to determine that the terminal is held in landscape if the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is located in the preset target region.
Optionally, the first obtaining module is specifically configured to obtain the first detection information at time intervals; the second obtaining module is specifically configured to obtain the second detection information at time intervals.
Optionally, the terminal further includes a gravity sensor, and the apparatus further includes:
a third acquisition module configured to acquire gravity data detected by the gravity sensor;
the determining module is further configured to determine the holding posture of the terminal according to the gravity data if the holding posture of the terminal cannot be determined according to the first detection information and the second detection information.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a terminal, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the holding posture recognition method described in the second aspect above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a storage medium including:
the instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the holding posture recognition method described in the second aspect above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
In the embodiment of the present disclosure, the determination does not rely only on touch screen data (the first detection information detected by the touch panel); a sensor in the terminal also detects the force acting on the back shell to obtain second detection information, and the holding posture of the terminal is determined from the first detection information and the second detection information together. On the one hand, this adds a data source, so the holding posture is obtained by comprehensive analysis, misjudgments caused by relying on touch screen data alone are reduced, and the accuracy of the determination is improved. On the other hand, the required contact area on the touch screen is reduced, and the scheme also suits scenarios in which the user uses the phone while lying on the side or on the back, so the disclosed scheme is more widely applicable and the user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure.
Fig. 2 is an exemplary diagram for detecting a holding posture of a cellular phone by using a gravity sensor.
Fig. 3 is an exemplary view of a user lying on his back while using the handset.
Fig. 4 is an exemplary view of a user lying on his side while using the handset.
Fig. 5 is an exemplary diagram of the length of an action region on the display of a mobile phone.
Fig. 6 is an exemplary view of preset target regions on the back of a mobile phone.
Fig. 7 is an exemplary diagram of grip-region lengths.
Fig. 8 is a diagram illustrating determination of the action-region lengths on the display surface.
Fig. 9 is an exemplary view of a mobile phone held in portrait.
Fig. 10 is an exemplary view of a mobile phone held in landscape.
Fig. 11 is an exemplary diagram of the Sar sensor and antenna layout in a handset.
Fig. 12 is a flowchart illustrating a grip gesture recognition method according to an embodiment of the disclosure.
Fig. 13 is a diagram illustrating a grip posture identifying apparatus according to an exemplary embodiment.
Fig. 14 is a block diagram of a terminal shown in an embodiment of the disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a schematic structural diagram of a terminal shown in an embodiment of the present disclosure, and as shown in fig. 1, the terminal 100 includes:
a back shell 101;
a display screen 102 mounted in the back shell 101, with its display surface exposed outside the back shell 101;
a touch panel 103 stacked with the display screen 102 and configured to detect a force acting on the display surface to obtain first detection information;
a sensor 104 located between the display screen 102 and the back shell 101 and configured to detect a force acting on the back shell 101 to obtain second detection information;
a processing module 105 connected with the touch panel 103 and the sensor 104, configured to receive the first detection information and the second detection information and to determine the holding posture of the terminal 100 according to the first detection information and the second detection information; the holding posture includes: landscape holding and/or portrait holding.
In the embodiments of the present disclosure, the terminal may be a mobile phone, a tablet computer, a wearable device, or the like. The terminal includes a back shell 101 and a display screen 102, with the display surface of the display screen 102 exposed outside the back shell 101. Once assembled, the display screen 102 and the back shell 101 form an accommodating space that can house various functional devices, for example a camera module, an antenna module and an audio module. In the embodiments of the present disclosure, if the display surface of the display screen 102 is referred to as the "front surface", the surface on which the back shell 101 is located may be referred to as the "back surface".
The terminal further includes a touch panel 103 stacked with the display screen 102; for example, the touch panel 103 may lie on top of the display screen 102, or be built into the display screen 102 as a sensing layer. The touch panel 103 is configured to detect a force acting on the display surface to obtain first detection information. The first detection information may include coordinate information, and may also include information on the magnitude of the force, and the like.
The terminal further includes a sensor 104 located in the accommodating space formed by the display screen 102 and the back shell 101; for example, the sensor 104 may be disposed on the side of the display screen 102 that faces away from its display surface. The sensor 104 may be a pressure or proximity sensor that can detect both the approach and the touch of an object. In the embodiment of the present disclosure, the sensor 104 is used to detect the force acting on the back shell 101 and obtain second detection information. The second detection information may be coordinate information, or other information characterizing the position of the force; for example, "0" may indicate that the force acts within a preset target region and "1" that it acts outside the preset target regions.
The terminal further includes a processing module 105, which is also located in the accommodating space formed by the display screen 102 and the back shell 101. The processing module 105 is connected with the touch panel 103 and the sensor 104, and is configured to receive the first detection information and the second detection information and to determine the holding posture of the terminal from them; the holding posture includes landscape holding and portrait holding.
The terminal may control the display screen 102 to display in a display mode matching the determined holding posture, for example by switching between landscape display and portrait display. In addition, when a motion-sensing game is played on the terminal, corresponding game controls can be performed based on the determined holding posture.
In one embodiment, there is a method of determining the holding posture of a terminal based on a gravity sensor. Fig. 2 is an exemplary diagram of detecting the holding posture of a mobile phone by using a gravity sensor. As shown in Fig. 2, when the phone screen is perpendicular to the ground, the orientation of the phone can be recognized fairly accurately; when the phone screen is parallel to the ground, however, the gravity sensor cannot judge the orientation. The gravity sensor distinguishes landscape and portrait according to the components of gravity on the x and y axes; when the phone screen is parallel to the ground, gravity has a component only on the z axis and none on the x and y axes, so the gravity sensor cannot tell whether the phone is in landscape or portrait.
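The gravity-based decision described above can be sketched as follows; the axis convention (y along the long edge of the screen), the function name and the cut-off value are illustrative assumptions rather than details taken from the disclosure.

```python
import math

def orientation_from_gravity(gx, gy, gz, min_xy_norm=2.0):
    """Guess landscape/portrait from accelerometer components (m/s^2).

    Returns None when the screen is roughly parallel to the ground,
    i.e. gravity falls almost entirely on the z axis and the x/y
    components are too small to decide, as in the lying-flat case above.
    """
    if math.hypot(gx, gy) < min_xy_norm:
        return None
    # A dominant y component means the long edge points up or down: portrait.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```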
For example, Fig. 3 is an exemplary diagram of a user lying on his back while using a mobile phone. As shown in Fig. 3, the phone screen is parallel to the ground, so the landscape/portrait orientation cannot be determined accurately and the picture may be displayed in the wrong direction. Fig. 4 is an exemplary diagram of a user using a mobile phone while lying on his side. As shown in Fig. 4, when the user lies on his side, the phone may be in landscape relative to the ground but in portrait relative to the user; relying only on the gravity sensor, a landscape picture would be displayed. Conversely, the phone may be in portrait relative to the ground but in landscape relative to the user, in which case the gravity sensor alone would produce a portrait picture. Either way the posture of the phone relative to the user is misjudged and the picture direction is wrong.
In another embodiment, whether the phone is in landscape or portrait is distinguished by collecting touch screen data while the user holds the phone: different hand shapes in landscape and portrait grips produce different characteristics in the touch screen data, so analyzing that data can distinguish the two states. This approach has two problems. First, only the grip characteristics of the palm on the front screen are considered, so the posture is judged from a single dimension; moreover, sufficient contact between the user's palm and the display screen must be detected (for example, a touch area larger than a certain threshold), which makes the result highly dependent on how the user happens to hold the phone. Second, the judgment relies on touch screen data alone, a single data source, so misjudgments easily occur when the touch screen data is disturbed, for instance when there are water stains on the screen or in an environment with strong radio-frequency signals.
In contrast, the present disclosure does not rely only on touch screen data (the first detection information detected by the touch panel); a sensor in the terminal also detects the force acting on the back shell to obtain second detection information, and the holding posture of the terminal is determined from the first detection information and the second detection information together. On the one hand, this adds a data source, so the holding posture is obtained by comprehensive analysis, misjudgments caused by relying on touch screen data alone are reduced, and the accuracy of the determination is improved. On the other hand, the required contact area on the touch screen is reduced, and the scheme also suits scenarios in which the user uses the phone while lying on the side or on the back, so the disclosed scheme is more widely applicable and the user experience is improved.
In one embodiment, the first detection information includes: information of a first position of a force acting on the display surface; the second detection information includes: information characterizing a second position of a force acting on the back shell;
the processing module 105 is configured to determine an action region on the display surface according to the information of the first position, determine whether the force acting on the back shell 101 is located in a preset target region according to the information characterizing the second position, and determine the holding posture of the terminal according to the action region and the preset target region.
In an embodiment, the processing module is configured to determine that the terminal is held in portrait when the region length of the action region is greater than a length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is not located in the preset target region; and/or to determine that the terminal is held in landscape when the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell 101 is located in the preset target region.
In an embodiment of the present disclosure, the first detection information includes information of a first position of the force acting on the display surface. Since a terminal is usually gripped around its edges, the first position may be a position on the boundary of the display screen 102. After receiving the information of the first position, the processing module 105 may determine the action region on the display surface, and further determine information such as the region length, region area and region orientation of the action region.
In an embodiment of the present disclosure, the second detection information includes information characterizing a second position of the force acting on the back shell 101. As described above, "0" may indicate that the force on the back shell 101 lies within a preset target region and "1" that it lies outside the preset target regions. After receiving the second detection information, the processing module 105 can determine whether the force acting on the back shell 101 is located in a preset target region.
Since a held terminal typically makes contact with the user along the sides of the back shell 101, in one embodiment the preset target regions are distributed on the back shell 101 or on the sides connecting the back shell 101 and the display screen 102. In the embodiment of the present disclosure, a preset target region may be a local region of the back shell 101, for example a portion near an edge of the back shell 101. A preset target region may also be located on a side of the terminal, that is, the portion connecting the back shell 101 and the display screen 102. It should be noted that the position and size of each preset target region can generally be determined according to the size of the terminal and typical hand-shape characteristics of users.
Fig. 5 is an exemplary diagram of the action-region length on the display of a mobile phone. As shown in Fig. 5, the phone can obtain the contact length on the border area of the display.
Fig. 6 is an exemplary diagram of the preset target regions on the back of a mobile phone. As shown in Fig. 6, the preset target regions are distributed along the four sides of the phone, namely regions A, B, C and D. Region A and region D are arranged symmetrically, while region B and region C are staggered on different sides based on common holding habits of users.
Fig. 7 is an example of grip-region lengths. As shown in Fig. 7, the lengths of the action regions of the force acting on the display screen are L1 and L2, and the grip region is close to region D.
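To make the geometry concrete, the sketch below represents the preset target regions A to D as rectangles in back-shell coordinates and reports which regions a set of back contacts falls into. The coordinates, names and data structure are illustrative placeholders; the disclosure only states that the regions follow the terminal size and typical hand shapes.

```python
# Hypothetical preset target regions on the back shell, as rectangles
# (x0, y0, x1, y1) in millimetres; the values are placeholders.
PRESET_TARGET_REGIONS = {
    "A": (0.0, 0.0, 70.0, 25.0),     # short side, top
    "B": (0.0, 40.0, 8.0, 110.0),    # long side, left (staggered versus C)
    "C": (62.0, 60.0, 70.0, 130.0),  # long side, right
    "D": (0.0, 135.0, 70.0, 160.0),  # short side, bottom
}

def touched_target_regions(back_contacts):
    """Return the names of preset target regions containing any back contact.

    back_contacts is an iterable of (x, y) contact points on the back shell.
    """
    touched = set()
    for x, y in back_contacts:
        for name, (x0, y0, x1, y1) in PRESET_TARGET_REGIONS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                touched.add(name)
    return touched
```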
In the embodiment of the present disclosure, the processing module 105 determines the region length of the action region on the display surface from the first detection information and determines from the second detection information whether the force acting on the back shell 101 is located in a preset target region. When the region length of the action region is greater than the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell 101 is not located in the preset target region, it is determined that the terminal is held in portrait. It should be noted that the force acting on the back shell 101 being located in a preset target region can be understood as a touch being detected in that preset target region.
In addition, in the embodiment of the present disclosure, the length threshold is preset and may be adjusted according to actual conditions. For example, the length threshold may be a statistical value obtained by analyzing detection data collected from a large number of users holding terminals; alternatively, it may be derived from the historical operation of the terminal's own user, for example by presetting a value obtained from large-scale user statistics and later updating it with a value obtained from the history of the user to whom the terminal belongs.
Fig. 8 is a diagram illustrating determination of the action-region lengths on the display surface. As shown in Fig. 8, after the lengths of the action regions on the display screen 102 are determined as L1 and L2, L1 and L2 can be compared with L_threshold, where L_threshold is the length threshold.
In the disclosed embodiments, when both L1 and L2 are greater than L_threshold, the action region and the preset target region are located on the same side of the terminal, and no contact is detected in the preset target region, the terminal can be determined to be held in portrait.
Fig. 9 is an exemplary diagram of a mobile phone held in portrait. As shown in Fig. 9, when the phone is held in portrait, the palm contacts the long side of the front of the phone over a long distance and does not easily touch the short side of the back. Therefore, when the phone detects that the lengths L1 and L2 of the action regions on the display screen are both greater than the length threshold, the action regions are close to region D, and there is no contact on regions A, B, C and D on the back of the phone, the terminal is determined to be held in portrait.
In the embodiment of the present disclosure, when the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell 101 is located in the preset target region, the terminal is determined to be held in landscape.
Illustratively, when both L1 and L2 are less than L_threshold, the action region and the preset target region are located on the same side of the terminal, and contact is detected in the preset target region, the terminal can be determined to be held in landscape.
Fig. 10 is an exemplary diagram of a mobile phone held in landscape. As shown in Fig. 10, when the phone is held in landscape, the hand is generally C-shaped: the user's hand contacts the long side of the front of the phone over a short distance and touches the short side of the back. Therefore, when the phone detects that the lengths L1 and L2 of the action regions on the display screen are both smaller than the threshold, the action regions are close to region D, and contact exists on region D, the phone is determined to be held in landscape.
It should be noted that Figs. 9 and 10 above both take the D end as an example; the detection manner for a grip close to the A end is similar to that described for Figs. 9 and 10 and is not detailed again in the present disclosure.
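The rule described for Figs. 8 to 10 can be condensed into the sketch below: portrait when both edge-contact lengths exceed the threshold and no back contact lies in the target region on the same end, landscape when both lengths are at or below the threshold and such back contact exists. The function signature, the threshold value and the None return for the undecided case are assumptions made for illustration.

```python
L_THRESHOLD = 55.0  # mm; placeholder value, the disclosure leaves this to tuning

def classify_grip(l1, l2, back_touched, same_side_regions):
    """Classify the grip from front-edge contact lengths and back contacts.

    l1, l2            : lengths of the two action regions on the display edge
    back_touched      : set of preset target regions with detected back contact
    same_side_regions : preset target regions on the same end as the action regions
    """
    same_side_touch = bool(back_touched & same_side_regions)
    if l1 > L_THRESHOLD and l2 > L_THRESHOLD and not same_side_touch:
        return "portrait"   # long palm contact on the front, back left untouched
    if l1 <= L_THRESHOLD and l2 <= L_THRESHOLD and same_side_touch:
        return "landscape"  # C-shaped grip: short front contact, back contact
    return None             # undecided; fall back to the gravity sensor
```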
In one embodiment, the sensor 104 is a Sar sensor, and the terminal further includes:
an antenna 106, located between the display screen 102 and the back shell 101, covering the back shell 101;
the Sar sensor is configured to obtain the second detection information through the antenna 106.
In an embodiment of the present disclosure, the sensor 104 is a specific absorption rate (Sar) sensor. The Sar sensor recognizes both the approach and the touch of an object by detecting the change in capacitance between the antenna 106 and the human hand.
For example, when the Sar sensor detects no change in capacitance through the antenna 106, it generates second detection information indicating that no object approaches or touches the area to which the antenna 106 is connected; when the change in capacitance detected through the antenna 106 falls within a first range, it generates second detection information indicating that an object approaches the area to which the antenna 106 is connected; and when the change falls within a second range, it generates second detection information indicating that an object touches that area. In the embodiment of the present disclosure, when no object approaches or touches the area to which the antenna 106 is connected, or an object approaches but does not touch it, the second detection information may be identified by the digit "0"; when an object touches the area to which the antenna 106 is connected, the second detection information may be identified by the digit "1".
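The mapping from the capacitance change to the "0"/"1" second detection information described above might look like the following; the two ranges are illustrative assumptions, since the actual values are device-specific and not given in the text.

```python
# Hypothetical capacitance-change ranges (in pF) for one Sar channel.
APPROACH_RANGE = (0.2, 1.0)   # object near the antenna but not touching
TOUCH_RANGE = (1.0, 10.0)     # object touching the region the antenna serves

def second_detection_info(delta_capacitance):
    """Map a capacitance change to the '0'/'1' code described above."""
    lo, hi = TOUCH_RANGE
    if lo <= delta_capacitance <= hi:
        return 1  # touch detected on the region connected to this antenna
    return 0      # no object, or an object approaching without touching
```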
In one embodiment, at least two of the sensors 104 are included in the terminal.
In the embodiment of the present disclosure, the terminal may include a plurality of sensors 104 in order to simplify the wiring inside the accommodating space formed by the back shell 101 and the display screen 102 and to improve the responsiveness of detecting forces acting on the different regions of the back shell 101.
Fig. 11 is an exemplary diagram of the layout of Sar sensors and antennas in a mobile phone. As shown in Fig. 11, the phone contains two Sar sensors, Sar1 and Sar2. Antennas on the back shell are connected to the four regions A, B, C and D respectively: Sar1 monitors contact events on regions A, B and C through the antennas arranged on the back shell, and Sar2 monitors contact events on region D through the antenna arranged on the back shell.
In one implementation, the terminal further includes:
the gravity sensor 107 is positioned below the display screen 102 and used for detecting the gravity data of the terminal;
the processing module 105 is connected to the gravity sensor 107, and configured to receive the gravity data, and determine a holding posture of the terminal according to the gravity data when the holding posture of the terminal cannot be determined according to the first detection information and the second detection information.
In the embodiment of the present disclosure, the terminal further includes a gravity sensor 107 capable of detecting gravity data, and when the processing module 105 cannot determine the holding posture of the terminal according to the first detection information of the display screen 102 and the second detection information of the back shell 101, the holding posture of the terminal can be determined according to the gravity data. The principle of determining the gripping posture based on the gravity sensor is as described above, and will not be described herein.
Furthermore, it should be noted that, in the embodiment of the present disclosure, the terminal may periodically obtain the first detection information, the second detection information and the gravity data at a time interval, determine the holding state of the terminal based on the first detection information and the second detection information, and also determine the holding state of the terminal based on the gravity data. When the holding state obtained from the first detection information and the second detection information differs from the holding state obtained from the gravity data, the result based on the first detection information and the second detection information prevails. For example, when the holding state determined from the first detection information and the second detection information is D_sar and the holding state determined from the gravity data is D_acc, then if D_sar and D_acc differ, D_sar is taken as the true result.
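A minimal sketch of the precedence rule just described, assuming both results are expressed as "portrait", "landscape" or None for an undecided case (the function and value names are illustrative):

```python
def fuse_results(d_sar, d_acc):
    """Combine the touch/Sar-based result D_sar with the gravity-based D_acc."""
    if d_sar is not None:
        return d_sar   # per the description above, D_sar prevails when they differ
    return d_acc       # gravity data fills in only when D_sar is undecided
```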
Fig. 12 is a flowchart of a holding gesture recognition method according to an embodiment of the disclosure, where the method is applied to the terminal shown in fig. 1, and as shown in fig. 12, the method includes:
S11, acquiring first detection information, detected by the touch screen, of a force acting on the display surface;
S12, acquiring second detection information, detected by the sensor, of a force acting on the back shell;
S13, determining the holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture includes: landscape holding and/or portrait holding.
In the embodiments of the present disclosure, the terminal may be a mobile phone, a tablet computer, a wearable device, or the like.
The terminal obtains first detection information of the acting force acting on the display surface through the touch screen, and the first detection information can comprise coordinate information and force information of the acting force. The terminal can also acquire second detection information of the acting force acting on the back shell through the sensor. The second detection information may also be coordinate information, or other information characterizing the action position of the acting force, for example, a "0" characterizing the acting force acts on the preset target area, and a "1" characterizing the acting force acts on the non-preset target area.
In the embodiment of the disclosure, if the display surface of the touch screen is referred to as the "front surface", the surface on which the back shell is located may be referred to as the "back surface". The terminal can determine its holding posture from the acquired first detection information and second detection information; the holding posture includes landscape holding and portrait holding.
It can be understood that the present disclosure does not rely only on touch screen data (the first detection information detected by the touch panel); a sensor in the terminal also detects the force acting on the back shell to obtain second detection information, and the holding posture of the terminal is determined from the first detection information and the second detection information together. On the one hand, this adds a data source, so the holding posture is obtained by comprehensive analysis, misjudgments caused by relying on touch screen data alone are reduced, and the accuracy of the determination is improved. On the other hand, the required contact area on the touch screen is reduced, and the scheme also suits scenarios in which the user uses the phone while lying on the side or on the back, so the disclosed scheme is more widely applicable and the user experience is improved.
In the embodiment of the disclosure, the terminal may control the touch screen to display in a display mode matching the determined holding posture, for example by switching between landscape display and portrait display. In addition, when a motion-sensing game is played on the terminal, corresponding game controls can be performed based on the determined holding posture.
In one implementation, the first detection information includes: information of a first position of a force acting on the display surface; the second detection information includes: information characterizing a second position of a force acting on the back shell;
step S13, including:
determining an acting area acting on the display surface according to the information of the first position, and determining whether the acting force acting on the back shell is positioned in a preset target area according to the information representing the second position; and determining the holding posture of the terminal according to the action area and the preset target area.
In an embodiment, the determining the holding posture of the terminal according to the action region and the preset target region includes:
determining that the terminal is held in portrait if the region length of the action region is greater than a length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is not located in the preset target region;
determining that the terminal is held in landscape if the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is located in the preset target region.
In an embodiment of the present disclosure, the first detection information includes information of a first position of the force acting on the display surface. Since a terminal is usually gripped around its edges, the first position may be a position on the boundary of the touch screen. The second detection information includes information characterizing a second position of the force acting on the back shell. As described above, "0" may indicate that the force on the back shell lies within a preset target region and "1" that it lies outside the preset target regions. Since a held terminal typically makes contact with the user along the sides of the back shell, in one embodiment the preset target regions are distributed on the sides of the back shell.
The detailed description of the present disclosure for determining the holding state according to the acting region on the display surface and according to whether the acting force acting on the back shell is located in the preset target region is the same as the previous description of fig. 5 to 10, and is not repeated here.
In an embodiment of the present disclosure, the acquiring the first detection information, detected by the touch screen, of the force acting on the display surface and the acquiring the second detection information, detected by the sensor, of the force acting on the back shell include:
acquiring the first detection information and the second detection information periodically at a time interval.
In the embodiment of the disclosure, considering that the holding state of the terminal does not change greatly from moment to moment, the terminal may acquire the first detection information and the second detection information periodically at a time interval and determine the holding state accordingly, so as to save power.
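A minimal sketch of the periodic acquisition described above; the polling interval, the callable names and the callback are assumptions, not details from the disclosure.

```python
import time

def grip_monitor(read_first_info, read_second_info, classify, on_grip_change,
                 interval_s=0.5):
    """Poll both detection sources at a fixed interval and report grip changes.

    read_first_info / read_second_info are hypothetical callables wrapping the
    touch panel and the back-shell sensor; classify turns the two pieces of
    detection information into "portrait", "landscape" or None.
    """
    last_grip = None
    while True:
        first = read_first_info()    # force positions on the display surface
        second = read_second_info()  # back-shell contact information
        grip = classify(first, second)
        if grip is not None and grip != last_grip:
            on_grip_change(grip)     # e.g. switch the display mode
            last_grip = grip
        time.sleep(interval_s)
```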
In one embodiment, the terminal further includes a gravity sensor therein, and the method further includes:
acquiring gravity data detected by the gravity sensor;
and if the holding posture of the terminal cannot be determined according to the first detection information and the second detection information, determining the holding posture of the terminal according to the gravity data.
In the embodiment of the disclosure, when the holding posture of the terminal cannot be determined according to the first detection information and the second detection information, the holding posture of the terminal can be further determined through the gravity data. When the terminal determines the holding state of the terminal based on the first detection information and the second detection information, the holding state of the terminal may be determined based on the gravity data in synchronization. When the result of the holding state obtained based on the first detection information and the second detection information is different from the result of the holding state obtained based on the gravity data, the result based on the first detection information and the second detection information is still used as a criterion.
Fig. 13 is a diagram illustrating a grip posture identifying apparatus according to an exemplary embodiment. Referring to fig. 13, the apparatus is applied to a terminal, where the terminal includes a touch screen and a back shell, the touch screen is installed in the back shell and exposes a display surface of the touch screen outside the back shell, and the apparatus includes:
a first obtaining module 101 configured to obtain first detection information, detected by the touch screen, of a force acting on the display surface;
a second obtaining module 102 configured to obtain second detection information, detected by the sensor, of a force acting on the back shell;
a determining module 103 configured to determine the holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture includes: landscape holding and/or portrait holding.
Optionally, the first detection information includes: information of a first position of a force acting on the display surface; the second detection information includes: information characterizing a second position of a force acting on the back shell;
the determining module 103 is specifically configured to determine an action region on the display surface according to the information of the first position, determine whether the force acting on the back shell is located in a preset target region according to the information characterizing the second position, and determine the holding posture of the terminal according to the action region and the preset target region.
Optionally, the determining module 103 is specifically configured to determine that the terminal is held in portrait if the region length of the action region is greater than a length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is not located in the preset target region; and to determine that the terminal is held in landscape if the region length of the action region is less than or equal to the length threshold, the action region and the preset target region are located on the same side of the terminal, and the force acting on the back shell is located in the preset target region.
Optionally, the first obtaining module 101 is specifically configured to obtain the first detection information at time intervals; the second obtaining module 102 is specifically configured to obtain the second detection information at time intervals.
Optionally, the terminal further includes a gravity sensor, and the apparatus further includes:
a third obtaining module 104 configured to obtain gravity data detected by the gravity sensor;
the determining module 103 is further configured to determine the holding posture of the terminal according to the gravity data if the holding posture of the terminal cannot be determined according to the first detection information and the second detection information.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 14 is a block diagram illustrating a terminal apparatus 800 according to an example embodiment. For example, the device 800 may be a cell phone, tablet computer, or the like.
Referring to fig. 14, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in the position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer-readable storage medium in which instructions, when executed by a processor of a terminal, enable the terminal to perform a holding posture recognition method, the terminal including a touch screen and a back shell, the touch screen being mounted within the back shell with a display surface of the touch screen exposed outside the back shell, the method comprising:
acquiring first detection information of an acting force acting on the display surface, detected by the touch screen;
acquiring second detection information of an acting force acting on the back shell, detected by a sensor;
determining the holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture comprises: horizontal screen holding and/or vertical screen holding.
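To make these three steps concrete, the sketch below combines them with the area-length and preset-target-area criteria that appear in the claims; the data model, the threshold value, and the simplified "same side" test are assumptions of the sketch, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Minimal, self-contained sketch of the two-source decision described above,
# combining the acquisition steps with the area-length / preset-target-area
# criteria that appear in the claims. The data model, threshold value, and the
# "same side" test are illustrative assumptions, not the patent's implementation.

LENGTH_THRESHOLD_MM = 60.0  # assumed value for the length threshold


@dataclass
class ScreenForce:
    """First detection information: force on the display surface (touch screen)."""
    side: str                # side of the terminal the contact hugs, e.g. "left"
    region_length_mm: float  # area length of the action area


@dataclass
class BackForce:
    """Second detection information: force on the back shell (sensor)."""
    in_target_area: bool     # whether the force falls inside the preset target area
    target_side: str         # side of the terminal the preset target area is on


def determine_holding_posture(screen: ScreenForce, back: BackForce) -> Optional[str]:
    """Return the holding posture, or None if it cannot be determined
    (a gravity-data fallback, as described earlier, would then apply)."""
    if screen.side != back.target_side:
        return None
    if screen.region_length_mm > LENGTH_THRESHOLD_MM and not back.in_target_area:
        return "vertical screen holding"    # portrait grip
    if screen.region_length_mm <= LENGTH_THRESHOLD_MM and back.in_target_area:
        return "horizontal screen holding"  # landscape grip
    return None


# Example: a long palm contact on the left edge with no force inside the
# back-shell target area is classified as vertical screen holding (portrait).
print(determine_holding_posture(ScreenForce("left", 85.0), BackForce(False, "left")))
```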
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. A terminal, comprising:
a back shell;
a display screen arranged in the back shell, a display surface of the display screen being exposed outside the back shell;
a touch panel stacked with the display screen and configured to detect an acting force acting on the display surface to obtain first detection information;
a sensor positioned between the display screen and the back shell and configured to detect an acting force acting on the back shell to obtain second detection information;
a processing module connected with the touch panel and the sensor and configured to receive the first detection information and the second detection information and determine a holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture comprises: horizontal screen holding and/or vertical screen holding.
2. The terminal of claim 1, wherein the first detection information comprises: information of a first position of the acting force acting on the display surface; and the second detection information comprises: information characterizing a second position of the acting force acting on the back shell;
the processing module is configured to determine an action area on the display surface according to the information of the first position, determine whether the acting force acting on the back shell is located in a preset target area according to the information characterizing the second position, and determine the holding posture of the terminal according to the action area and the preset target area.
3. The terminal of claim 2, wherein
the processing module is configured to: determine that the terminal is held in a vertical screen mode when the area length of the action area is greater than a length threshold, the action area and the preset target area are located on the same side of the terminal, and the acting force acting on the back shell is not located in the preset target area; and/or determine that the terminal is held in a horizontal screen mode when the area length of the action area is less than or equal to the length threshold, the action area and the preset target area are located on the same side of the terminal, and the acting force acting on the back shell is located in the preset target area.
4. The terminal of claim 2, wherein the preset target area is distributed on the back shell or on a side connecting the back shell and the display screen.
5. The terminal according to claim 1, further comprising:
a gravity sensor positioned below the display screen and configured to detect gravity data of the terminal;
wherein the processing module is connected with the gravity sensor and configured to receive the gravity data and determine the holding posture of the terminal according to the gravity data when the holding posture of the terminal cannot be determined according to the first detection information and the second detection information.
6. The terminal of claim 1, wherein the sensor is a SAR sensor, and the terminal further comprises:
an antenna positioned between the display screen and the back shell and covering the back shell;
wherein the SAR sensor is configured to obtain the second detection information through the antenna.
7. A holding posture recognition method of a terminal, applied to a terminal comprising a touch screen and a back shell, wherein the touch screen is installed in the back shell and a display surface of the touch screen is exposed outside the back shell, the method comprising:
acquiring first detection information of an acting force acting on the display surface, detected by the touch screen;
acquiring second detection information of an acting force acting on the back shell, detected by a sensor;
determining a holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture comprises: horizontal screen holding and/or vertical screen holding.
8. The method of claim 7, wherein the first detection information comprises: information of a first position of the acting force acting on the display surface; and the second detection information comprises: information characterizing a second position of the acting force acting on the back shell;
determining the holding posture of the terminal according to the first detection information and the second detection information, including:
determining an action area on the display surface according to the information of the first position, and determining whether the acting force acting on the back shell is located in a preset target area according to the information characterizing the second position;
and determining the holding posture of the terminal according to the action area and the preset target area.
9. The method according to claim 8, wherein the determining the holding posture of the terminal according to the action area and the preset target area comprises:
if the area length of the action area is greater than a length threshold, the action area and the preset target area are located on the same side of the terminal, and the acting force acting on the back shell is not located in the preset target area, determining that the terminal is held in a vertical screen mode;
if the area length of the action area is less than or equal to the length threshold, the action area and the preset target area are located on the same side of the terminal, and the acting force acting on the back shell is located in the preset target area, determining that the terminal is held in a horizontal screen mode.
10. The method according to claim 7, wherein acquiring the first detection information of the acting force acting on the display surface detected by the touch screen and acquiring the second detection information of the acting force acting on the back shell detected by the sensor comprise:
acquiring the first detection information and the second detection information according to a time interval.
11. The method of claim 7, wherein the terminal further comprises a gravity sensor, and the method further comprises:
acquiring gravity data detected by the gravity sensor;
and if the holding posture of the terminal cannot be determined according to the first detection information and the second detection information, determining the holding posture of the terminal according to the gravity data.
12. A holding posture recognition apparatus of a terminal, applied to a terminal comprising a touch screen and a back shell, wherein the touch screen is installed in the back shell and a display surface of the touch screen is exposed outside the back shell, the apparatus comprising:
a first acquisition module configured to acquire first detection information of an acting force acting on the display surface, detected by the touch screen;
a second acquisition module configured to acquire second detection information of an acting force acting on the back shell, detected by a sensor;
a determining module configured to determine a holding posture of the terminal according to the first detection information and the second detection information; wherein the holding posture comprises: horizontal screen holding and/or vertical screen holding.
13. A terminal, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the holding posture recognition method of the terminal according to any one of claims 7 to 11.
14. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the holding posture recognition method of a terminal according to any one of claims 7 to 11.
CN202110204737.5A 2021-02-23 2021-02-23 Terminal, holding posture recognition method and device of terminal, and storage medium Pending CN112799582A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110204737.5A CN112799582A (en) 2021-02-23 2021-02-23 Terminal, holding posture recognition method and device of terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110204737.5A CN112799582A (en) 2021-02-23 2021-02-23 Terminal, holding posture recognition method and device of terminal, and storage medium

Publications (1)

Publication Number Publication Date
CN112799582A true CN112799582A (en) 2021-05-14

Family

ID=75815577

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110204737.5A Pending CN112799582A (en) 2021-02-23 2021-02-23 Terminal, holding posture recognition method and device of terminal, and storage medium

Country Status (1)

Country Link
CN (1) CN112799582A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105867787A (en) * 2016-03-25 2016-08-17 青岛海信移动通信技术股份有限公司 Method and device for detecting mode of holding terminal by user and terminal
CN106572207A (en) * 2016-10-31 2017-04-19 努比亚技术有限公司 Terminal single hand mode identification device and method
CN106843672A (en) * 2016-12-20 2017-06-13 努比亚技术有限公司 A kind of terminal screen locking operation device and method
CN108958484A (en) * 2018-06-29 2018-12-07 努比亚技术有限公司 A kind of control method for screen display, terminal and computer readable storage medium
CN110851067A (en) * 2019-10-29 2020-02-28 华为技术有限公司 Screen display mode switching method and device and electronic equipment
CN111328132A (en) * 2020-02-25 2020-06-23 维沃移动通信有限公司 Method for adjusting transmitting power and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116016760A (en) * 2021-10-21 2023-04-25 北京小米移动软件有限公司 Mobile terminal charging control method, device, equipment and medium
CN114356153A (en) * 2021-12-30 2022-04-15 广东明创软件科技有限公司 Control method, control device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
KR102171616B1 (en) Method and device for preventing incorrect contact of terminal
EP3179711B1 (en) Method and apparatus for preventing photograph from being shielded
US20160202834A1 (en) Unlocking method and terminal device using the same
US20180039332A1 (en) Terminal and touch response method and device
US20170123587A1 (en) Method and device for preventing accidental touch of terminal with touch screen
CN103533154A (en) Mobile terminal and a voice recognition method
CN106873834B (en) Method and device for identifying triggering of key and mobile terminal
EP3046042A1 (en) Apparatus for implementing home button and fingerprint identification on single sensor
CN106484284B (en) Method and device for switching single-hand mode
EP3444752A2 (en) Fingerprint recognition process
US10248855B2 (en) Method and apparatus for identifying gesture
EP3136206B1 (en) Method and apparatus for setting threshold
EP3232301B1 (en) Mobile terminal and virtual key processing method
CN110262692B (en) Touch screen scanning method, device and medium
KR20170132833A (en) Pressure detection method, apparatus, program, and recording medium
KR20170038178A (en) Method, apparatus, and mobile terminal for identificating fingerprint
US20180238748A1 (en) Pressure detection method and apparatus, and storage medium
US10885298B2 (en) Method and device for optical fingerprint recognition, and computer-readable storage medium
CN110875769A (en) Wireless communication device and antenna switching method
CN112799582A (en) Terminal, holding posture recognition method and device of terminal, and storage medium
US20220300141A1 (en) Detection method, device, and electronic equipment
CN107239184B (en) Touch screen touch device and method and mobile terminal
CN106778169B (en) Fingerprint unlocking method and device
CN113010127A (en) Display switching method and device, mobile terminal and storage medium
CN107688765B (en) Fingerprint acquisition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination