CN111756907A - Operation execution method, device and storage medium - Google Patents

Operation execution method, device and storage medium

Info

Publication number
CN111756907A
Authority
CN
China
Prior art keywords
display screen
user
key
target
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910234591.1A
Other languages
Chinese (zh)
Inventor
陈朝喜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201910234591.1A priority Critical patent/CN111756907A/en
Publication of CN111756907A publication Critical patent/CN111756907A/en
Pending legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/02 - Constructional features of telephone sets
    • H04M1/23 - Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof
    • H04M1/236 - Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof including keys on side or rear faces
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/02 - Constructional features of telephone sets
    • H04M1/0202 - Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0279 - Improving the user comfort or ergonomics
    • H04M1/0281 - Improving the user comfort or ergonomics for providing single handed use or left/right hand conversion
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/02 - Constructional features of telephone sets
    • H04M1/0202 - Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 - Details of the structure or mounting of specific components
    • H04M1/0266 - Details of the structure or mounting of specific components for a display module assembly

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the disclosure disclose an operation execution method, an operation execution device, and a storage medium. The method comprises the following steps: acquiring state information of the terminal, wherein the state information is used for indicating the screen state of the terminal; determining a target display screen the user interacts with according to the state information, wherein the target display screen is a first display screen or a second display screen; determining a target key for responding to the user's thumb operation according to the target display screen and left-right hand setting information, wherein the target key is a first key or a second key, and the left-right hand setting information is used for indicating whether the user is a right-hand operation user or a left-hand operation user; receiving an operation signal corresponding to the target key; and executing the operation corresponding to the operation signal. With the technical solution provided by this disclosure, different keys can be used to control the terminal when the user interacts with it through different display screens; whichever display screen is in use, the user can operate the keys with the same holding habit, so operation is convenient.

Description

Operation execution method, device and storage medium
Technical Field
The disclosed embodiments relate to the field of terminal technologies, and in particular, to an operation execution method, an operation execution device, and a storage medium.
Background
Mobile terminals such as mobile phones are generally provided with a power key, through which screen-off/screen-on control, power-on/power-off control, and the like can be implemented.
In the related art, the mobile terminal has a power key, and the power key is disposed on a side frame (i.e., a middle frame) of the mobile terminal. For example, the power key is disposed on a right side frame of the mobile terminal in a state where a front panel of the mobile terminal is directed upward.
For a mobile terminal with a double-sided screen, that is, a mobile terminal with a display screen on the front panel and another on the back panel, if the power key is arranged on the right side frame and the user is accustomed to holding the terminal in the right hand, the user can easily touch the power key with the thumb when the front panel faces upward.
However, when the user turns the mobile terminal over to use the display screen on the back panel, the power key ends up on the opposite side, and the user can no longer operate it according to the previous holding habit, which affects operation convenience.
Disclosure of Invention
The embodiment of the disclosure provides an operation execution method, an operation execution device and a storage medium. The technical scheme is as follows:
according to a first aspect of the embodiments of the present disclosure, an operation execution method is provided, which is applied to a terminal, where the terminal includes a first display screen and a second display screen, the first display screen is disposed on a front panel of the terminal, and the second display screen is disposed on a back panel of the terminal; the method comprises the following steps:
acquiring state information of the terminal, wherein the state information is used for indicating the screen state of the terminal;
determining a target display screen interacted by a user according to the state information, wherein the target display screen is the first display screen or the second display screen;
determining a target key for responding to the thumb operation of the user according to the target display screen and the left-right hand setting information, wherein the target key is a first key or a second key; the first key is located on the right side frame of the terminal, the second key is located on the left side frame of the terminal, and the left-hand and right-hand setting information is used for indicating whether the user is a right-hand operation user or a left-hand operation user in the state that the front panel faces upwards;
receiving an operation signal corresponding to the target key;
and executing the operation corresponding to the operation signal.
Optionally, the determining a target key for responding to the thumb operation of the user according to the target display screen and the left-right hand setting information includes:
if the target display screen is the first display screen and the left-hand and right-hand setting information is used for indicating that the user is the right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key;
if the target display screen is the first display screen and the left-right hand setting information is used for indicating that the user is the left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
if the target display screen is the second display screen and the left-hand and right-hand setting information is used for indicating that the user is the right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
and if the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is the left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key.
Optionally, the performing an operation corresponding to the operation signal includes:
if the operation signal is a first operation signal, executing screen-off operation or screen-on operation;
or,
if the operation signal is a second operation signal, executing a starting operation or a shutdown operation;
or,
and if the operation signal is the third operation signal, starting an AI (Artificial Intelligence) function.
Optionally, the state information includes acceleration data acquired by an acceleration sensor of the terminal;
the determining a target display screen of user interaction according to the state information comprises:
if the acceleration data in the gravity direction belongs to a first value range, determining that the target display screen interacted by the user is the first display screen;
and if the acceleration data in the gravity direction belongs to a second value range, determining that the target display screen interacted by the user is the second display screen.
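As an illustrative sketch outside the patent text, the acceleration-based determination above could look like the following Python. The axis convention (z-axis pointing out of the front panel) and both value ranges are assumptions chosen for the example; the disclosure does not fix concrete thresholds.

```python
def screen_from_acceleration(accel_z: float) -> str:
    """Pick the target display screen from the acceleration along gravity.

    Assumed convention: the z-axis points out of the front panel, so a
    reading near +9.8 m/s^2 means the front panel faces up (first screen
    in use) and a reading near -9.8 m/s^2 means the back panel faces up
    (second screen in use). The ranges are illustrative only.
    """
    FIRST_RANGE = (5.0, 15.0)     # hypothetical first value range
    SECOND_RANGE = (-15.0, -5.0)  # hypothetical second value range
    if FIRST_RANGE[0] <= accel_z <= FIRST_RANGE[1]:
        return "first"
    if SECOND_RANGE[0] <= accel_z <= SECOND_RANGE[1]:
        return "second"
    return "unknown"  # e.g. terminal held roughly vertical
```

A terminal held flat with the front panel up would report roughly +9.8 m/s^2 on such an axis and map to the first display screen.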
Optionally, the state information includes touch information, and the touch information includes a first touch area on the first display screen and a second touch area on the second display screen;
the determining a target display screen of user interaction according to the state information comprises:
if the first touch area is smaller than the second touch area, determining that the target display screen interacted by the user is the first display screen;
and if the first touch area is larger than the second touch area, determining that the target display screen interacted by the user is the second display screen.
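The touch-area comparison above admits a very small sketch (illustrative only; the function name and the handling of equal areas are assumptions not taken from the disclosure):

```python
def screen_from_touch_area(first_area: float, second_area: float) -> str:
    """Pick the target screen by comparing touch areas on the two screens.

    The screen resting against the palm and fingers (the one facing away
    from the user) accumulates the larger touch area, so the screen with
    the smaller touch area is the one the user is interacting with.
    """
    if first_area < second_area:
        return "first"
    if first_area > second_area:
        return "second"
    return "unknown"  # equal areas: undetermined by this heuristic
```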
Optionally, the state information includes holding information, and the holding information includes the number of touch points on the right side frame and the number of touch points on the left side frame;
the determining a target display screen of user interaction according to the state information comprises:
if the user is the right-handed operation user, determining that the target display screen interacted by the user is the first display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the second display screen;
if the user is the left-handed operation user, determining that the target display screen interacted by the user is the second display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; and when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the first display screen.
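The grip-based determination above combines the frame touch-point counts with the handedness setting; a minimal sketch (names and the tie-breaking behavior are assumptions for illustration):

```python
def screen_from_grip(right_points: int, left_points: int,
                     right_handed: bool) -> str:
    """Pick the target screen from frame touch-point counts and handedness.

    A right hand gripping the terminal with the front (first) screen
    facing the user puts the thumb on the right frame (fewer touch points)
    and four fingers on the left frame (more touch points); flipping the
    terminal swaps the counts. A left-handed grip mirrors this.
    """
    if right_points == left_points:
        return "unknown"  # counts equal: undetermined
    fewer_on_right = right_points < left_points
    if right_handed:
        return "first" if fewer_on_right else "second"
    return "second" if fewer_on_right else "first"
```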
According to a second aspect of the embodiments of the present disclosure, there is provided an operation execution device, which is applied to a terminal, where the terminal includes a first display screen and a second display screen, the first display screen is disposed on a front panel of the terminal, and the second display screen is disposed on a back panel of the terminal; the device comprises:
an information acquisition module configured to acquire state information of the terminal; the state information of the terminal is used for indicating the screen state of the terminal;
a display screen determination module configured to determine a target display screen for user interaction according to the state information, the target display screen being the first display screen or the second display screen;
a key determination module configured to determine a target key for responding to a thumb operation of the user according to the target display screen and the left-right hand setting information, the target key being a first key or a second key; the first key is located on the right side frame of the terminal, the second key is located on the left side frame of the terminal, and the left-hand and right-hand setting information is used for indicating whether the user is a right-hand operation user or a left-hand operation user in the state that the front panel faces upwards;
a signal receiving module configured to receive an operation signal corresponding to the target key;
an operation execution module configured to execute an operation corresponding to the operation signal.
Optionally, the key determining module is configured to:
when the target display screen is the first display screen and the left-hand and right-hand setting information is used for indicating that the user is the right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key;
when the target display screen is the first display screen and the left-right hand setting information is used for indicating that the user is the left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
when the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is a right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
and when the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is a left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key.
Optionally, the operation execution module is configured to:
when the operation signal is a first operation signal, executing screen-off operation or screen-on operation;
or,
when the operation signal is a second operation signal, executing a starting operation or a shutdown operation;
or,
and when the operation signal is a third operation signal, starting an AI function.
Optionally, the state information includes acceleration data acquired by an acceleration sensor of the terminal;
the display screen determination module configured to:
when the acceleration data in the gravity direction belongs to a first value range, determining the target display screen interacted by the user as the first display screen;
and when the acceleration data in the gravity direction belongs to a second value range, determining that the target display screen interacted by the user is the second display screen.
Optionally, the state information includes touch information, and the touch information includes a first touch area on the first display screen and a second touch area on the second display screen;
the display screen determination module configured to:
when the first touch area is smaller than the second touch area, determining that the target display screen interacted by the user is the first display screen;
and when the first touch area is larger than the second touch area, determining that the target display screen interacted by the user is the second display screen.
Optionally, the state information includes holding information, and the holding information includes the number of touch points on the right side frame and the number of touch points on the left side frame;
the display screen determination module configured to:
if the user is the right-handed operation user, determining that the target display screen interacted by the user is the first display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the second display screen;
if the user is the left-handed operation user, determining that the target display screen interacted by the user is the second display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; and when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the first display screen.
According to a third aspect of the embodiments of the present disclosure, there is provided an operation execution device, which is applied to a terminal, where the terminal includes a first display screen and a second display screen, the first display screen is disposed on a front panel of the terminal, and the second display screen is disposed on a back panel of the terminal;
the device comprises:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to:
acquiring state information of the terminal; the state information of the terminal is used for indicating the screen state of the terminal;
determining a target display screen interacted by a user according to the state information, wherein the target display screen is the first display screen or the second display screen;
determining a target key for responding to the thumb operation of the user according to the target display screen and the left-right hand setting information, wherein the target key is a first key or a second key; the first key is located on the right side frame of the terminal, the second key is located on the left side frame of the terminal, and the left-hand and right-hand setting information is used for indicating whether the user is a right-hand operation user or a left-hand operation user in the state that the front panel faces upwards;
receiving an operation signal corresponding to the target key;
and executing the operation corresponding to the operation signal.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method according to the first aspect.
The beneficial effects brought by the technical scheme provided by the embodiment of the disclosure can include:
the terminal can determine a target key operated by a thumb of a user in the two keys according to a display screen interacted by the user and left and right hand setting information, and executes an operation corresponding to an operation signal when the terminal receives the operation signal corresponding to the target key. Because double-sided screen terminal has two buttons, when the user used different display screens to interact with the terminal, can use different buttons to control the terminal, no matter which display screen was used, the user all can use unified use habit of gripping to operate the button, convenient operation, the convenience is high.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram illustrating a terminal with a front panel facing upward in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a terminal with a back plate up in accordance with an exemplary embodiment;
FIG. 3 is a flowchart illustrating a method of operation execution in accordance with an exemplary embodiment;
fig. 4 is a schematic diagram illustrating a right-hand grip terminal according to an example embodiment;
FIG. 5 is a schematic diagram illustrating a left-handed holding terminal according to an exemplary embodiment;
fig. 6 is a schematic diagram illustrating a right-hand grip terminal according to another exemplary embodiment;
FIG. 7 is a schematic illustration of a left-handed holding terminal according to another exemplary embodiment;
FIGS. 8 and 9 are schematic diagrams of coordinate systems shown in accordance with exemplary embodiments;
FIG. 10 is a block diagram illustrating an operation performing apparatus in accordance with an exemplary embodiment;
fig. 11 is a block diagram illustrating an operation performing apparatus according to another exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the method provided by the embodiment of the disclosure, the execution subject of each step may be a terminal. The terminal may be a mobile terminal such as a mobile phone, a tablet computer, an e-book reader, a multimedia playing device, a wearable device, and the like.
Referring to fig. 1 and 2 together, a schematic diagram of a terminal 10 is exemplarily shown. The terminal 10 has a double-sided screen, which includes a first display screen 11 and a second display screen 12. The first display screen 11 is disposed on the front panel of the terminal 10, and the second display screen 12 is disposed on the back panel of the terminal 10. The first display screen 11 and the second display screen 12 can display content independently, and both provide a human-computer interaction function.
In one example, both the first display screen 11 and the second display screen 12 can be designed as full screens, i.e., the screen-to-body ratio of the first display screen 11 on the front panel is equal to or close to 100%, and the screen-to-body ratio of the second display screen 12 on the back panel is likewise equal to or close to 100%.
In another example, the first display screen 11 may be designed as a full screen and the second display screen 12 as a non-full screen, i.e., the second display screen 12 is smaller than the first display screen 11. Optionally, the back panel of the terminal 10 may also be provided with a camera, a light sensor, and other functional components.
Of course, the above descriptions of the first display screen 11 and the second display screen 12 are only exemplary and explanatory, and the disclosed embodiment does not limit the configuration and size of the first display screen 11 and the second display screen 12.
As shown in fig. 1 and 2, a first key 13 and a second key 14 are provided on a side frame of the terminal 10. In the state where the front panel is upward, the first key 13 is located on the right side frame of the terminal, and the second key 14 is located on the left side frame of the terminal.
Optionally, the first key 13 and the second key 14 have the same function. For example, the first key 13 and the second key 14 each function as a power key and can implement operations such as turning the screen on and off and powering the terminal on and off.
In addition, the first key 13 and the second key 14 may be physical keys, or may also be virtual keys, or one of them is a physical key and the other is a virtual key, which is not limited in this disclosure.
FIG. 3 is a flowchart illustrating a method of operation execution in accordance with an exemplary embodiment. In the present embodiment, the method is mainly exemplified as being applied to the terminal 10 having the double-sided screen described above. The method may comprise the following steps (301-305):
in step 301, status information of the terminal is acquired.
The status information is used to indicate the screen state of the terminal, for example, to indicate whether the user holding the terminal is currently facing the first display screen or the second display screen.
In step 302, a target display screen for user interaction is determined according to the status information.
The target display screen is the first display screen or the second display screen. The target display screen may be the display screen the user is currently interacting with, or the display screen the user is about to interact with. Here, "interaction" refers to the user using the target display screen, such as viewing its displayed content or performing touch operations on it such as clicking, sliding, and pressing.
In step 303, a target key for responding to a thumb operation of the user is determined based on the target display screen and the left and right hand setting information.
As described above, the terminal includes the first key and the second key, and the description of the first key and the second key can refer to fig. 1 and fig. 2, which are not described herein again. The target key is the first key or the second key.
The left-right hand setting information is used to indicate whether the user is a right-hand operation user or a left-hand operation user. The left-right hand setting information may be set in the terminal in advance by the user, and for example, the user may set the left-right hand setting information according to a habit of using the terminal at ordinary times.
Optionally, this step has the following four cases:
firstly, if the target display screen is a first display screen and the left-hand and right-hand setting information is used for indicating that the user is a right-hand operation user, determining that a target key used for responding to the thumb operation of the user is a first key;
as shown in fig. 4, the target display screen for the user interaction is the first display screen 11, and the user is the right-handed user, then the target key for responding to the thumb operation of the user is determined to be the first key 13.
Secondly, if the target display screen is the first display screen and the left-hand and right-hand setting information is used for indicating that the user is a left-hand operation user, determining that a target key used for responding to the thumb operation of the user is a second key;
as shown in fig. 5, the target display screen for the user interaction is the first display screen 11, and the user is a left-handed user, then the target key for responding to the thumb operation of the user is determined to be the second key 14.
Thirdly, if the target display screen is a second display screen and the left-hand and right-hand setting information is used for indicating that the user is a right-hand operation user, determining that a target key used for responding to the thumb operation of the user is a second key;
as shown in fig. 6, the target display screen for the user interaction is the second display screen 12, and the user is a right-handed user, the target key for responding to the thumb operation of the user is determined to be the second key 14.
And fourthly, if the target display screen is the second display screen and the left-hand and right-hand setting information is used for indicating that the user is the left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key.
As shown in fig. 7, the target display screen for the user interaction is the second display screen 12, and the user is a left-handed user, then the target key for responding to the thumb operation of the user is determined to be the first key 13.
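The four cases above reduce to a small mapping from (target display screen, handedness) to the key under the user's thumb. The following sketch is an illustration of that mapping, not part of the claimed method; the function and key names are assumptions:

```python
def target_key(screen: str, right_handed: bool) -> str:
    """Map the target screen and handedness to the thumb-side key.

    With the front panel up, the first key sits on the right frame and
    the second key on the left frame; flipping to the second screen
    mirrors which key lands under the thumb.
    """
    if screen == "first":
        return "first_key" if right_handed else "second_key"
    return "second_key" if right_handed else "first_key"
```

Note that a right-handed user on the second screen and a left-handed user on the first screen both end up on the second key, which is exactly the symmetry the four cases describe.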
In step 304, an operation signal corresponding to the target key is received.
The operation signal is triggered by the user clicking or pressing the target key. For example, when the target key is a physical key, the user presses the target key with a thumb, and accordingly, the terminal receives an operation signal corresponding to the target key.
In step 305, an operation corresponding to the operation signal is performed.
When receiving the operation signal corresponding to the target key, the terminal executes the corresponding operation according to the operation signal.
Optionally, this step has the following three cases:
firstly, if the operation signal is a first operation signal, executing screen-off operation or screen-on operation;
Optionally, the first operation signal is a click operation signal. Taking the terminal in the power-on state as an example: when the target display screen is in the bright-screen state and the terminal receives the first operation signal corresponding to the target key, it executes the screen-off operation for the target display screen, that is, it switches the target display screen from the bright-screen state to the screen-off state; when the target display screen is in the screen-off state and the terminal receives the first operation signal corresponding to the target key, it executes the screen-on operation, that is, it switches the target display screen from the screen-off state to the bright-screen state.
Secondly, if the operation signal is a second operation signal, executing a starting operation or a shutdown operation;
optionally, the second operation signal is a long-press operation signal whose duration belongs to the first value interval. The first value interval may be predetermined, for example, the first value interval is greater than 4 seconds. When the terminal is in a power-off state, if the terminal receives a second operation signal corresponding to the target key, executing power-on operation; and when the terminal is in a power-on state, if the terminal receives a second operation signal corresponding to the target key, executing power-off operation.
Thirdly, if the operation signal is the third operation signal, the AI function is started.
Optionally, the third operation signal is a long-press operation signal whose duration belongs to the second value interval. The second value interval may be preset; for example, the second value interval is greater than 2 seconds and less than 4 seconds. When the terminal is in the power-on state, if it receives the third operation signal, it starts an AI function. The AI function is a function of simulating human consciousness and thinking. Optionally, the AI function comprises at least one of the following functions: user authentication, emotion recognition, natural language understanding, AR (Augmented Reality), AI vision, and the like. Illustratively, if the terminal receives the third operation signal while detecting that the user is taking a picture, the terminal may turn on an AI photography function, for example automatically beautifying the picture according to the user's personal aesthetic preferences.
It should be noted that the above description of the first operation signal, the second operation signal and the third operation signal is merely exemplary and explanatory; in practical applications, it is only necessary to ensure that the three operation signals differ from one another. In some other examples, the first operation signal may be a single-click operation signal, the second operation signal a long-press operation signal, and the third operation signal a double-click operation signal, which is not limited by the embodiments of the present disclosure.
Optionally, the keys (including the first key and the second key) are connected to a pin of a PMIC (Power Management Integrated Circuit) through signal traces. The pin outputs a high level whether the terminal is in the power-on state or the power-off state; when a key is touched, the pin outputs a low level. According to the power state and the duration for which the pin outputs the low level, the terminal distinguishes whether the operation corresponding to the operation signal is power-on or power-off, screen-on or screen-off, or starting the AI function.
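The duration-based distinction above can be sketched as follows. This is a minimal illustration, not the patented implementation: it uses the example thresholds from this section (a short click for screen on/off, a long press over 4 seconds for power on/off, a long press between 2 and 4 seconds for the AI function), and the function name and return labels are hypothetical.

```python
def classify_key_signal(low_seconds, powered_on):
    """Map how long the PMIC pin stays low, plus the power state,
    to the operation the terminal should execute.

    Thresholds follow the examples in the text: > 4 s toggles power,
    2-4 s starts the AI function, and a short click toggles the
    screen state of the target display screen.
    """
    if low_seconds > 4:                       # second operation signal
        return "power_off" if powered_on else "power_on"
    if not powered_on:
        return None                           # only a long press acts when powered off
    if low_seconds > 2:                       # third operation signal
        return "start_ai"
    return "toggle_screen"                    # first operation signal (click)
```

Whether "toggle_screen" results in a screen-on or screen-off operation depends on the current bright/off state of the target display screen, which is omitted here for brevity.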
To sum up, in the technical solution provided by the embodiments of the present disclosure, an operation execution method is provided for a terminal with a double-sided screen. Two keys are provided on the two side frames of the double-sided-screen terminal; the terminal can determine, from the display screen the user is interacting with and the left/right-hand setting information, which of the two keys is the target key operated by the user's thumb, and execute the corresponding operation when it receives an operation signal for that key. Because the double-sided-screen terminal has two keys, the user can control the terminal with a different key when interacting through different display screens; whichever display screen is in use, the user can operate a key with the same holding habit, which makes operation convenient.
Optionally, the terminal may determine the target display screen for user interaction in one of the following ways:
Firstly, the state information includes acceleration data collected by an acceleration sensor of the terminal.
At this time, the terminal may determine the target display screen by:
if the acceleration data in the gravity direction belongs to a first value range, determining that a target display screen interacted by the user is a first display screen; and if the acceleration data in the gravity direction belongs to a second value range, determining that the target display screen interacted by the user is a second display screen.
Optionally, the acceleration sensor is a three-axis acceleration sensor, and the three-axis acceleration sensor can acquire acceleration data in three directions of an x axis, a y axis and a z axis.
Alternatively, the acceleration sensor may be a gravity sensor, and the gravity sensor may collect acceleration data in a gravity direction.
It should be noted that the coordinate system of the acceleration sensor is a Cartesian coordinate system, while the terminal itself uses the Android (Google) coordinate system, and an angle mapping relationship exists between the two. For example, assume the axes of the acceleration sensor's coordinate system are X, Y and Z and the axes of the terminal's coordinate system are x, y and z; if the angular difference between x and X is a, between y and Y is b, and between z and Z is c, then x = X + a, y = Y + b, and z = Z + c.
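As a minimal illustration of the angle mapping just described (the function name and the offset values in the usage example are purely hypothetical):

```python
def sensor_to_terminal(X, Y, Z, a, b, c):
    """Convert sensor-frame axis angles (X, Y, Z) to terminal-frame
    angles by adding the fixed per-axis offsets a, b, c, i.e.
    x = X + a, y = Y + b, z = Z + c."""
    return X + a, Y + b, Z + c
```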
The X axis, the Y axis and the Z axis are mutually perpendicular; the plane containing the X axis and the Y axis is parallel to the plane of the first display screen, and the Z axis is perpendicular to the plane of the first display screen. The acceleration data in the gravity direction is the acceleration data in the Z-axis direction.
The first value range may be less than 0 and the second value range greater than 0; alternatively, the first value range may be greater than 0 and the second value range less than 0, which is not limited in the embodiments of the present disclosure.
Exemplarily, referring to fig. 8, which shows a schematic diagram of the coordinate system in this embodiment: the vertex A at the upper-left corner of the terminal's front panel is taken as the coordinate origin, the X axis runs along the upper frame of the front panel, the Y axis runs along the left frame of the front panel, and the Z axis is perpendicular to the plane of the front panel. The direction upward from the origin along the Z axis is the positive Z direction, and the direction downward is the negative Z direction.
With reference to part (a) of fig. 8, assume the first value range is less than 0. When the acceleration data in the gravity direction is less than 0, the acceleration in the gravity direction is along the negative Z direction, meaning that the front panel of the terminal, i.e., the first display screen 11, faces upward; it may therefore be determined that the target display screen for user interaction is the first display screen 11.
With reference to part (b) of fig. 8, assume the second value range is greater than 0. When the acceleration data in the gravity direction is greater than 0, the acceleration in the gravity direction is along the positive Z direction, meaning that the front panel of the terminal faces downward and the back panel, i.e., the second display screen 12, faces upward; it may therefore be determined that the target display screen for user interaction is the second display screen 12.
Optionally, if the acceleration data in the gravity direction is equal to 0, determining that the target display screen interacted by the user is a default display screen, where the default display screen is the first display screen.
Exemplarily, referring to fig. 9, when the acceleration data in the gravity direction is equal to 0, i.e., the acceleration in the gravity direction is along the Y axis (as shown in part (a) of fig. 9) or the X axis (as shown in part (b) of fig. 9), the terminal is placed vertically in portrait or landscape orientation, and the default display screen is the first display screen 11.
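The three cases above (negative, positive, or zero acceleration along the Z axis) can be sketched as follows, assuming the first value range is "less than 0" and the second "greater than 0" as in fig. 8; the function name and return labels are illustrative only.

```python
def target_display_from_gravity(z_accel):
    """Pick the display screen the user faces from the Z-axis gravity
    acceleration: negative -> front panel (first screen) up,
    positive -> back panel (second screen) up, zero -> the terminal
    is held vertically, so fall back to the default (first) screen."""
    if z_accel < 0:
        return "first"
    if z_accel > 0:
        return "second"
    return "first"  # default display screen
```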
Secondly, the state information includes touch information, and the touch information includes a first touch area on the first display screen and a second touch area on the second display screen.
At this time, the terminal may determine the target display screen by:
if the first touch area is smaller than the second touch area, determining that a target display screen interacted by the user is the first display screen; and if the first touch area is larger than the second touch area, determining that the target display screen interacted by the user is the second display screen.
Exemplarily, referring to figs. 4 and 6, when the user holds the terminal with the palm touching the back panel (the second display screen 12) and the fingers touching the front panel (the first display screen 11), the first touch area is smaller than the second touch area, and the target display screen for user interaction is determined to be the first display screen 11. Conversely, when the palm touches the front panel (the first display screen 11) and the fingers touch the back panel (the second display screen 12), the first touch area is larger than the second touch area, and the target display screen for user interaction is determined to be the second display screen 12.
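A minimal sketch of the touch-area comparison; since the text does not specify the equal-area case, this sketch falls back to the first (default) screen, and the names are hypothetical:

```python
def target_display_from_touch(first_area, second_area):
    """The smaller touch area is assumed to be fingers on the screen
    being used; the larger area is the palm on the opposite screen."""
    if first_area < second_area:
        return "first"
    if first_area > second_area:
        return "second"
    return "first"  # tie: not covered by the text; assume the default screen
```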
Thirdly, the state information includes holding information, and the holding information includes the number of touch points on the right side frame and the number of touch points on the left side frame.
At this time, the terminal may determine the target display screen by:
if the user is a right-handed operation user, determining that a target display screen interacted by the user is a first display screen when the number of touch points on the right side frame is smaller than that of touch points on the left side frame; when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that a target display screen interacted by the user is a second display screen;
if the user is a left-handed operation user, determining that the target display screen interacted by the user is a second display screen when the number of the touch points on the right side frame is smaller than that of the touch points on the left side frame; and when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the first display screen.
For example, as shown in fig. 5, when the user is a left-handed user, if the number of touch points on the first frame (the right frame of the terminal with the front panel facing upward) is 4, which is greater than the 1 touch point on the second frame (the left frame with the front panel facing upward), the target display screen for user interaction is determined to be the first display screen 11. As shown in fig. 7, if the number of touch points on the first frame is 1, which is less than the 4 touch points on the second frame, the target display screen for user interaction is determined to be the second display screen 12.
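The grip-based rule above reduces to a small decision table. A sketch, with the equal-count case left undefined since the text does not cover it (names are illustrative):

```python
def target_display_from_grip(right_points, left_points, right_handed):
    """Combine the touch-point counts on the two side frames with the
    left/right-hand setting to pick the screen the user faces. The
    thumb side of the gripping hand leaves fewer touch points than
    the four-finger side, so handedness flips the interpretation."""
    if right_points == left_points:
        return None  # not covered by the text
    fewer_on_right = right_points < left_points
    if right_handed:
        return "first" if fewer_on_right else "second"
    return "second" if fewer_on_right else "first"
```

For instance, the fig. 5 case (left-handed user, 4 touch points on the right frame, 1 on the left) yields the first display screen, matching the text.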
Optionally, the terminal further includes a first pressure sensor disposed on the right side frame and a second pressure sensor disposed on the left side frame, and at this time, the terminal may further obtain sensor data acquired by the first pressure sensor and sensor data acquired by the second pressure sensor; determining the number of touch points on the right side frame according to the sensor data acquired by the first pressure sensor; and determining the number of touch points on the left side frame according to the sensor data acquired by the second pressure sensor.
The pressure sensor may be composed of multiple pressure subunits. A subunit outputs a low level, represented by the value 0, when no pressure is applied, and a high level, represented by the value 1, when pressure is applied; the number of 1 values in the data collected by the pressure sensor is the number of touch points.
In addition, the number of the touch points can be acquired through the first touch sensor arranged on the right side frame and the second touch sensor arranged on the left side frame, or through the first distance sensor arranged on the right side frame and the second distance sensor arranged on the left side frame, or through other sensors, which is not limited in the embodiment of the disclosure.
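Counting touch points from the pressure-subunit readout described above is then a matter of counting high levels; a sketch under the text's simplifying assumption that each high subunit corresponds to one touch point:

```python
def count_touch_points(subunit_levels):
    """Each pressure subunit reads 0 (low, no pressure) or 1 (high,
    pressure applied); per the text, the number of 1s in a frame's
    readout is taken directly as the number of touch points."""
    return sum(1 for level in subunit_levels if level == 1)
```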
In summary, in the technical solution provided by the embodiments of the present disclosure, the display screen the user is interacting with is determined from the acquired state information of the terminal, which makes this determination highly accurate.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 10 is a block diagram illustrating an operation execution apparatus according to an exemplary embodiment. The apparatus has the functions of implementing the above method examples; the functions may be implemented by hardware, or by hardware executing corresponding software. The apparatus may be a terminal or may be disposed in a terminal. The terminal includes a first display screen disposed on its front panel and a second display screen disposed on its back panel. The apparatus 1000 may include: an information acquisition module 1010, a display screen determination module 1020, a key determination module 1030, a signal reception module 1040, and an operation execution module 1050.
The information obtaining module 1010 is configured to obtain status information of the terminal; the state information of the terminal is used for indicating the screen state of the terminal.
The display screen determining module 1020 is configured to determine a target display screen for user interaction according to the status information, where the target display screen is the first display screen or the second display screen.
The key determination module 1030 is configured to determine a target key for responding to a thumb operation of the user according to the target display screen and left/right-hand setting information, where the target key is a first key or a second key; the first key is located on the right side frame of the terminal, the second key is located on the left side frame of the terminal, and the left/right-hand setting information is used for indicating whether the user is a right-handed or left-handed user with the front panel facing upward.
The signal receiving module 1040 is configured to receive an operation signal corresponding to the target key.
The operation execution module 1050 is configured to execute an operation corresponding to the operation signal.
To sum up, in the technical solution provided by the embodiments of the present disclosure, an operation execution method is provided for a terminal with a double-sided screen. Two keys are provided on the two side frames of the double-sided-screen terminal; the terminal can determine, from the display screen the user is interacting with and the left/right-hand setting information, which of the two keys is the target key operated by the user's thumb, and execute the corresponding operation when it receives an operation signal for that key. Because the double-sided-screen terminal has two keys, the user can control the terminal with a different key when interacting through different display screens; whichever display screen is in use, the user can operate a key with the same holding habit, which makes operation convenient.
Optionally, the key determining module 1030 is configured to:
when the target display screen is the first display screen and the left-hand and right-hand setting information is used for indicating that the user is the right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key;
when the target display screen is the first display screen and the left-right hand setting information is used for indicating that the user is the left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
when the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is a right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
and when the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is a left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key.
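The four cases enumerated above form a simple mapping from (target display screen, handedness) to the target key; a sketch with hypothetical labels:

```python
def target_key(display, right_handed):
    """display is "first" (front screen) or "second" (back screen).
    The first key sits on the right frame and the second key on the
    left frame, with the front panel facing upward; flipping the
    terminal swaps which physical side the thumb reaches, so the
    key choice inverts for the second display screen."""
    if display == "first":
        return "first_key" if right_handed else "second_key"
    return "second_key" if right_handed else "first_key"
```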
Optionally, the operation execution module 1050 is configured to:
when the operation signal is a first operation signal, executing screen-off operation or screen-on operation;
or,
when the operation signal is a second operation signal, executing a starting operation or a shutdown operation;
or,
and when the operation signal is a third operation signal, starting an AI function.
Optionally, the state information includes acceleration data acquired by an acceleration sensor of the terminal;
the display screen determination module 1020 configured to:
if the acceleration data in the gravity direction belongs to a first value range, determining that the target display screen interacted by the user is the first display screen;
and if the acceleration data in the gravity direction belongs to a second value range, determining that the target display screen interacted by the user is the second display screen.
Optionally, the state information includes touch information, and the touch information includes a first touch area on the first display screen and a second touch area on the second display screen;
the display screen determination module 1020 configured to:
when the first touch area is smaller than the second touch area, determining that the target display screen interacted by the user is the first display screen;
and when the first touch area is larger than the second touch area, determining that the target display screen interacted by the user is the second display screen.
Optionally, the state information includes holding information, and the holding information includes the number of touch points on the right side frame and the number of touch points on the left side frame;
the display screen determination module 1020 configured to:
if the user is the right-handed operation user, determining that the target display screen interacted by the user is the first display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the second display screen;
if the user is the left-handed operation user, determining that the target display screen interacted by the user is the second display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; and when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the first display screen.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
An exemplary embodiment of the present disclosure also provides an operation execution apparatus, which can implement the operation execution method provided by the present disclosure. The apparatus may be the terminal described above, or may be provided in the terminal. The terminal comprises a first display screen and a second display screen, wherein the first display screen is arranged on a front panel of the terminal, and the second display screen is arranged on a back panel of the terminal. The device includes: a processor; and a memory for storing executable instructions of the processor. Wherein the processor is configured to:
acquiring state information of the terminal; the state information of the terminal is used for indicating the screen state of the terminal;
determining a target display screen interacted by a user according to the state information, wherein the target display screen is the first display screen or the second display screen;
determining a target key for responding to the thumb operation of the user according to the target display screen and the left-right hand setting information, wherein the target key is a first key or a second key; the first key is located on the right side frame of the terminal, the second key is located on the left side frame of the terminal, and the left-hand and right-hand setting information is used for indicating whether the user is a right-hand operation user or a left-hand operation user in the state that the front panel faces upwards;
receiving an operation signal corresponding to the target key;
and executing the operation corresponding to the operation signal.
Optionally, the processor is configured to:
when the target display screen is the first display screen and the left-hand and right-hand setting information is used for indicating that the user is the right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key;
when the target display screen is the first display screen and the left-right hand setting information is used for indicating that the user is the left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
when the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is a right-hand operation user, determining that the target key used for responding to the thumb operation of the user is the second key;
and when the target display screen is the second display screen and the left-right hand setting information is used for indicating that the user is a left-hand operation user, determining that the target key used for responding to the thumb operation of the user is the first key.
Optionally, the processor is configured to:
when the operation signal is a first operation signal, executing screen-off operation or screen-on operation;
or,
when the operation signal is a second operation signal, executing a starting operation or a shutdown operation;
or,
and when the operation signal is a third operation signal, starting an AI function.
Optionally, the state information includes acceleration data acquired by an acceleration sensor of the terminal;
the processor is configured to:
when the acceleration data in the gravity direction belongs to a first value range, determining the target display screen interacted by the user as the first display screen;
and when the acceleration data in the gravity direction belongs to a second value range, determining that the target display screen interacted by the user is the second display screen.
Optionally, the state information includes touch information, and the touch information includes a first touch area on the first display screen and a second touch area on the second display screen;
the processor is configured to:
when the first touch area is smaller than the second touch area, determining that the target display screen interacted by the user is the first display screen;
and when the first touch area is larger than the second touch area, determining that the target display screen interacted by the user is the second display screen.
Optionally, the state information includes holding information, and the holding information includes the number of touch points on the right side frame and the number of touch points on the left side frame;
the processor is configured to:
if the user is the right-handed operation user, determining that the target display screen interacted by the user is the first display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the second display screen;
if the user is the left-handed operation user, determining that the target display screen interacted by the user is the second display screen when the number of touch points on the right side frame is smaller than that of the touch points on the left side frame; and when the number of the touch points on the right side frame is larger than that of the touch points on the left side frame, determining that the target display screen interacted by the user is the first display screen.
Fig. 11 is a block diagram illustrating an operation performing apparatus 1100 according to another exemplary embodiment. For example, the device 1100 includes a first display screen disposed on a front panel of the device 1100 and a second display screen disposed on a back panel of the device 1100. For example, the apparatus 1100 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 11, apparatus 1100 may include one or more of the following components: processing component 1102, memory 1104, power component 1106, multimedia component 1108, audio component 1110, input/output (I/O) interface(s) 1112, sensor component 1114, and communications component 1116.
The processing component 1102 generally controls the overall operation of the device 1100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1102 may include one or more processors 1120 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 1102 may include one or more modules that facilitate interaction between the processing component 1102 and other components. For example, the processing component 1102 may include a multimedia module to facilitate interaction between the multimedia component 1108 and the processing component 1102.
The memory 1104 is configured to store various types of data to support operations at the apparatus 1100. Examples of such data include instructions for any application or method operating on device 1100, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1104 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power component 1106 provides power to the various components of the device 1100. The power components 1106 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 1100.
The multimedia component 1108 includes a screen that provides an output interface between the device 1100 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1108 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1100 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1110 is configured to output and/or input audio signals. For example, the audio component 1110 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1100 is in operating modes, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1104 or transmitted via the communication component 1116. In some embodiments, the audio assembly 1110 further includes a speaker for outputting audio signals.
The I/O interface 1112 provides an interface between the processing component 1102 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1114 includes one or more sensors for providing various aspects of state assessment for the apparatus 1100. For example, the sensor assembly 1114 may detect an open/closed state of the apparatus 1100, the relative positioning of components, such as a display and keypad of the apparatus 1100, the sensor assembly 1114 may also detect a change in position of the apparatus 1100 or a component of the apparatus 1100, the presence or absence of user contact with the apparatus 1100, orientation or acceleration/deceleration of the apparatus 1100, and a change in temperature of the apparatus 1100. The sensor assembly 1114 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1114 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1114 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1116 is configured to facilitate wired or wireless communication between the apparatus 1100 and other devices. The apparatus 1100 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, 3G, 4G, or 5G, or a subsequent evolved system, or a combination thereof. In an exemplary embodiment, the communication component 1116 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1116 also includes a Near Field Communication (NFC) module to facilitate short-range communications.
In an exemplary embodiment, the apparatus 1100 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, on which a computer program is stored, the computer program being executable by the processor 1120 of the apparatus 1100 to perform the above-described operation performing method.
For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It should be understood that reference to "a plurality" herein means two or more. "And/or" describes an association between objects and indicates that three relationships may exist: for example, "A and/or B" may mean A alone, both A and B, or B alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An operation execution method, applied to a terminal, wherein the terminal comprises a first display screen and a second display screen, the first display screen is arranged on a front panel of the terminal, and the second display screen is arranged on a back panel of the terminal; the method comprises the following steps:
acquiring state information of the terminal, wherein the state information is used for indicating a screen state of the terminal;
determining, according to the state information, a target display screen with which a user is interacting, wherein the target display screen is the first display screen or the second display screen;
determining, according to the target display screen and left-right hand setting information, a target key for responding to a thumb operation of the user, wherein the target key is a first key or a second key; the first key is located on a right side frame of the terminal, the second key is located on a left side frame of the terminal, and the left-right hand setting information indicates whether the user is a right-hand operation user or a left-hand operation user in a state in which the front panel faces upward;
receiving an operation signal corresponding to the target key; and
executing an operation corresponding to the operation signal.
2. The method of claim 1, wherein determining, according to the target display screen and the left-right hand setting information, the target key for responding to the thumb operation of the user comprises:
if the target display screen is the first display screen and the left-right hand setting information indicates that the user is the right-hand operation user, determining that the target key for responding to the thumb operation of the user is the first key;
if the target display screen is the first display screen and the left-right hand setting information indicates that the user is the left-hand operation user, determining that the target key for responding to the thumb operation of the user is the second key;
if the target display screen is the second display screen and the left-right hand setting information indicates that the user is the right-hand operation user, determining that the target key for responding to the thumb operation of the user is the second key; and
if the target display screen is the second display screen and the left-right hand setting information indicates that the user is the left-hand operation user, determining that the target key for responding to the thumb operation of the user is the first key.
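The mapping enumerated in claims 1 and 2 can be sketched in code. This is an illustrative sketch only: the function and constant names (`select_target_key`, `FIRST_KEY`, the `"first"`/`"second"` display labels) are assumptions, not identifiers from the patent.

```python
FIRST_KEY = "first_key"    # located on the right side frame
SECOND_KEY = "second_key"  # located on the left side frame

def select_target_key(target_display: str, right_handed: bool) -> str:
    """Pick the side key that the user's thumb can reach.

    With the front (first) display facing the user, the thumb rests on the
    same side as the dominant hand; when the terminal is flipped so that
    the back (second) display faces the user, left and right are mirrored.
    """
    if target_display == "first":
        return FIRST_KEY if right_handed else SECOND_KEY
    if target_display == "second":
        return SECOND_KEY if right_handed else FIRST_KEY
    raise ValueError(f"unknown display: {target_display}")

print(select_target_key("first", right_handed=True))   # first_key
print(select_target_key("second", right_handed=True))  # second_key
```

Flipping the terminal swaps which physical frame is under the dominant thumb, which is why the second-display rows invert the first-display rows.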
3. The method of claim 1, wherein executing the operation corresponding to the operation signal comprises:
if the operation signal is a first operation signal, executing a screen-off operation or a screen-on operation;
or,
if the operation signal is a second operation signal, executing a power-on operation or a power-off operation;
or,
if the operation signal is a third operation signal, starting an Artificial Intelligence (AI) function.
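The three-way dispatch in claim 3 can be sketched as follows. The signal identifiers and the returned action names are assumptions for illustration; the patent does not define how the three operation signals are encoded.

```python
def execute_operation(signal: str, screen_on: bool, powered_on: bool) -> str:
    """Map a side-key operation signal to one of claim 3's actions."""
    if signal == "first":
        # first operation signal toggles the screen state
        return "screen_off" if screen_on else "screen_on"
    if signal == "second":
        # second operation signal toggles the power state
        return "shutdown" if powered_on else "power_on"
    if signal == "third":
        # third operation signal starts the AI function
        return "start_ai_function"
    raise ValueError(f"unknown signal: {signal}")

print(execute_operation("first", screen_on=True, powered_on=True))  # screen_off
```

Treating the first two signals as toggles (off when on, on when off) is one reading of "screen-off operation or screen-on operation"; the claim itself leaves the choice between the two alternatives open.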
4. The method according to any one of claims 1 to 3, wherein the state information comprises acceleration data collected by an acceleration sensor of the terminal; and
determining, according to the state information, the target display screen with which the user is interacting comprises:
if the acceleration data in the gravity direction falls within a first value range, determining that the target display screen with which the user is interacting is the first display screen; and
if the acceleration data in the gravity direction falls within a second value range, determining that the target display screen with which the user is interacting is the second display screen.
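A minimal sketch of claim 4's accelerometer test. The concrete value ranges and the sign convention are assumptions: with the front panel up, the gravity-axis reading sits near +9.8 m/s², and near −9.8 m/s² when the terminal is flipped. The patent only requires two disjoint value ranges.

```python
FIRST_RANGE = (5.0, 15.0)     # assumed "first value range": front panel up
SECOND_RANGE = (-15.0, -5.0)  # assumed "second value range": back panel up

def display_from_acceleration(a_gravity: float) -> str:
    """Infer the display facing the user from gravity-axis acceleration (m/s^2)."""
    if FIRST_RANGE[0] <= a_gravity <= FIRST_RANGE[1]:
        return "first"
    if SECOND_RANGE[0] <= a_gravity <= SECOND_RANGE[1]:
        return "second"
    return "undetermined"  # e.g. terminal held roughly vertical

print(display_from_acceleration(9.8))   # first
print(display_from_acceleration(-9.8))  # second
```

Readings outside both ranges (terminal on edge) fall into neither claim branch, so the sketch reports them as undetermined.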
5. The method according to any one of claims 1 to 3, wherein the state information comprises touch information, the touch information comprising a first touch area on the first display screen and a second touch area on the second display screen; and
determining, according to the state information, the target display screen with which the user is interacting comprises:
if the first touch area is smaller than the second touch area, determining that the target display screen with which the user is interacting is the first display screen; and
if the first touch area is larger than the second touch area, determining that the target display screen with which the user is interacting is the second display screen.
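Claim 5's comparison rests on a simple observation: the screen covered by the holding palm and fingers registers the larger touch area and faces away from the user, so the smaller touch area marks the display being viewed. A sketch, with illustrative names:

```python
def display_from_touch_area(first_area: float, second_area: float) -> str:
    """Infer the viewed display from per-screen touch areas (e.g. in mm^2)."""
    if first_area < second_area:
        return "first"   # palm rests on the back (second) screen
    if first_area > second_area:
        return "second"  # palm rests on the front (first) screen
    return "undetermined"  # equal areas: not covered by claim 5
```

The equal-area case is left undefined by the claim, so the sketch returns a sentinel rather than guessing.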
6. The method according to any one of claims 1 to 3, wherein the state information comprises holding information, the holding information comprising the number of touch points on the right side frame and the number of touch points on the left side frame; and
determining, according to the state information, the target display screen with which the user is interacting comprises:
if the user is the right-hand operation user: when the number of touch points on the right side frame is smaller than the number of touch points on the left side frame, determining that the target display screen with which the user is interacting is the first display screen; and when the number of touch points on the right side frame is larger than the number of touch points on the left side frame, determining that the target display screen with which the user is interacting is the second display screen; and
if the user is the left-hand operation user: when the number of touch points on the right side frame is smaller than the number of touch points on the left side frame, determining that the target display screen with which the user is interacting is the second display screen; and when the number of touch points on the right side frame is larger than the number of touch points on the left side frame, determining that the target display screen with which the user is interacting is the first display screen.
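Claim 6's grip heuristic compacts to a small truth table: the side frame with fewer touch points is the thumb side, and the frame-to-display mapping mirrors with handedness. A sketch with assumed names:

```python
def display_from_grip(right_points: int, left_points: int,
                      right_handed: bool) -> str:
    """Infer the viewed display from side-frame touch-point counts."""
    if right_points == left_points:
        return "undetermined"  # equal counts: not covered by claim 6
    fewer_on_right = right_points < left_points
    if right_handed:
        # right thumb on right frame -> front (first) display faces the user
        return "first" if fewer_on_right else "second"
    # left-handed grip mirrors the mapping
    return "second" if fewer_on_right else "first"

print(display_from_grip(1, 4, right_handed=True))  # first
```

Note how this composes with claim 2: once the display is inferred, the same handedness flag selects which side key should respond to the thumb.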
7. An operation execution device, applied to a terminal, wherein the terminal comprises a first display screen and a second display screen, the first display screen is arranged on a front panel of the terminal, and the second display screen is arranged on a back panel of the terminal; the device comprises:
an information acquisition module configured to acquire state information of the terminal, the state information being used for indicating a screen state of the terminal;
a display screen determination module configured to determine, according to the state information, a target display screen with which a user is interacting, the target display screen being the first display screen or the second display screen;
a key determination module configured to determine, according to the target display screen and left-right hand setting information, a target key for responding to a thumb operation of the user, the target key being a first key or a second key, wherein the first key is located on a right side frame of the terminal, the second key is located on a left side frame of the terminal, and the left-right hand setting information indicates whether the user is a right-hand operation user or a left-hand operation user in a state in which the front panel faces upward;
a signal receiving module configured to receive an operation signal corresponding to the target key; and
an operation execution module configured to execute an operation corresponding to the operation signal.
8. The apparatus of claim 7, wherein the key determination module is configured to:
when the target display screen is the first display screen and the left-right hand setting information indicates that the user is the right-hand operation user, determine that the target key for responding to the thumb operation of the user is the first key;
when the target display screen is the first display screen and the left-right hand setting information indicates that the user is the left-hand operation user, determine that the target key for responding to the thumb operation of the user is the second key;
when the target display screen is the second display screen and the left-right hand setting information indicates that the user is the right-hand operation user, determine that the target key for responding to the thumb operation of the user is the second key; and
when the target display screen is the second display screen and the left-right hand setting information indicates that the user is the left-hand operation user, determine that the target key for responding to the thumb operation of the user is the first key.
9. The apparatus of claim 7, wherein the operation execution module is configured to:
when the operation signal is a first operation signal, execute a screen-off operation or a screen-on operation;
or,
when the operation signal is a second operation signal, execute a power-on operation or a power-off operation;
or,
when the operation signal is a third operation signal, start an Artificial Intelligence (AI) function.
10. The apparatus according to any one of claims 7 to 9, wherein the state information comprises acceleration data collected by an acceleration sensor of the terminal; and
the display screen determination module is configured to:
when the acceleration data in the gravity direction falls within a first value range, determine that the target display screen with which the user is interacting is the first display screen; and
when the acceleration data in the gravity direction falls within a second value range, determine that the target display screen with which the user is interacting is the second display screen.
11. The apparatus according to any one of claims 7 to 9, wherein the state information comprises touch information, the touch information comprising a first touch area on the first display screen and a second touch area on the second display screen; and
the display screen determination module is configured to:
when the first touch area is smaller than the second touch area, determine that the target display screen with which the user is interacting is the first display screen; and
when the first touch area is larger than the second touch area, determine that the target display screen with which the user is interacting is the second display screen.
12. The apparatus according to any one of claims 7 to 9, wherein the state information comprises holding information, the holding information comprising the number of touch points on the right side frame and the number of touch points on the left side frame; and
the display screen determination module is configured to:
if the user is the right-hand operation user: when the number of touch points on the right side frame is smaller than the number of touch points on the left side frame, determine that the target display screen with which the user is interacting is the first display screen; and when the number of touch points on the right side frame is larger than the number of touch points on the left side frame, determine that the target display screen with which the user is interacting is the second display screen; and
if the user is the left-hand operation user: when the number of touch points on the right side frame is smaller than the number of touch points on the left side frame, determine that the target display screen with which the user is interacting is the second display screen; and when the number of touch points on the right side frame is larger than the number of touch points on the left side frame, determine that the target display screen with which the user is interacting is the first display screen.
13. An operation execution device, applied to a terminal, wherein the terminal comprises a first display screen and a second display screen, the first display screen is arranged on a front panel of the terminal, and the second display screen is arranged on a back panel of the terminal;
the device comprises:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to:
acquire state information of the terminal, the state information being used for indicating a screen state of the terminal;
determine, according to the state information, a target display screen with which a user is interacting, wherein the target display screen is the first display screen or the second display screen;
determine, according to the target display screen and left-right hand setting information, a target key for responding to a thumb operation of the user, wherein the target key is a first key or a second key; the first key is located on a right side frame of the terminal, the second key is located on a left side frame of the terminal, and the left-right hand setting information indicates whether the user is a right-hand operation user or a left-hand operation user in a state in which the front panel faces upward;
receive an operation signal corresponding to the target key; and
execute an operation corresponding to the operation signal.
14. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN201910234591.1A 2019-03-26 2019-03-26 Operation execution method, device and storage medium Pending CN111756907A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910234591.1A CN111756907A (en) 2019-03-26 2019-03-26 Operation execution method, device and storage medium


Publications (1)

Publication Number Publication Date
CN111756907A true CN111756907A (en) 2020-10-09

Family

ID=72670965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910234591.1A Pending CN111756907A (en) 2019-03-26 2019-03-26 Operation execution method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111756907A (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870034A (en) * 2012-12-10 2014-06-18 国基电子(上海)有限公司 Touch device and control method thereof
CN104020694A (en) * 2014-06-16 2014-09-03 法视网络传媒技术(北京)有限公司 Energy-saving wearable device capable of being triggered through one key, mobile terminal, and triggering method and device
CN104572215A (en) * 2015-01-27 2015-04-29 四川盛利兴电子科技有限公司 Method and device for controlling startup and shutdown of intelligent terminal
CN105183156A (en) * 2015-08-31 2015-12-23 小米科技有限责任公司 Screen control method and apparatus
CN105468269A (en) * 2014-08-15 2016-04-06 深圳市中兴微电子技术有限公司 Mobile terminal capable of automatically identifying holding by left hand or right hand, and implementation method thereof
CN105898055A (en) * 2016-04-08 2016-08-24 广东欧珀移动通信有限公司 Mobile terminal dormancy method, device and mobile terminal
CN106020871A (en) * 2016-05-11 2016-10-12 青岛海信移动通信技术股份有限公司 Screen starting method and device for mobile equipment
CN106572207A (en) * 2016-10-31 2017-04-19 努比亚技术有限公司 Terminal single hand mode identification device and method
CN106657667A (en) * 2017-01-09 2017-05-10 努比亚技术有限公司 Apparatus and method for lighting display screen
CN107102733A (en) * 2017-04-14 2017-08-29 宇龙计算机通信科技(深圳)有限公司 A kind of electronic equipment touch-screen control method and device
CN107145297A (en) * 2017-05-08 2017-09-08 北京小米移动软件有限公司 Electronic equipment
WO2018076506A1 (en) * 2016-10-25 2018-05-03 华为技术有限公司 Method for lighting up screen of double-screen terminal, and terminal
CN108777731A (en) * 2018-05-17 2018-11-09 Oppo广东移动通信有限公司 Key configurations method, apparatus, mobile terminal and storage medium
CN108848256A (en) * 2018-05-28 2018-11-20 维沃移动通信有限公司 A kind of key control method and double screen terminal of double screen terminal
CN108897451A (en) * 2018-06-29 2018-11-27 努比亚技术有限公司 The recognition methods of major-minor display screen, mobile terminal and computer readable storage medium
CN109063444A (en) * 2018-07-23 2018-12-21 努比亚技术有限公司 Mobile terminal screen unlocking method, mobile terminal and computer readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201009