CN114756165A - Equipment control method and device - Google Patents

Equipment control method and device

Info

Publication number
CN114756165A
CN114756165A (application CN202210455143.6A)
Authority
CN
China
Prior art keywords
input
display screen
main display
user
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210455143.6A
Other languages
Chinese (zh)
Inventor
曾亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210455143.6A priority Critical patent/CN114756165A/en
Publication of CN114756165A publication Critical patent/CN114756165A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a device control method and apparatus, belonging to the technical field of communication and applied to a dual-screen device that includes a main display screen and a secondary display screen. The method includes: receiving a first input from a user at a first position on the secondary display screen; mapping the first input to a second position on the main display screen according to the first position and the positional correspondence between the secondary display screen and the main display screen; receiving a second input from the user on the main display screen; and performing an operation corresponding to the second input and the input mapped to the second position.

Description

Equipment control method and device
Technical Field
The application belongs to the technical field of communication, and particularly relates to a device control method and device.
Background
In recent years, with upgrades to the hardware configuration of electronic devices, their screens have grown larger and the touch gestures used to control them have grown richer, such as two-finger zooming and dragging, and three-finger swipe-down screen capture. However, ever-larger screens, while visually convenient, also bring problems: for example, it is difficult for a user to operate the device with one hand.
Disclosure of Invention
The embodiments of the present application aim to provide a device control method and apparatus, which can solve the prior-art problem that such devices are difficult to operate with one hand.
In a first aspect, an embodiment of the present application provides a device control method, which is applied to a dual-screen device, where the dual-screen device includes a main display screen and a secondary display screen, and the method includes:
receiving a first input of a user to a first position of the secondary display screen;
mapping the first input to a second position of the main display screen according to the first position and the position corresponding relation between the auxiliary display screen and the main display screen;
receiving a second input from the user on the main display screen;
performing an operation corresponding to the second input and the input mapped on the second location.
In a second aspect, an embodiment of the present application provides a device control apparatus, which is applied to a dual-screen device, where the dual-screen device includes a main display screen and a secondary display screen, and the apparatus includes:
a first receiving module, configured to receive a first input from a user at a first position on the secondary display screen;
a mapping module, configured to map the first input to a second position on the main display screen according to the first position and the positional correspondence between the secondary display screen and the main display screen;
a second receiving module, configured to receive a second input from the user on the main display screen;
a processing module, configured to perform an operation corresponding to the second input and the input mapped to the second position.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, which is stored in a storage medium and executed by at least one processor to implement the method according to the first aspect.
In an embodiment of the present application, a dual-screen device may receive a first input from a user at a first position on the secondary display screen, map the first input to a second position on the main display screen according to the first position and the positional correspondence between the secondary and main display screens, receive a second input from the user on the main display screen, and perform an operation corresponding to the second input and the input mapped to the second position. In this way, the embodiments of the application exploit the front/back screen layout of the dual-screen device and the way a single hand grips it: using a touch pass-through technique, the finger touch on the secondary display screen is transmitted to the corresponding position on the main display screen, and through the linkage of the finger on the secondary display screen with the finger on the main display screen, one-handed operation of the dual-screen device is achieved, so that the user can conveniently operate the device while holding it in one hand.
Drawings
Fig. 1 is an exemplary diagram of a dual-screen device provided in an embodiment of the present application;
Fig. 2 is an exemplary diagram of another dual-screen device provided in an embodiment of the present application;
Fig. 3 is a flowchart of a device control method provided in an embodiment of the present application;
FIG. 4 is a diagram of a first example of a two-finger touch input provided by an embodiment of the application;
FIG. 5 is a diagram illustrating a second example of a two-finger touch input provided by an embodiment of the present application;
FIG. 6 is an exemplary diagram of a multi-finger touch input provided by an embodiment of the present application;
FIG. 7 is a diagram of a third example of a two-finger touch input provided by an embodiment of the present application;
fig. 8 is a block diagram of a device control apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 10 is a hardware structure diagram of an electronic device implementing various embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements, and not necessarily to describe a particular sequential or chronological order. It should be appreciated that data so termed may be interchanged under appropriate circumstances, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like do not limit quantity; for example, a first object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates that the preceding and following objects are in an "or" relationship.
In recent years, with upgrades to the hardware configuration of electronic devices, their screens have grown larger and the touch gestures used to control them have grown richer, such as two-finger zooming and dragging, and three-finger swipe-down screen capture. However, ever-larger screens, while visually convenient, also bring problems: for example, it becomes increasingly difficult for a user to operate the device with one hand. To solve this technical problem, the embodiments of the present application provide a device control method and apparatus.
The device control method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
It should be noted that the device control method provided in the embodiments of the present application is applicable to an electronic device, specifically a dual-screen electronic device, where the dual-screen device includes a main display screen and a secondary display screen; the main display screen is disposed on the front side of the device, and the secondary display screen is disposed on the back side. In practical applications, the structure of the dual-screen electronic device may be as shown in fig. 1 or fig. 2.
Fig. 3 is a flowchart of a device control method provided in an embodiment of the present application. As shown in fig. 3, the method may include the following steps 301 to 304.
In step 301, a first input from a user to a first position of a secondary display screen is received.
In this embodiment, the first input may be a touch input (also referred to as a touch-screen operation) made by the user on the secondary display screen with one or more fingers. Alternatively, the first input may be a gesture input.
In some embodiments, consider that when a user holds the device with one hand, the thumb usually operates the front of the device while the other fingers rest on the back, and those fingers can reach only a limited range of the back display screen. For the user's convenience, a target operation area (also referred to as a "secondary-screen operation area") may therefore be set on the secondary display screen, and the user performs the input within this area; in this case, the first position is located in the target operation area of the secondary display screen.
The target operation area is smaller than the secondary display screen, and position feature points within it correspond to position feature points on the main display screen; for example, each point in the area can be mapped proportionally (at an equal ratio) to the full screen area of the main display. The target operation area may be located anywhere on the secondary display screen; considering that users are more accustomed to operating in the upper region of the secondary display, it may be placed in the upper portion.
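The equal-ratio mapping described above can be sketched as follows. This is a minimal illustration, not from the patent itself; the region representation and function name are hypothetical.

```python
def map_to_main_display(x, y, region, main_size):
    """Map a touch point inside the secondary-screen target operation
    area to the corresponding point on the main display, using the
    equal-ratio (proportional) correspondence described above.

    region    -- (left, top, width, height) of the target operation area
    main_size -- (width, height) of the main display screen
    """
    left, top, rw, rh = region
    mw, mh = main_size
    # Normalize the point within the operation area, then scale it to
    # the full main-display area.
    nx = (x - left) / rw
    ny = (y - top) / rh
    return (nx * mw, ny * mh)
```

For instance, the center of a 100x200 operation area maps to the center of a 1080x2400 main display.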
In some embodiments, no special secondary-screen operation area is provided, and the user may input at any position on the secondary display screen; in this case, position feature points on the secondary display screen correspond to position feature points on the main display screen.
In step 302, the first input is mapped to a second position on the main display screen according to the first position and the positional correspondence between the secondary display screen and the main display screen.
In an embodiment of the application, the second position is a position on the main display screen corresponding to the first position on the secondary display screen.
In the embodiments of the application, the first input received on the secondary display screen is mapped to the main display screen to simulate user input on the main display screen; the simulated input is then combined with the user's subsequent real input on the main display screen to achieve one-handed operation of the device.
In this embodiment, to help the user intuitively perceive the simulated input on the main display screen (i.e., the input mapped from the secondary display screen to the main display screen), after the first input is mapped to the second position, a pointer icon control representing the input may be displayed at the second position on the main display screen. That is, the main display synchronously displays the user input made on the secondary display, indicating that this input will participate, together with the user's input on the main display, in the overall operation.
In practical applications, the pointer icon control (also referred to as a "secondary-screen pointer") may be displayed on the main display screen as a small semi-transparent circle.
In step 303, a second input from the user to the main display screen is received.
In this embodiment, the second input may be a touch input (also referred to as a touch-screen operation) made by the user on the main display screen with one finger. Alternatively, the second input may be a gesture input.
In the embodiments of the application, when the first input and the second input are both touch inputs, after the user places one or more fingers on the secondary display screen, another finger can touch the main display screen; at this point, multiple fingers across the main and secondary display screens participate in the touch-screen operation.
For example, as shown in fig. 4, a user touches one finger on the secondary display screen and touches the other finger on the primary display screen, at this time, two fingers on the display screen participate in the touch screen operation, and in addition, a pointer icon control corresponding to the touch input on the secondary display screen, that is, a secondary screen pointer, is displayed on the primary display screen.
For another example, as shown in fig. 6, the user touches two fingers on the secondary display screen and touches another finger on the primary display screen, at this time, three fingers on the display screen participate in the touch screen operation, and two pointer icon controls corresponding to the touch input on the secondary display screen, that is, secondary screen pointers, are displayed on the primary display screen.
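The combination of touches from the two screens into a single multi-touch operation, as in the examples above, can be sketched as follows. All names are illustrative assumptions, not from the patent.

```python
def combine_touches(sub_touches, main_touches, map_fn):
    """Merge touches from both screens into one list of main-display
    points: each secondary-screen touch is first mapped to its
    corresponding main-display position (a 'secondary-screen pointer'),
    then combined with the real touches on the main display screen.

    sub_touches  -- list of (x, y) points on the secondary display
    main_touches -- list of (x, y) points on the main display
    map_fn       -- maps a secondary-screen point to a main-screen point
    """
    mapped = [map_fn(x, y) for (x, y) in sub_touches]
    # The gesture recognizer then sees one unified set of touch points.
    return mapped + list(main_touches)
```

With one finger on each screen (fig. 4) the merged list holds two points; with two fingers on the secondary screen (fig. 6) it holds three.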
In step 304, an operation corresponding to the second input and the input mapped on the second location is performed.
In some embodiments, a zoom operation on the content displayed on the main display screen can be achieved with one hand through the linkage of the second input and the first input; in this case, the second input is a sliding input and the first input remains stationary on the secondary display screen.
For example, as shown in fig. 5, when the main display screen is displaying an image, the finger in the secondary-screen target operation area stays still while the finger on the main display screen moves away from the secondary-screen pointer in any direction to enlarge the image, or moves toward the pointer in any direction to shrink it.
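One way to turn this motion into a zoom factor is to compare the main-screen finger's distance from the stationary secondary-screen pointer before and after the slide. The patent only states the enlarge/shrink directions; the ratio-of-distances rule below is an assumption, and all names are illustrative.

```python
import math


def zoom_factor(pointer, finger_start, finger_now):
    """Compute a zoom factor from the movement of the main-screen
    finger relative to the stationary secondary-screen pointer:
    moving away from the pointer gives a factor > 1 (enlarge),
    moving toward it gives a factor < 1 (shrink)."""
    d0 = math.dist(pointer, finger_start)
    d1 = math.dist(pointer, finger_now)
    return d1 / d0
```

This mirrors an ordinary two-finger pinch, with the secondary-screen pointer standing in for the second finger.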
In some embodiments, to implement functions such as dragging through the linkage of the fingers on the two screens, and considering that the finger on the secondary display screen is inconvenient to move, a scheme for activating the pointer icon control may be designed. In this case, after the pointer icon control is displayed at the second position on the main display screen, the following steps may be added:
receiving an activation input on the pointer icon control from the user, and in response to the activation input, setting the input at the position of the pointer icon control to an activated state, where an input in the activated state moves in the same direction and by the same distance as the user's input on the main display screen.
For example, when the second input is a sliding input, as shown in fig. 7, the user may activate the input corresponding to the secondary-screen pointer by double-tapping the pointer icon control; when the pointer is in the activated state and the user moves a finger on the main display screen, the secondary-screen pointer moves in the same direction and by the same distance along with that finger, enabling operations in which two touch points move simultaneously, such as dragging.
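The same-direction, same-distance rule for an activated pointer amounts to applying the main-screen finger's movement delta to the pointer's position. A minimal sketch, with illustrative names:

```python
def move_activated_pointer(pointer, finger_prev, finger_now):
    """When the secondary-screen pointer is in the activated state
    (e.g. after the user double-taps it), it follows the main-screen
    finger with the same direction and distance, enabling two-point
    gestures such as dragging."""
    dx = finger_now[0] - finger_prev[0]
    dy = finger_now[1] - finger_prev[1]
    return (pointer[0] + dx, pointer[1] + dy)
```

Calling this on every finger-move event keeps the pointer and the finger a fixed offset apart, so the pair behaves like two fingers moving together.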
In some embodiments, to help the user intuitively know the input state on the secondary screen, after the input at the position of the pointer icon control is set to the activated state, the display manner of the pointer icon control may be switched from the current first display manner to a preset second display manner, where the first display manner represents that the input at the position of the pointer icon control is in the inactive state, and the second display manner represents that it is in the activated state. For example, when activated, the pointer icon control may change in appearance to reflect the state, such as through a color change or a breathing animation.
In the embodiments of the application, when the finger on the secondary display screen leaves the secondary-screen operation area, the combined input of the fingers on the main and secondary display screens ends, and the secondary-screen pointer icon control on the main display screen disappears.
It should be noted that the solutions of the embodiments of the present application are not limited to the simultaneous movement of two fingers; the simultaneous movement of multiple fingers is equally applicable. When multiple fingers touch the secondary display screen, only the input corresponding to an activated pointer icon control moves along with the finger on the main display screen.
As can be seen from the above, in this embodiment the dual-screen device may receive a first input from a user at a first position on the secondary display screen, map the first input to a second position on the main display screen according to the first position and the positional correspondence between the secondary and main display screens, receive a second input from the user on the main display screen, and perform an operation corresponding to the second input and the input mapped to the second position. Compared with the prior art, the embodiments of the application exploit the front/back screen layout of the dual-screen device and the way a single hand grips it: using a touch pass-through technique, the finger touch on the secondary display screen is transmitted to the corresponding position on the main display screen, and through the linkage of the finger on the secondary display screen with the finger on the main display screen, one-handed operation of the dual-screen device is achieved, so that the user can conveniently operate the device while holding it in one hand.
In the device control method provided by the embodiments of the application, the execution subject may be a device control apparatus. In the embodiments below, the device control apparatus executing the device control method is taken as an example to describe the device control apparatus provided in the embodiments of the present application.
Fig. 8 is a block diagram of a device control apparatus according to an embodiment of the present application, which is applied to a dual-screen electronic device, where the dual-screen device includes a main display screen and a secondary display screen. As shown in fig. 8, the device control apparatus 800 may include: a first receiving module 801, a mapping module 802, a second receiving module 803, and a processing module 804, wherein:
a first receiving module 801, configured to receive a first input of a user to a first position of the secondary display screen;
a mapping module 802, configured to map the first input to a second position of the main display according to the first position and the corresponding relationship between the positions of the sub display and the main display;
a second receiving module 803, configured to receive a second input to the main display screen from the user;
a processing module 804 configured to perform an operation corresponding to the second input and the input mapped on the second location.
As can be seen from the above, in this embodiment the dual-screen device may receive a first input from a user at a first position on the secondary display screen, map the first input to a second position on the main display screen according to the first position and the positional correspondence between the secondary and main display screens, receive a second input from the user on the main display screen, and perform an operation corresponding to the second input and the input mapped to the second position. Compared with the prior art, the embodiments of the application exploit the front/back screen layout of the dual-screen device and the way a single hand grips it: the finger touch on the secondary display screen is passed through to the corresponding position on the main display screen, and through the linkage of the finger on the secondary display screen with the finger on the main display screen, one-handed operation of the dual-screen device is achieved, so that the user can conveniently operate the device while holding it in one hand.
Optionally, as an embodiment, the second input is a slide input, and the first input is stationary on the secondary display screen.
Optionally, as an embodiment, the apparatus control device 800 may further include:
and the display module is used for displaying a pointer icon control for representing input at the second position of the main display screen.
Optionally, as an embodiment, the apparatus control device 800 may further include:
a third receiving module, configured to receive an activation input of the pointer icon control by a user;
a setting module, configured to set, in response to the activation input, the input at the position of the pointer icon control to an activated state, where an input in the activated state moves in the same direction and by the same distance as the user's input on the main display screen.
Optionally, as an embodiment, the first position is located in a target operation area of the secondary display screen, where the target operation area is smaller than the secondary display screen, and position feature points in the target operation area correspond to position feature points on the main display screen.
The device control apparatus in the embodiments of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and may also be a server, a Network Attached Storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present application are not specifically limited in this regard.
The device control apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The device control apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiment described in fig. 3, and is not described here again to avoid repetition.
Optionally, as shown in fig. 9, an electronic device 900 is further provided in this embodiment of the present application, and includes a processor 901 and a memory 902, where the memory 902 stores a program or an instruction that can be executed on the processor 901, and when the program or the instruction is executed by the processor 901, the steps of the above device control method embodiment are implemented, and the same technical effects can be achieved, and are not described again here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 10 is a hardware structure diagram of an electronic device implementing various embodiments of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further include a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 1010 through a power management system, so as to manage charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently, which is not described again here.
The user input unit 1007 is configured to receive a first input by a user to a first position of the secondary display screen;
a processor 1010, configured to map the first input to a second position of the main display according to the first position and the position corresponding relationship between the sub display and the main display;
a user input unit 1007, further configured to receive a second input to the main display screen from the user;
a processor 1010 further configured to perform an operation corresponding to the second input and the input mapped at the second location.
As can be seen, in the embodiment of the present application, the dual-screen device may receive a first input from a user at a first position of the secondary display screen, map the first input to a second position of the main display screen according to the first position and the position correspondence between the secondary display screen and the main display screen, receive a second input from the user on the main display screen, and perform an operation corresponding to the second input and the input mapped at the second position. Compared with the prior art, the embodiment of the present application exploits the characteristics of the main and secondary screens of a dual-screen device: by means of a touch pass-through technique, a finger touch on the secondary display screen is transmitted to the corresponding position of the main display screen, and single-handed operation of the dual-screen device is achieved through the linkage of a finger on the secondary display screen with a finger on the main display screen, so that the user can conveniently operate the device while holding it in one hand.
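As a minimal sketch of the pass-through described above, a touch point on the secondary display screen can be mapped proportionally to the main display screen. The function name and the screen resolutions below are illustrative assumptions, not part of the embodiment:

```python
def map_to_primary(x, y, secondary_size, primary_size):
    """Map a touch point (x, y) on the secondary display screen to the
    corresponding point on the main display screen by proportional
    scaling (an illustrative sketch of the position correspondence)."""
    sec_w, sec_h = secondary_size
    pri_w, pri_h = primary_size
    return (x * pri_w / sec_w, y * pri_h / sec_h)

# Assumed resolutions: a touch at the centre of a 1080x1920 secondary
# screen lands at the centre of a 1080x2340 main screen.
second_position = map_to_primary(540, 960, (1080, 1920), (1080, 2340))
print(second_position)  # (540.0, 1170.0)
```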
Optionally, as an embodiment, the second input is a slide input, and the first input is stationary on the secondary display screen.
Optionally, as an embodiment, the display unit 1006 is configured to display a pointer icon control for representing input at the second position of the main display screen.
Optionally, as an embodiment, the user input unit 1007 is further configured to receive an activation input of the pointer icon control by a user;
the processor 1010 is further configured to set an input at the position of the pointer icon control to an active state in response to the activation input, wherein the input at the position of the pointer icon control in the active state moves in the same direction and at the same distance as the user input on the main display screen.
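The activated pointer-icon behaviour, moving in the same direction and by the same distance as the user's finger on the main display screen, can be sketched as delta tracking. The class and method names below are hypothetical:

```python
class PointerIcon:
    """Hypothetical sketch of the pointer icon control: once activated,
    it follows finger movement on the main display screen by the same
    direction and distance (delta tracking)."""

    def __init__(self, x, y):
        self.x, self.y = x, y   # current pointer position (the second position)
        self.active = False
        self._last_touch = None

    def activate(self):
        # Corresponds to the user's activation input on the control.
        self.active = True

    def on_primary_touch(self, x, y):
        # Track the finger on the main display screen; while activated,
        # move the pointer by the same delta as the finger.
        if self.active and self._last_touch is not None:
            self.x += x - self._last_touch[0]
            self.y += y - self._last_touch[1]
        self._last_touch = (x, y)

# The finger slides from (10, 10) to (40, 70); the activated pointer
# moves by the same (+30, +60) offset.
pointer = PointerIcon(100, 200)
pointer.activate()
pointer.on_primary_touch(10, 10)
pointer.on_primary_touch(40, 70)
print(pointer.x, pointer.y)  # 130 260
```

Before activation, touches on the main display screen are handled normally and the pointer stays put, which matches the activation-input step described above.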
Optionally, as an embodiment, the first position is located in a target operation area of the secondary display screen;
the size of the target operation area is smaller than that of the secondary display screen, and the position characteristic points in the target operation area are in corresponding relation with the position characteristic points in the primary display screen.
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area for storing a program or an instruction and a second storage area for storing data, where the first storage area may store an operating system and an application program or instruction required for at least one function (such as a sound playing function or an image playing function). Further, the memory 1009 may include volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchronous link DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor, which primarily handles operations related to the operating system, user interface, applications, etc., and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above device control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above device control method embodiment, and can achieve the same technical effect, and is not described here again to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as a system-on-chip, or a system-on-chip.
The embodiment of the present application further provides a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the above device control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A device control method applied to a dual-screen device including a main display screen and a secondary display screen, the method comprising:
receiving a first input of a user to a first position of the secondary display screen;
mapping the first input to a second position of the main display screen according to the first position and the position corresponding relation between the auxiliary display screen and the main display screen;
receiving a second input of the main display screen from a user;
performing an operation corresponding to the second input and the input mapped on the second location.
2. The method of claim 1, wherein the second input is a slide input and the first input is stationary on the secondary display screen.
3. The method of claim 1, wherein after mapping the first input to the second position of the main display based on the first position and the positional correspondence of the secondary display to the main display, the method further comprises:
displaying, at the second location of the primary display screen, a pointer icon control for characterizing an input.
4. The method of claim 3, wherein after displaying a pointer icon control for characterizing an input at the second location of the primary display screen, the method further comprises:
receiving an activation input of the pointer icon control from a user;
in response to the activation input, setting the input at the pointer icon control position to an activated state, wherein the input at the pointer icon control position in the activated state moves the same distance and in the same direction as the user input on the main display screen.
5. The method according to any one of claims 1 to 4,
the first position is located in a target operation area of the secondary display screen;
the size of the target operation area is smaller than that of the secondary display screen, and the position characteristic points in the target operation area are in corresponding relation with the position characteristic points in the primary display screen.
6. A device control apparatus applied to a dual-screen device including a main display screen and a secondary display screen, the apparatus comprising:
the first receiving module is used for receiving first input of a user to a first position of the secondary display screen;
a mapping module, configured to map the first input to a second position of the main display according to the first position and a position correspondence between the sub display and the main display;
the second receiving module is used for receiving a second input of the user to the main display screen;
a processing module to perform an operation corresponding to the second input and the input mapped on the second location.
7. The apparatus of claim 6, wherein the second input is a slide input and the first input is stationary on the secondary display screen.
8. The apparatus of claim 6, further comprising:
and the display module is used for displaying a pointer icon control for representing input at the second position of the main display screen.
9. The apparatus of claim 8, further comprising:
the third receiving module is used for receiving the activation input of the pointer icon control by the user;
and the setting module is used for responding to the activation input and setting the input at the position of the pointer icon control to be in an activation state, wherein the input at the position of the pointer icon control in the activation state moves in the same direction and at the same distance with the input of the user on the main display screen.
10. The apparatus according to any one of claims 6 to 9,
the first position is located in a target operation area of the secondary display screen;
the size of the target operation area is smaller than that of the auxiliary display screen, and the position characteristic points in the target operation area are in corresponding relation with the position characteristic points in the main display screen.
CN202210455143.6A 2022-04-24 2022-04-24 Equipment control method and device Pending CN114756165A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210455143.6A CN114756165A (en) 2022-04-24 2022-04-24 Equipment control method and device

Publications (1)

Publication Number Publication Date
CN114756165A true CN114756165A (en) 2022-07-15

Family

ID=82332877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210455143.6A Pending CN114756165A (en) 2022-04-24 2022-04-24 Equipment control method and device

Country Status (1)

Country Link
CN (1) CN114756165A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105183236A (en) * 2015-10-19 2015-12-23 上海交通大学 Touch screen input device and method
CN109343788A (en) * 2018-09-30 2019-02-15 维沃移动通信有限公司 A kind of method of controlling operation thereof and mobile terminal of mobile terminal
CN109828710A (en) * 2019-01-21 2019-05-31 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN111831196A (en) * 2019-04-17 2020-10-27 成都鼎桥通信技术有限公司 Control method of folding screen, terminal device and storage medium
CN112578981A (en) * 2019-09-29 2021-03-30 华为技术有限公司 Control method of electronic equipment with flexible screen and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination