CN111367483A - Interaction control method and electronic equipment - Google Patents
- Publication number
- CN111367483A (application CN202010142421.3A)
- Authority
- CN
- China
- Prior art keywords
- input
- display screen
- interface
- target area
- sliding track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the invention provide an interaction control method and an electronic device, and relate to the field of communications technologies. The method is applied to an electronic device and includes: receiving a first input from a user on a second display screen while a first display screen of the electronic device is in a display state; according to the sliding track of the first input on the second display screen, and when the distance between the current position of the first input and a target area meets a preset condition, displaying a first interface on the first display screen, the first interface including the sliding track and the target area; and, when the end point of the sliding track falls within the target area, executing the operation corresponding to the first input in response to that input. The scheme of the embodiments addresses the problem that electronic devices are prone to accidental-touch triggering during gesture operation.
Description
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an interaction control method and an electronic device.
Background
With the development of communications technology, electronic devices now support taking photos, watching videos, playing games, and more, in addition to basic communication. With these increasingly diverse functions, they have become an important part of people's lives.
At present, many shortcut operations are provided to make electronic devices easier to use. For example, during a game, a shortcut action can be completed through a preset gesture: sliding up opens the game settings and sliding down closes them.
However, when the user holds the device with both hands, fingers may unintentionally touch the screen, and these motions are easily recognized as preset gestures, causing false triggering.
Disclosure of Invention
The embodiments of the invention provide an interaction control method and an electronic device, aiming to solve the technical problem that gesture operations on an electronic device are prone to false triggering.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an interaction control method applied to an electronic device, including:
receiving a first input of a user on a second display screen under the condition that the first display screen of the electronic equipment is in a display state;
according to the sliding track of the first input on the second display screen, under the condition that the distance between the current position of the first input and a target area meets a preset condition, displaying a first interface on the first display screen, wherein the first interface comprises the sliding track and the target area;
and in the case that the end point of the sliding track is in the target area, responding to the first input, and executing the operation corresponding to the first input.
In a second aspect, an embodiment of the present invention further provides an electronic device, including:
the first receiving module is used for receiving a first input of a user on the second display screen under the condition that the first display screen of the electronic equipment is in a display state;
the first processing module is used for displaying a first interface on the first display screen according to the sliding track of the first input on the second display screen under the condition that the distance between the current position of the first input and a target area meets a preset condition, wherein the first interface comprises the sliding track and the target area;
and the second processing module is used for responding to the first input and executing the operation corresponding to the first input under the condition that the end point of the sliding track is in the target area.
In a third aspect, an embodiment of the present invention further provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the interaction control method described above.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements the steps of the interaction control method described above.
In this way, in the embodiments of the present invention, when the first display screen is in the display state, the electronic device receives the first input of the user on the second display screen. Based on the sliding track of the first input, and when the distance between the current position of the first input and the target area meets the preset condition, the first interface is displayed on the first display screen to guide the user to complete the input. Then, when the end point of the sliding track falls within the target area, the first input is ruled out as a false touch, and the corresponding operation is executed in response to the first input. An input that stops early without reaching the target area is recognized as a false touch and left unprocessed, which reduces both the false-trigger rate and the power consumption of the electronic device.
Drawings
FIG. 1 is a schematic flow chart of a method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an application of the method according to the embodiment of the present invention;
FIG. 3 is a second schematic diagram illustrating an application of the method according to the embodiment of the present invention;
FIG. 4 is a third schematic diagram illustrating an application of the method according to the embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
fig. 6 is a schematic structural diagram of an electronic device according to another embodiment of the invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1, an interaction control method according to an embodiment of the present invention is applied to an electronic device, and includes:
Step 101, receiving a first input of a user on a second display screen under the condition that a first display screen of the electronic device is in a display state.
In this step, while the first display screen of the electronic device is in the display state, receiving the first input of the user on the second display screen indicates that the user may have a further requirement, so the next step is executed.
Step 102, according to the sliding track of the first input on the second display screen, displaying a first interface on the first display screen under the condition that the distance between the current position of the first input and the target area meets a preset condition, wherein the first interface comprises the sliding track and the target area.
Here, the first interface is an interface that guides the user to complete the slide input, so as to rule out the first input being a false touch. In this step, for the first input received in step 101, the distance between the current position of the first input and the target area is checked against the preset condition as the sliding track progresses; when the distance meets the preset condition, the first input is preliminarily judged to be a non-false-touch input, and the first interface is displayed.
The sliding track on the first interface is the real-time sliding track of the first input from its start point to its current position. The target area is a trigger area: when the sliding track of the first input stops within the target area, a false touch by the user can be ruled out, and the processing corresponding to the first input is triggered.
Step 103, responding to the first input and executing the operation corresponding to the first input under the condition that the end point of the sliding track is in the target area.
In this step, when the end point of the sliding track is within the target area, a false touch by the user is ruled out, and the corresponding operation is performed in response to the first input.
Therefore, according to steps 101 to 103, in the method of the embodiment of the present invention, when the first display screen is in the display state, the electronic device receives the first input of the user on the second display screen. Based on the sliding track of the first input, and when the distance between the current position of the first input and the target area meets the preset condition, the first interface is displayed on the first display screen to guide the user to complete the input. Then, when the end point of the sliding track falls within the target area, the first input is ruled out as a false touch, and the corresponding operation is executed in response to it. An input that stops early without reaching the target area is recognized as a false touch and left unprocessed, reducing the false-trigger rate and the power consumption of the electronic device.
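The flow of steps 101 to 103 can be sketched as a small state machine. This is a minimal illustration only, not the patent's implementation: the event names and states are assumptions, and a real implementation would carry touch coordinates and hit-test them against the target area.

```python
from enum import Enum, auto

class GestureState(Enum):
    IDLE = auto()       # no input being tracked
    TRACKING = auto()   # first input received, trajectory being recorded (step 101)
    GUIDING = auto()    # preset condition met, first interface shown (step 102)
    TRIGGERED = auto()  # end point landed in the target area (step 103)

def step(state, event):
    """Advance the gesture state machine; unknown events leave the state unchanged."""
    transitions = {
        (GestureState.IDLE, "touch_down"): GestureState.TRACKING,
        (GestureState.TRACKING, "condition_met"): GestureState.GUIDING,
        # an input that ends early is treated as a false touch and ignored
        (GestureState.TRACKING, "touch_up"): GestureState.IDLE,
        (GestureState.GUIDING, "end_in_target"): GestureState.TRIGGERED,
        (GestureState.GUIDING, "end_outside_target"): GestureState.IDLE,
    }
    return transitions.get((state, event), state)
```

Note how an early `touch_up` during tracking simply returns to `IDLE` without triggering anything, which is exactly the false-touch filtering the method claims.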
Here, the operation corresponding to the first input may be, for example, displaying a related interface or displaying a floating function key.
The preset condition may be that the distance is equal to a first threshold. Preferably, taking the start position of the input into account, the preset condition is that the distance L1 from the current position of the first input to the target area is less than or equal to half of the distance L2 from the start position to the target area.
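The preferred condition L1 ≤ L2/2 can be checked in a few lines. This is a sketch under simplifying assumptions (the target area is reduced to its center point; function and parameter names are illustrative, not from the patent):

```python
import math

def meets_preset_condition(current_pos, start_pos, target_center):
    """Return True when the distance L1 from the current touch position to the
    target area is at most half the distance L2 from the input's start position
    to the target area. Positions are (x, y) tuples in screen coordinates."""
    l1 = math.dist(current_pos, target_center)
    l2 = math.dist(start_pos, target_center)
    return l1 <= l2 / 2
```

For example, with a target centered at the origin and an input starting 100 px away, the first interface would appear once the finger comes within 50 px of the target.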
The method in this embodiment can be applied to a shortcut-function triggering scenario with a preset gesture. Suppose the preset gesture is sliding up, and the corresponding shortcut function is returning to the previous interface layer. The electronic device receives the slide as a first input and monitors its sliding track. If the slide ends without meeting the preset condition, it can be treated as a false touch and receives no further response. If the distance between the current sliding position and the target area equals half of the distance between the initial sliding position and the target area, the first interface is displayed on the first display screen to guide the user to complete the preset gesture. Thereafter, as shown in fig. 2, when the sliding track 203 of the first input on the first interface 202 of the first display screen 201 stops in the target area 204, the interface returns to the previous layer in response to the first input. Of course, the preset gesture and its shortcut function may be set by the system or customized by the user, for example, sliding down to open the currently displayed application program.
Optionally, the first display screen and the second display screen are the same display screen. In this case, the method of this embodiment can be applied to interaction control on a display screen that is currently in the display state, preventing related processing from being triggered by a false touch on that screen.
Optionally, the second display screen is a display screen of the electronic device other than the first display screen, and the second display screen is in a screen-off state. In this case, the method of this embodiment can be applied to on-screen interaction control of an electronic device with at least two display screens, preventing related operations from being triggered by a false touch on a display screen that is currently off.
It should be understood that, in this embodiment, the first interface is displayed on the first display screen, which is in the display state. Considering that the first interface may obscure the original display content on the first display screen, the first interface is preferably implemented as a semi-transparent mask.
Optionally, the first interface further comprises: and prompt information displayed in the target area.
In this way, the prompt information guides the user more clearly, for example a text prompt "move to this area to open settings" displayed near the circular target area.
In this embodiment, the first interface may be closed when the operating body (e.g., the user's finger) of the first input leaves the second display screen, or closed automatically once the operation responding to the first input is executed, to avoid the first interface interfering afterwards. Any interface subsequently displayed on the first display screen in response to the first input is preferably presented as a small, semi-transparent window, so that the user can still view the original display content of the first display screen in real time.
Further, optionally, in this embodiment, step 103 includes:
and if the first input is continuously pressed at the end point of the sliding track, displaying a second interface on the first display screen, wherein the second interface comprises candidate objects.
Here, when the end point of the sliding track of the first input is within the target area, the first input is further required to remain pressed at the end point of the sliding track; this triggers the display of a second interface containing candidate objects on the first display screen, so that the user can make a further selection among them.
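The two-part gate described here (end point inside the target area, plus a sustained press) can be sketched as a single predicate. The hold threshold of 500 ms is an assumed value, not specified by the patent:

```python
def should_show_second_interface(end_in_target: bool,
                                 press_duration_ms: float,
                                 hold_threshold_ms: float = 500.0) -> bool:
    """Show the second interface (candidate objects) only when the sliding
    track ended inside the target area AND the press is still held at the
    end point for at least the hold threshold."""
    return end_in_target and press_duration_ms >= hold_threshold_ms
```

An input that reaches the target but is released immediately therefore still does not open the second interface, adding another layer of false-touch protection.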
Optionally, after the displaying the second interface on the first display screen, the method further includes:
receiving a second input of the user while the second interface is displayed;
in response to the second input, a target object is selected among the candidate objects.
In this way, in the case of displaying the second interface, by receiving a second input of the user, the selection of the target object is completed in response to the second input.
After a candidate object is selected, it is highlighted in a preset manner.
Therefore, guided by the highlighting, the user can change and adjust the selected candidate object until the required target object is selected.
Preferably, the continuous press of the first input at the end point of the sliding track is maintained until the user completes the selection of the target object through the second input, to avoid misoperation during the selection.
The method of the embodiment of the present invention can be applied to a screen-off interaction control scenario of a multi-screen electronic device. Suppose the preset gesture is sliding up, and the corresponding shortcut function is displaying, on the first display screen (which is in the display state), the setting interface of the currently displayed application program. As shown in figs. 2-4, a user opens an application A on the first display screen of the electronic device to play a game, and slides up 302 on the second display screen 301. The electronic device receives the slide as a first input and monitors its sliding track. If the slide ends without meeting the preset condition, it can be treated as a false touch and receives no further response. If the distance between the current sliding position and the target area equals half of the distance between the initial sliding position and the target area, the first interface is displayed on the first display screen to guide the user to complete the preset gesture. Thereafter, on the first interface 202 of the first display screen 201, the end point of the sliding track 203 of the first input falls within the target area 204; while the user keeps pressing at that end point, the setting interface 401 of the game (i.e., the second interface) is displayed on the first display screen 201 in response to the first input. Then, for a second input of the user on the second display screen, such as the up-down slide 303, the brightness adjustment items 402 of the setting interface 401, as candidate objects, are highlighted in turn as the second input proceeds.
When the up-down slide 303 stops and the sliding operating body leaves the second display screen, the currently highlighted target brightness 403 is treated as selected, and the brightness adjustment corresponding to the target brightness 403 is executed. After the brightness adjustment, or when the press at the end point of the sliding track of the first input is released, the setting interface 401 is exited.
During the selection among the brightness adjustment items 402, each item is highlighted in turn along with the second input, and the second interface applies the brightness adjustment in real time, so that the user can see the effect of each brightness adjustment item 402 and complete the required adjustment.
Of course, in this embodiment, the implementation manners of the first input and the second input are not limited to the above, and are not listed here.
In summary, in the method of the embodiment of the present invention, when the first display screen is in the display state, the electronic device receives the first input of the user on the second display screen. Based on the sliding track of the first input, and when the distance between the current position of the first input and the target area meets the preset condition, the first interface is displayed on the first display screen to guide the user to complete the input. Then, when the end point of the sliding track falls within the target area, the first input is ruled out as a false touch, and the corresponding operation is executed in response to it. An input that stops early without reaching the target area is recognized as a false touch and left unprocessed, reducing the false-trigger rate and the power consumption of the electronic device.
FIG. 5 is a block diagram of an electronic device of one embodiment of the invention. The electronic device 500 shown in fig. 5 includes a first receiving module 510, a first processing module 520, and a second processing module 530.
A first receiving module 510, configured to receive a first input of a user on a second display screen of the electronic device when the first display screen is in a display state;
a first processing module 520, configured to display a first interface on the first display screen according to a sliding track of the first input on the second display screen, where a distance between a current position of the first input and a target area meets a preset condition, where the first interface includes the sliding track and the target area;
a second processing module 530, configured to, in response to the first input, execute an operation corresponding to the first input if the end point of the sliding trajectory is within the target area.
Optionally, the first interface further comprises: and prompt information displayed in the target area.
Optionally, the second processing module 530 is further configured to:
and if the first input is continuously pressed at the end point of the sliding track, displaying a second interface on the first display screen, wherein the second interface comprises candidate objects.
Optionally, the electronic device further comprises:
the second receiving module is used for receiving a second input of the user under the condition of displaying the second interface;
and the third processing module is used for responding to the second input and selecting a target object from the candidate objects.
Optionally, after the candidate object is selected, it is highlighted in a preset manner.
Optionally, the first display screen and the second display screen are the same display screen; or,
the second display screen is a display screen of the electronic device other than the first display screen, and the second display screen is in a screen-off state.
The electronic device 500 can implement each process implemented by the electronic device in the method embodiments of figs. 1 to 4; details are not repeated here. According to the embodiment of the invention, when the first display screen is in the display state, the electronic device receives the first input of the user on the second display screen; based on the sliding track of the first input, and when the distance between the current position of the first input and the target area meets the preset condition, it displays the first interface on the first display screen to guide the user to complete the input. Then, when the end point of the sliding track falls within the target area, the first input is ruled out as a false touch, and the corresponding operation is executed in response to it. An input that stops early without reaching the target area can be recognized as a false touch and left unprocessed, reducing the false-trigger rate and the power consumption of the electronic device.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device 600 for implementing various embodiments of the present invention, where the electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 6 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 610 is configured to receive a first input of a user on a second display screen of the electronic device when the first display screen is in a display state;
according to the sliding track of the first input on the second display screen, under the condition that the distance between the current position of the first input and a target area meets a preset condition, displaying a first interface on the first display screen, wherein the first interface comprises the sliding track and the target area;
and in the case that the end point of the sliding track is in the target area, responding to the first input, and executing the operation corresponding to the first input.
Therefore, when the first display screen is in the display state, the electronic device receives a first input of the user on the second display screen and, when the distance between the current position of the first input and the target area meets the preset condition, displays a first interface on the first display screen based on the sliding track of the first input, guiding the user to finish the input. Then, when the end point of the sliding track falls within the target area, the first input is ruled out as a false touch, and the corresponding operation is executed in response to it; an input that stops early without reaching the target area can be recognized as a false touch and left unprocessed, reducing resource consumption.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during messaging or a call; specifically, it receives downlink data from a base station and forwards it to the processor 610 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 602, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602, or stored in the memory 609, into an audio signal and output it as sound. The audio output unit 603 may also provide audio output related to a specific function performed by the electronic device 600 (e.g., a call signal reception sound or a message reception sound). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042. The graphics processor 6041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 606, stored in the memory 609 (or another storage medium), or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data. In the telephone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 601.
The electronic device 600 also includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 6061 according to the brightness of the ambient light, and a proximity sensor, which can turn off the display panel 6061 and/or the backlight when the electronic device 600 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when the device is stationary; it can be used to identify the posture of the electronic device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-related recognition functions (such as a pedometer or tap detection). The sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
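The posture identification mentioned above can be illustrated with a small sketch that classifies device orientation from a stationary three-axis accelerometer reading. The axis convention (x right, y up, z out of the screen) and the function name are assumptions for illustration, not details taken from the disclosure.

```python
def detect_orientation(ax, ay, az):
    """Classify device posture from a three-axis accelerometer reading.

    When the device is stationary, the axis with the largest absolute
    reading carries gravity (~9.8 m/s^2); that axis indicates the posture.
    Assumed convention: x points right, y up, z out of the screen.
    """
    if abs(ay) >= abs(ax) and abs(ay) >= abs(az):
        return "portrait"   # gravity mainly along the y axis
    if abs(ax) >= abs(az):
        return "landscape"  # gravity mainly along the x axis
    return "flat"           # gravity mainly along the z axis (screen up/down)
```

A real implementation would additionally low-pass filter the readings and apply hysteresis so that small jitters near an axis boundary do not flip the reported orientation.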
The display unit 606 is used to display information input by the user or information provided to the user. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
The user input unit 607 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, can collect touch operations performed by the user on or near it (e.g., operations performed on or near the touch panel 6071 with a finger, a stylus, or any suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 610, and receives and executes commands from the processor 610. In addition, the touch panel 6071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 6071, the user input unit 607 may include other input devices 6072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail here.
Further, the touch panel 6071 may be overlaid on the display panel 6061. When the touch panel 6071 detects a touch operation on or near it, the operation is transmitted to the processor 610 to determine the type of the touch event; the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although the touch panel 6071 and the display panel 6061 are shown in fig. 6 as two separate components implementing the input and output functions of the electronic device, in some embodiments they may be integrated to implement the input and output functions of the electronic device; this is not limited here.
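The pipeline just described — raw panel signal, controller converting it to touch-point coordinates, processor classifying the event and driving the display — can be sketched as below. The class names, the raw-to-pixel scaling model, and the returned strings are illustrative assumptions only.

```python
from dataclasses import dataclass


@dataclass
class TouchEvent:
    x: float
    y: float
    kind: str  # "down", "move", or "up"


class TouchController:
    """Converts raw panel readings into touch-point coordinates (sketch).

    Assumes the panel reports raw values on a fixed grid that is linearly
    scaled to the display's pixel resolution.
    """

    def __init__(self, raw_width, raw_height, px_width, px_height):
        self.sx = px_width / raw_width
        self.sy = px_height / raw_height

    def to_coordinates(self, raw_x, raw_y, kind):
        return TouchEvent(raw_x * self.sx, raw_y * self.sy, kind)


class Processor:
    """Determines the touch-event type and drives the display accordingly (sketch)."""

    def handle(self, event):
        if event.kind == "up":
            return f"render tap feedback at ({event.x:.0f}, {event.y:.0f})"
        return "update track"
```

In a real device this loop runs in firmware and the OS input stack; the sketch only shows the division of labor between the controller (coordinate conversion) and the processor (event classification and visual output).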
The interface unit 608 is an interface for connecting an external device to the electronic device 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information or power) from an external device and transmit the received input to one or more elements within the electronic device 600, or to transmit data between the electronic device 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area. The program storage area may store the operating system and application programs required by at least one function (such as a sound playing function or an image playing function); the data storage area may store data created according to the use of the electronic device (such as audio data or a phonebook). Furthermore, the memory 609 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 610 is the control center of the electronic device. It connects the various parts of the whole electronic device by using various interfaces and lines, and it performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 609 and invoking the data stored in the memory 609, thereby monitoring the electronic device as a whole. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may also not be integrated into the processor 610.
The electronic device 600 may further include a power supply 611 (e.g., a battery) for supplying power to the various components, and preferably, the power supply 611 may be logically connected to the processor 610 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
In addition, the electronic device 600 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the above embodiment of the interaction control method and can achieve the same technical effect; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above embodiment of the interaction control method and can achieve the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (10)
1. An interaction control method applied to an electronic device is characterized by comprising the following steps:
receiving a first input of a user on a second display screen under the condition that the first display screen of the electronic equipment is in a display state;
according to the sliding track of the first input on the second display screen, under the condition that the distance between the current position of the first input and a target area meets a preset condition, displaying a first interface on the first display screen, wherein the first interface comprises the sliding track and the target area;
and in the case that the end point of the sliding track is in the target area, responding to the first input, and executing the operation corresponding to the first input.
2. The method of claim 1, wherein the first interface further comprises: and prompt information displayed in the target area.
3. The method of claim 1, wherein the responding to the first input and executing the operation corresponding to the first input comprises:
and if the first input is continuously pressed at the end point of the sliding track, displaying a second interface on the first display screen, wherein the second interface comprises candidate objects.
4. The method of claim 3, further comprising, after displaying the second interface on the first display screen:
receiving a second input of the user while the second interface is displayed;
in response to the second input, a target object is selected among the candidate objects.
5. The method of claim 1, wherein the first display screen and the second display screen are the same display screen; or,
the second display screen is a display screen of the electronic equipment except the first display screen, and the second display screen is in a screen-off state.
6. An electronic device, comprising:
the first receiving module is used for receiving a first input of a user on the second display screen under the condition that the first display screen of the electronic equipment is in a display state;
the first processing module is used for displaying a first interface on the first display screen according to the sliding track of the first input on the second display screen under the condition that the distance between the current position of the first input and a target area meets a preset condition, wherein the first interface comprises the sliding track and the target area;
and the second processing module is used for responding to the first input and executing the operation corresponding to the first input under the condition that the end point of the sliding track is in the target area.
7. The electronic device of claim 6, wherein the first interface further comprises: and prompt information displayed in the target area.
8. The electronic device of claim 6, wherein the second processing module is further configured to:
and if the first input is continuously pressed at the end point of the sliding track, displaying a second interface on the first display screen, wherein the second interface comprises candidate objects.
9. The electronic device of claim 8, further comprising:
the second receiving module is used for receiving a second input of the user under the condition of displaying the second interface;
and the third processing module is used for responding to the second input and selecting a target object from the candidate objects.
10. The electronic device of claim 6, wherein the first display screen and the second display screen are the same display screen; or,
the second display screen is a display screen of the electronic equipment except the first display screen, and the second display screen is in a screen-off state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010142421.3A CN111367483A (en) | 2020-03-04 | 2020-03-04 | Interaction control method and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111367483A true CN111367483A (en) | 2020-07-03 |
Family
ID=71206546
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010142421.3A Pending CN111367483A (en) | 2020-03-04 | 2020-03-04 | Interaction control method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111367483A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112925598A (en) * | 2021-03-17 | 2021-06-08 | 联想(北京)有限公司 | Application program starting method and starting device and electronic equipment |
CN113239212A (en) * | 2021-05-12 | 2021-08-10 | 维沃移动通信有限公司 | Information processing method and device and electronic equipment |
CN113703632A (en) * | 2021-08-31 | 2021-11-26 | 维沃移动通信有限公司 | Interface display method and device and electronic equipment |
WO2023078094A1 (en) * | 2021-11-03 | 2023-05-11 | 华为技术有限公司 | Electronic device, interaction method thereof, and readable medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014135019A (en) * | 2013-01-11 | 2014-07-24 | Konica Minolta Inc | Handwriting input device and control program |
CN105045481A (en) * | 2015-06-26 | 2015-11-11 | 深圳市金立通信设备有限公司 | Operation method and terminal |
CN105159575A (en) * | 2015-08-17 | 2015-12-16 | 美国掌赢信息科技有限公司 | Call initiating method and electronic equipment |
CN105260061A (en) * | 2015-11-20 | 2016-01-20 | 深圳市奇客布达科技有限公司 | Back face touch device and back face touch gesture of handheld electronic equipment |
CN108153466A (en) * | 2017-11-28 | 2018-06-12 | 北京珠穆朗玛移动通信有限公司 | Operating method, mobile terminal and storage medium based on double screen |
CN110531914A (en) * | 2019-08-28 | 2019-12-03 | 维沃移动通信有限公司 | A kind of arranging photo album method and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109375890B (en) | Screen display method and multi-screen electronic equipment | |
CN109388304B (en) | Screen capturing method and terminal equipment | |
CN111092990B (en) | Application program sharing method, electronic device and storage medium | |
CN109032447B (en) | Icon processing method and mobile terminal | |
CN109800045B (en) | Display method and terminal | |
CN110830363B (en) | Information sharing method and electronic equipment | |
CN111367483A (en) | Interaction control method and electronic equipment | |
CN107728923B (en) | Operation processing method and mobile terminal | |
CN110221795B (en) | Screen recording method and terminal | |
CN108958593B (en) | Method for determining communication object and mobile terminal | |
CN108334272B (en) | Control method and mobile terminal | |
CN109683768B (en) | Application operation method and mobile terminal | |
CN108446156B (en) | Application program control method and terminal | |
CN110825295B (en) | Application program control method and electronic equipment | |
CN110096203B (en) | Screenshot method and mobile terminal | |
CN109847348B (en) | Operation interface control method, mobile terminal and storage medium | |
CN110321046A (en) | A kind of content selecting method and terminal | |
CN108509131B (en) | Application program starting method and terminal | |
CN111475066B (en) | Background switching method of application program and electronic equipment | |
CN111143011B (en) | Display method and electronic equipment | |
CN111026321A (en) | Touch control method and electronic equipment | |
CN111158548A (en) | Screen folding method and electronic equipment | |
CN110769303A (en) | Playing control method and device and mobile terminal | |
CN109189514B (en) | Terminal device control method and terminal device | |
CN110780751A (en) | Information processing method and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20200703 |