CN113157180B - Touch operation method and device for application and electronic equipment - Google Patents


Info

Publication number
CN113157180B
Authority
CN
China
Prior art keywords: target, input, operation surface, dimensional control, user
Prior art date
Legal status (the legal status is an assumption and is not a legal conclusion): Active
Application number
CN202110335730.7A
Other languages
Chinese (zh)
Other versions
CN113157180A (en)
Inventor
银道峰
Current Assignee (the listed assignee may be inaccurate): Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (the priority date is an assumption and is not a legal conclusion)
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202110335730.7A
Publication of CN113157180A
Application granted
Publication of CN113157180B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The application discloses a touch operation method and device for an application, and an electronic device, belonging to the field of communication technology. It addresses the problem that an electronic device can satisfy a user's request only after receiving multiple operation steps from the user, which makes operation cumbersome and prolongs the time the user spends operating the device. The method comprises the following steps: the electronic device receives a first input from a user on a target three-dimensional control, where the target three-dimensional control comprises N operation surfaces and different operation surfaces indicate different applications; in response to the first input, the electronic device feeds back first prompt information and controls the target three-dimensional control to rotate to a target operation surface. The target operation surface is the focus operation surface of the rotated target three-dimensional control, the focus operation surface is the operation surface with the largest operation area, and the first prompt information indicates the application corresponding to the target operation surface.

Description

Touch operation method and device for application and electronic equipment
Technical Field
The application belongs to the field of communication technology, and in particular relates to a touch operation method and device for an application, and an electronic device.
Background
With the development of electronic technology, electronic devices (e.g., mobile phones, tablet computers) offer more and more functions and, accordingly, contain more and more applications.
In the related art, applications in an electronic device are generally displayed as application icons on the device's display screen. Taking a mobile phone as an example, suppose a user wants to view a certain application whose home-screen page is 5 pages away from the currently displayed page; the user must swipe through 5 pages before reaching the page that contains the application.
As the above process shows, when a user searches for an application, the electronic device must receive multiple operation steps from the user before the user's request is satisfied. The operation steps are therefore cumbersome, and the time the user spends operating the electronic device is prolonged.
Disclosure of Invention
Embodiments of the present application aim to provide a touch operation method and device for an application, and an electronic device, which can solve the problem that an electronic device satisfies a user's request only after receiving multiple operation steps from the user, making the operation steps cumbersome and prolonging the time the user spends operating the device.
In order to solve the above technical problem, the present application is realized as follows:
In a first aspect, an embodiment of the present application provides a touch operation method for an application. The method includes: an electronic device receives a first input from a user on a target three-dimensional control, where the target three-dimensional control comprises N operation surfaces and different operation surfaces indicate different applications; in response to the first input, the electronic device feeds back first prompt information and controls the target three-dimensional control to rotate to a target operation surface. The target operation surface is the focus operation surface of the rotated target three-dimensional control, the focus operation surface is the operation surface with the largest operation area, and the first prompt information indicates the application corresponding to the target operation surface.
In a second aspect, an embodiment of the present application provides a touch operation device for an application. The device includes a receiving module and an executing module. The receiving module is configured to receive a first input from a user on a target three-dimensional control, where the target three-dimensional control comprises N operation surfaces and different operation surfaces indicate different applications. The executing module is configured to, in response to the first input received by the receiving module, feed back first prompt information and control the target three-dimensional control to rotate to a target operation surface. The target operation surface is the focus operation surface of the rotated target three-dimensional control, the focus operation surface is the operation surface with the largest operation area, and the first prompt information indicates the application corresponding to the target operation surface.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor, a memory and a program or instruction stored on the memory and executable on the processor, the program or instruction implementing the steps of the method according to the first aspect when executed by the processor.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In the embodiments of the present application, after receiving a user's input on a target three-dimensional control comprising N operation surfaces (different operation surfaces indicating different applications), the electronic device feeds back to the user prompt information indicating the application corresponding to the target operation surface, and controls the target three-dimensional control to rotate until the target operation surface becomes the operation surface with the largest operation area. The user can therefore quickly locate the target application through the prompt information fed back by the target three-dimensional control and operate that application through the control, which removes the page-turning steps otherwise needed to find an application in the electronic device and saves the user time.
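The two-step flow above (first input, then prompt feedback plus rotation to the focus surface) can be sketched in Python. The class and method names below are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch of the first-aspect flow: a 3D control whose N
# operation surfaces each indicate one application. On a first input the
# control rotates so the target surface becomes the focus surface (the
# surface with the largest operation area) and a prompt is emitted.

class Cube3DControl:
    def __init__(self, surface_apps):
        # surface_apps: list of N application names, one per operation surface
        self.surface_apps = surface_apps
        self.focus_index = 0  # index of the current focus operation surface

    def handle_first_input(self, target_index):
        """Rotate so the target surface becomes the focus, return the prompt."""
        self.focus_index = target_index  # rotation lands this surface in focus
        app = self.surface_apps[target_index]
        return f"prompt: focus surface now indicates {app}"

control = Cube3DControl(["phone", "messages", "music", "mail", "video", "shopping"])
prompt = control.handle_first_input(3)
```

After the call, `prompt` names the application on the new focus surface, matching the role of the first prompt information.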
Drawings
FIG. 1 is a schematic flow chart of a touch operation method for an application provided in an embodiment of the present application;
FIG. 2 is a first schematic diagram of an interface to which a touch operation method for an application is applied, according to an embodiment of the present application;
FIG. 3 is a second schematic diagram of an interface to which a touch operation method for an application is applied, according to an embodiment of the present application;
FIG. 4 is a third schematic diagram of an interface to which a touch operation method for an application is applied, according to an embodiment of the present application;
FIG. 5 is a fourth schematic diagram of an interface to which a touch operation method for an application is applied, according to an embodiment of the present application;
FIG. 6 is a fifth schematic diagram of an interface to which a touch operation method for an application is applied, according to an embodiment of the present application;
FIG. 7 is a sixth schematic diagram of an interface to which a touch operation method for an application is applied, according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a touch operation device for an application according to an embodiment of the present application;
FIG. 9 is a first schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 10 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following describes the embodiments of the present application clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish similar objects, and do not necessarily describe a particular order or sequence. It should be understood that terms so used are interchangeable where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. Objects identified by "first" and "second" are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The term "electrostatic touch feedback technology" appearing in the embodiments of the present application is explained as follows:
electrostatic touch feedback technology:
The principle of electrostatic touch feedback is to emit an imperceptible electrical signal to the user so that an oscillating electrostatic field forms around the skin; when the user touches the display screen, the static electricity modulates the friction between the finger and the screen, giving the user a realistic sense of texture. Electrostatic touch feedback can be widely applied in key fields such as smartphones, tablet computers, and automotive center-console screens, raising the touch experience of human-machine interaction to a new level.
Because electrostatic touch feedback generates tactile sensation using the principle of a bioelectric field, it does not rely on any moving parts of the electronic device, causes no physical change to the screen, and can create a vivid sense of touch without mechanical vibration. Specifically, electrostatic touch feedback can give the user different tactile sensations on the touch screen, such as a rough feel, a smooth feel, or the feel of raised keys.
The touch operation method of the application provided in the embodiment of the application is described in detail below through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
The touch operation method for an application provided in the embodiments of the present application can be applied to scenarios in which a user needs to perform a shortcut operation on a certain application in an electronic device.
Consider such a scenario: while operating the electronic device, the user needs to open application 1, which resides on page C of the home screen; the device is currently displaying page B, the page of application 2, and page C is 5 pages away from page B. The user must first close application 2 to return to page B and then swipe page by page until page C is displayed. In this process the user performs at least 6 operations before the device displays page C containing application 1, after which the user still has to perform an opening input on application 1 to open it. The operation steps are therefore cumbersome, and the time the user spends operating the electronic device is prolonged.
In the embodiments of the present application, after receiving a user's input on a target three-dimensional control comprising N operation surfaces (different operation surfaces indicating different applications), the electronic device feeds back to the user prompt information indicating the application corresponding to the target operation surface, and controls the target three-dimensional control to rotate until the target operation surface becomes the operation surface with the largest operation area. The user can therefore quickly locate the target application through the prompt information fed back by the target three-dimensional control and operate that application through the control, which removes the page-turning steps otherwise needed to find an application in the electronic device and saves the user time.
This embodiment provides a touch operation method for an application. As shown in FIG. 1, the method includes the following steps 301 and 302:
step 301: the touch control operation device of the application receives a first input of a user to a target three-dimensional control.
In this embodiment of the present application, the target three-dimensional control includes N operation surfaces, and different operation surfaces indicate different applications.
In this embodiment of the present application, the target three-dimensional control may be displayed in the display interface, or may not be displayed in the display interface.
In this embodiment of the present application, the display manner of the target three-dimensional control (that is, displayed on the display interface or not displayed on the display interface) may be user-defined or preset for an electronic device, which is not limited in this embodiment of the present application.
In this embodiment of the present application, the control model corresponding to the target three-dimensional control may be preset for the electronic device, or may be user-defined, which is not limited in this embodiment of the present application.
In this embodiment of the present application, the target three-dimensional control is a control that may include N operation surfaces.
The target three-dimensional control may be, for example, a three-dimensional polyhedral control or a three-dimensional circular control.
In one example, when the target three-dimensional control is a three-dimensional polyhedral control, the three-dimensional polyhedral control may be a stereoscopic model composed of a plurality of geometric surfaces. For example, as shown in (a) of fig. 2, the target three-dimensional control may be a tetrahedron or a hexahedron.
Further, each geometric surface of the three-dimensional polyhedral control corresponds to one operation surface, and each operation surface can indicate a different application. For example, taking a hexahedral control as an example, the control includes 6 display surfaces, on which 6 different applications can be displayed.
In one example, when the target three-dimensional control is a three-dimensional circular control, the three-dimensional circular control may include at least one curved geometric surface. For example, as shown in (b) of fig. 2, the three-dimensional control may be a sphere, a cylinder, or a cone.
Further, each operation surface of the three-dimensional circular control described above may be used to indicate a different application.
In this way, the touch operation device of the application can indicate various applications through controls of different shapes, enriching both the number of applications a user can control through the target three-dimensional control and the ways in which the user can perform target operations on those applications.
It can be appreciated that the target three-dimensional control may be used to indicate all applications in the electronic device. For example, when the electronic device includes 20 applications and the target three-dimensional control is a sphere, the sphere's surface can be divided into 20 different arcuate surfaces, each indicating a different application. The control may also be used to indicate only some of the applications in the electronic device. For example, when the target three-dimensional control is a cube, the cube control includes 6 operation surfaces, each indicating a different application, and those 6 applications may be the applications most commonly used by the user.
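A minimal model of the two cases above (all 20 applications on a sphere's arcuate surfaces, or only the 6 most-used applications on a cube) is a plain surface-to-application table. The helper below is an illustrative sketch; its name and its behavior for mismatched counts are assumptions:

```python
def build_surface_map(apps, n_surfaces):
    """Assign at most n_surfaces applications to the control's surfaces.

    If there are more surfaces than apps, this sketch leaves the extra
    surfaces empty (None); if there are more apps than surfaces, only the
    first n_surfaces (e.g. the most commonly used ones) are shown.
    """
    chosen = apps[:n_surfaces]
    return {i: (chosen[i] if i < len(chosen) else None)
            for i in range(n_surfaces)}

# Sphere indicating all 20 installed apps on 20 arcuate surfaces:
all_apps = [f"app{i}" for i in range(1, 21)]
sphere = build_surface_map(all_apps, 20)

# Cube indicating only the 6 most commonly used apps:
cube = build_surface_map(["phone", "messages", "music", "mail", "video", "shopping"], 6)
```

The same helper covers both shapes because only the number of operation surfaces differs.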
For example, the first input may be an input for adjusting which display surface of the target three-dimensional control is shown on the display interface, or an input on an operation surface of the target three-dimensional control that can receive user input.
In this embodiment of the present application, the first input may be a touch input, for example, a click input, a slide input, a voice input, or a specific gesture input, which is not limited in this embodiment of the present application.
Step 302: in response to the first input, the touch operation device of the application feeds back first prompt information and controls the target three-dimensional control to rotate to a target operation surface.
The target operation surface is the focus operation surface of the rotated target three-dimensional control, the focus operation surface is the operation surface with the largest operation area, and the first prompt information indicates the application corresponding to the target operation surface.
In this embodiment of the present application, the target operation surface may be any operation surface of the N operation surfaces.
In this embodiment of the present application, the first prompt information is used to prompt the application and/or function corresponding to the target operation surface.
Optionally, in an embodiment of the present application, the target three-dimensional control includes N operation surfaces, and an application identifier of a corresponding application is displayed on each operation surface.
It can be understood that, because the target three-dimensional control is three-dimensional, one or more operation surfaces are generally in a non-hidden state while the remaining surfaces are hidden. The non-hidden surfaces are the display surfaces a user can operate; that is, both the focus operation surface and the non-focus visible surfaces can receive user operations. Further, to facilitate user operation and ensure its accuracy, the operation surface with the largest operation area, i.e., the focus operation surface, may instead be set as the only surface capable of receiving operations, with the remaining surfaces made inoperable.
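The focus rule above (the surface with the largest operation area is the focus surface, and may optionally be the only operable surface) can be sketched as follows. The function names and the use of projected-area values are illustrative assumptions:

```python
def focus_surface(surface_areas):
    """Return the index of the focus surface: the one with the largest
    currently visible operation area (hidden surfaces have area 0)."""
    return max(range(len(surface_areas)), key=lambda i: surface_areas[i])

def can_operate(surface_areas, index, focus_only=True):
    """In the strict mode described above, only the focus surface accepts
    input; otherwise any non-hidden (area > 0) surface does."""
    if focus_only:
        return index == focus_surface(surface_areas)
    return surface_areas[index] > 0

# Three visible faces of a cube; the front face projects the largest area.
areas = [0.0, 0.0, 0.0, 0.7, 0.2, 0.1]
```

With `focus_only=True` this models the single-operable-surface variant; with `focus_only=False` it models the variant where every visible surface can receive input.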
In this embodiment of the present application, before the applied touch operation device controls the target three-dimensional control to rotate to the target operation surface, the applied touch operation device may display the first operation surface on the target three-dimensional control. Wherein the first operation surface is an operation surface different from the target operation surface.
Further, the first operation surface may indicate a single operation surface or may indicate a plurality of operation surfaces.
It can be understood that, because the target three-dimensional control has a three-dimensional shape, at any given time part of its operation surfaces cannot be displayed on the screen, or cannot receive user input; when such a hidden operation surface is the target operation surface, the electronic device cannot execute the target operation directly. Therefore, the electronic device can receive an input on the target three-dimensional control that rotates the control, adjusting its orientation and position so that the hidden operation surface comes into view; the device can then receive the user's input on the previously hidden target operation surface and execute the target operation.
Example 1: assuming that the target three-dimensional control is a cube magic cube control, as shown in (a) of fig. 3, the cube magic cube control 31 is displayed on the display interface 32, where the cube magic cube control includes 6 operation surfaces (i.e., the N operation surfaces described above), the operation surface 33 is used to indicate a phone application, the operation surface 34 is used to indicate an information application, the operation surface 35 is used to indicate a music application, the operation surface 36 is used to indicate a mail application, the operation surface 37 is used to indicate a video application, and the operation surface 38 is used to indicate a shopping application (where the operation surfaces 37, 38, and 39 are hidden). The cube magic cube control 32 currently displayed on the display interface 31 by the electronic device has 3 operation surfaces, namely an operation surface 35, an operation surface 38 and an operation surface 36 (i.e. the first operation surface), at this time, the electronic device receives a sliding input (i.e. the first input) of the user to the cube control 32, and the cube control 32 rotates, and as shown in (b) of fig. 3, the operation surface 36 (i.e. the target operation surface and the focus display surface) is displayed on the display interface.
It should be noted that the magic-cube-style target three-dimensional control may also be a spherical magic-cube control, as shown in fig. 4.
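The rotation behavior illustrated in Example 1 (a swipe hides one visible surface and brings one hidden surface into view) can be modeled minimally as below; the face labels and the one-face-per-step model are illustrative assumptions:

```python
def rotate(visible, hidden, steps=1):
    """Rotate the control one face per step: the first hidden surface comes
    into view and the oldest visible surface becomes hidden."""
    visible, hidden = list(visible), list(hidden)
    for _ in range(steps):
        hidden.append(visible.pop(0))   # oldest visible face is hidden
        visible.append(hidden.pop(0))   # first hidden face comes into view
    return visible, hidden

# A cube showing faces A, B, C, with D, E, F hidden; one swipe reveals D.
v, h = rotate(["A", "B", "C"], ["D", "E", "F"])
```

Three steps bring all initially hidden faces into view, which is how a user reaches a target operation surface that started out hidden.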
In this way, the touch operation device of the application can receive the user's rotation input and thus flexibly receive operation input on any operation surface of the target three-dimensional control, improving the convenience of using the electronic device.
In the touch operation method for an application provided by this embodiment, after receiving the user's input on a target three-dimensional control comprising N operation surfaces (different operation surfaces indicating different applications), the touch operation device of the application feeds back to the user prompt information indicating the application corresponding to the target operation surface, and controls the target three-dimensional control to rotate until the target operation surface becomes the operation surface with the largest operation area. The user can therefore quickly locate the target application through the prompt information fed back by the target three-dimensional control and operate it through the control, which removes the page-turning steps otherwise needed to find an application in the electronic device and saves the user time.
Optionally, in this embodiment of the present application, after step 301 of receiving the first input of the user on the target three-dimensional control, the touch operation method provided in this embodiment further includes the following step A:
Step A: in response to the first input, the touch operation device of the application feeds back at least one piece of first prompt information.
For example, each time the focus operation surface of the target three-dimensional control changes, the first prompt information is fed back once, where the first prompt information includes at least one of the following: voice information and texture haptic information.
It can be understood that, because the touch operation device of the application can feed back texture haptic information or voice information to the user after receiving the first input, the target three-dimensional control need not be permanently visible: the electronic device may display it on the display interface when the user needs to view it, or keep it undisplayed.
In this embodiment of the present application, the form of the first prompt information differs according to the input mode of the first input.
In one example, when the input mode of the first input is touch input (for example, the user touches the target three-dimensional control on the display screen with a finger to complete the first input), the first prompt information may be texture haptic information; this lets the user perceive that, after the first input, the electronic device performed the target operation corresponding to that input.
In another example, when the input mode of the first input is voice input (for example, the user instructs the electronic device by voice to complete the first input) or a special gesture input (for example, the user completes the first input by waving a hand over the display screen), the first prompt information may be voice information; this informs the user that, after the first input, the electronic device performed the target operation corresponding to that input.
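The modality choice in the two examples above (touch input yields texture haptic feedback; voice or gesture input yields voice feedback) reduces to a simple dispatch. This sketch and its mode names are illustrative:

```python
def first_prompt(input_mode, app_name):
    """Pick the feedback modality for the first prompt based on how the
    first input was made, per the two examples above."""
    if input_mode == "touch":
        # Touch input: respond in kind with a texture haptic cue.
        return ("texture_haptic", f"texture cue for {app_name}")
    if input_mode in ("voice", "gesture"):
        # Hands-off input: respond with a spoken cue instead.
        return ("voice", f"spoken cue: focus surface is {app_name}")
    raise ValueError(f"unknown input mode: {input_mode}")
```

A usage example: `first_prompt("touch", "music")` returns a haptic cue, while `first_prompt("gesture", "mail")` returns a voice cue, matching the modality pairing described above.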
For example, the texture haptic information corresponding to different operation surfaces may differ in the vibration pattern felt by the user. For example, when the target operation control contains 5 applications, as shown in (a) of fig. 5, the texture haptic information of application 1 may be a rough touch; as shown in (b) of fig. 5, that of application 2 may be a smooth touch; and as shown in (c) of fig. 5, that of application 3 may be a rough touch.
For example, the texture haptic information corresponding to different operation surfaces may also differ in the vibration area corresponding to the application identifier. The electronic device can place the texture haptic information in the area of the operation surface where the application identifier is located; because different applications have different identifiers, the corresponding texture haptic information differs as well. For example, when the target operation control includes 3 applications: application 1 is a phone application, and the texture haptic information of its operation surface may correspond to the area where the phone identifier is located; application 2 is an information application, with texture haptic information corresponding to the area of the information identifier; and application 3 is a music application, with texture haptic information corresponding to the area of the music identifier.
The first prompt information is used to prompt the user that the current focus operation surface has changed.
Example 2: in connection with example 1 above, after the electronic device receives the user's sliding input (i.e., the second input) on the cube control 32, the cube control 32 rotates, and after two short vibrations, the operation surface 33 (i.e., the target operation surface and the focus display surface) is displayed on the display interface.
Therefore, after the focus display surface of the target three-dimensional control changes, the electronic device feeds back prompt information to the user, so that the user can accurately determine which target operation surface of the target three-dimensional control needs to be operated without looking at the electronic device.
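The behaviour described above — one prompt event per focus-surface change during rotation — can be sketched as follows; the class, event labels, and surface names are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch: rotating the control steps through its operation
# surfaces, and each focus-surface change emits exactly one prompt
# event (e.g. a short vibration), as in Example 2's two vibrations.
class CubeControl:
    def __init__(self, surfaces):
        self.surfaces = surfaces   # application names, one per surface
        self.focus = 0             # index of the current focus surface
        self.prompts = []          # recorded feedback events

    def rotate(self, steps: int) -> str:
        """Rotate by `steps` surfaces, prompting once per change."""
        for _ in range(steps):
            self.focus = (self.focus + 1) % len(self.surfaces)
            self.prompts.append(("short-vibration", self.surfaces[self.focus]))
        return self.surfaces[self.focus]

cube = CubeControl(["phone", "message", "music"])
target = cube.rotate(2)   # two focus changes -> two short vibrations
```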
Optionally, in the embodiment of the present application, before the step 301, the touch operation method of the application provided in the embodiment of the present application further includes the following step C1 and step C2:
step C1: and the touch control operation device of the application receives a second input of the user to the target operation surface.
Step C2: and responding to the second input, executing the target operation of the application corresponding to the target operation surface by the touch control operation device of the application, and feeding back second prompt information.
The second prompting information is used for prompting the function corresponding to the target operation.
In this embodiment of the present application, the second input is used to trigger the electronic device to feed back to the user, after the input is performed on the target operation surface, the prompt information corresponding to the target operation performed on the application indicated by the target operation surface.
In embodiments of the present application, different forms of the second input may instruct the electronic device to perform different target operations. For example, when the second input is a double-click input, the electronic device may open the application indicated by the target operation surface; when the second input is a sliding input on the target operation surface, the electronic device may adjust an application parameter of the application currently displayed on the display screen of the electronic device.
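A minimal sketch of this gesture dispatch, assuming two hypothetical gesture labels; the mapping mirrors the double-click/slide behaviour described above, and the per-app parameter choice follows the later music/video example. Function and parameter names are assumptions for illustration.

```python
# Dispatch the second input by gesture type (illustrative sketch):
#   double-click -> open the app indicated by the target operation surface
#   slide        -> adjust a parameter of the currently displayed app
def handle_second_input(gesture: str, surface_app: str, current_app: str):
    if gesture == "double-click":
        return ("open", surface_app)
    if gesture == "slide":
        # The adjusted parameter depends on the currently displayed app,
        # e.g. song progress for music, video progress for video.
        param = {"music": "volume", "video": "progress"}.get(current_app, "volume")
        return ("adjust", current_app, param)
    raise ValueError(f"unrecognised gesture: {gesture}")
```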
Optionally, in this embodiment of the present application, the second input is a touch input of the user on the target operation surface, the second prompt information is the texture haptic information, and the texture haptic information corresponding to different operation surfaces is different.
In an embodiment of the present application, the above-mentioned target operation includes at least one of the following: starting the application corresponding to the target operation surface, and switching the application currently displayed on the display screen of the electronic device to the application corresponding to the target operation surface.
The application parameters depend on the application currently displayed on the display screen. For example, when the application is a music application, the application parameters may be a volume adjustment parameter and a song progress parameter; when the application is a video application, the application parameters may be a volume adjustment parameter and a video progress parameter.
It should be noted that the specific forms of the first prompt information and the second prompt information are different. For example, when both are texture haptic information, if the first prompt information is a short vibration, the second prompt information may be a long vibration. Through this differentiation of vibration forms, the user can accurately perceive the current rotation of the target three-dimensional control without looking at the electronic device, and can thereby accurately identify the current target operation surface and accurately perform the target operation on it.
Example 3: assuming that the target three-dimensional control is a sphere control, as shown in fig. 6 (a), the sphere control 41 is displayed on the display interface 42, the sphere control includes 3 operation surfaces (i.e., the N operation surfaces) and the operation surface 43 is used for indicating a phone application, the operation surface 44 is used for indicating an information application, and the operation surface 45 is used for indicating a music application, after receiving the click input (i.e., the second input) of the user on the sphere control operation surface 43, the electronic device feeds back texture tactile information (i.e., the second prompt information) of double vibration to the user, and as shown in fig. 6 (b), opens a page 46 (i.e., the target operation) corresponding to the phone application on the current display interface 42.
Therefore, the user can complete the target operation on the application indicated by the target operation surface through the prompt information fed back by the target three-dimensional control, without looking at the control. This reduces the probability of opening the wrong application when the user is not looking at the display screen of the electronic device, and improves the accuracy of performing the target operation on the desired application without viewing the display screen.
Optionally, in the embodiment of the present application, in the case where each operation surface of the target three-dimensional control includes M function adjustment areas, the step of executing the target operation of the application corresponding to the target operation surface in the above step C2 includes the following step D:
step D: and the applied touch control operation device executes the target operation corresponding to the target function adjustment area in the target operation surface by the second input, and feeds back the third prompt information.
For example, the third prompting message is used for prompting the function corresponding to the target function adjustment area, and M is a positive integer.
For example, the third prompt information may be at least one of the following: visual information, texture haptic information, or voice information, which is not limited in the embodiment of the present application.
The form of the third prompt information may be preset by the electronic device or customized by the user, which is not limited in the embodiment of the present application.
For example, the second input may refer to the foregoing description, and will not be described herein.
For example, the function corresponding to each function adjustment area may be preset by the touch operation device of the application, or may be customized by the user.
It can be understood that, on each operation surface of the target three-dimensional control, the touch operation device of the application can divide a plurality of different functional areas in advance. While receiving the user's function settings, the area boundary of each functional area may be displayed so that the user knows its touch range; after the functions are set, the area boundaries may or may not be displayed while the touch operation device of the application receives the user's inputs on the different functional areas of any operation surface of the target three-dimensional control.
For example, the target operation may be used to adjust an application parameter of an application currently displayed in a display screen of the electronic device.
In one example, the application parameter may be volume, brightness, etc.
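One plausible way to realize M function adjustment areas is a simple hit-test of the touch point against per-area bounds in surface-local coordinates; the area layout, names, and the `area_at` helper below are assumptions for illustration, not the patent's method.

```python
# Illustrative sketch of step D: an operation surface divided into M
# function adjustment areas, each defined by (name, x0, y0, x1, y1)
# bounds in normalized surface-local coordinates.
AREAS = [
    ("volume-up",   0.0, 0.0, 1.0, 0.5),   # top half of the surface
    ("volume-down", 0.0, 0.5, 1.0, 1.0),   # bottom half of the surface
]

def area_at(x: float, y: float):
    """Return the function adjustment area containing (x, y), if any."""
    for name, x0, y0, x1, y1 in AREAS:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None   # touch fell outside every adjustment area
```

The area found would then select the target operation (e.g. raising or lowering the volume) and the third prompt information to feed back.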
Example 4: in connection with the above example 1, after receiving the sliding input of the user to the cube control operation surface 33 (i.e., the above second input), the electronic device feeds back texture tactile information of the area where the application identifier of the phone application is located (i.e., the above third hint information) to the user, and increases the current call volume of the phone application (i.e., the above target operation) as shown in (b) of fig. 7.
Therefore, the user can directly select the desired application through the target three-dimensional control, and can perform parameter-adjustment operations on the application indicated by the target operation surface directly on the control. The electronic device thus does not need to open the display interface of that application to complete the related operation, which saves the steps and time the user spends adjusting parameters within the application and improves the efficiency of using the electronic device.
Optionally, in the embodiment of the present application, before the step 301, the touch operation method of the application provided in the embodiment of the present application may further include the following step E1 and step E2:
step E1: and the touch control operation device of the application receives a third input of the user on the first interface.
Step E2: and responding to the third input, and starting the target three-dimensional control by the touch control operation device of the application.
The first interface may be any interface in the electronic device, which is not limited in the embodiments of the present application.
The third input may be preset for the electronic device, or may be set by a user in a user-defined manner, which is not limited in the embodiment of the present application.
For example, the third input may be used to trigger the touch operation device of the application to start the target three-dimensional control.
The third input may be a touch input, a voice input, or a special gesture input, which is not limited in the embodiment of the present application.
It can be understood that the target three-dimensional control is generally in a dormant state; the third input is required to invoke and start it, after which the touch operation device of the application can receive the first input and execute the target operation.
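The dormant-until-invoked lifecycle can be sketched as a two-state control; the class and method names are hypothetical, chosen only to mirror the third-input/first-input ordering described above.

```python
# Sketch: the control sleeps until the third input (e.g. a circular
# slide on the first interface) wakes it; only an active control
# accepts the first input.
class ThreeDControl:
    def __init__(self):
        self.state = "dormant"    # dormant by default, saving resources

    def on_third_input(self):
        self.state = "active"     # control is invoked and displayed

    def on_first_input(self, gesture: str) -> bool:
        """Accept the first input only while the control is active."""
        return self.state == "active"

ctrl = ThreeDControl()
ignored = ctrl.on_first_input("slide")    # dormant: input not accepted
ctrl.on_third_input()
accepted = ctrl.on_first_input("slide")   # active: input accepted
```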
Example 5: in connection with example 1 above, before the electronic device receives a user sliding input on the operating face 35 of the cube-shaped magic cube control 32, the electronic device may receive a user circular sliding input (i.e., the third input described above) on the display interface 31, and after receiving the sliding input, the electronic device may display the cube-shaped magic cube control 32 on the display interface 31.
Therefore, the touch operation device of the application can keep the target three-dimensional control in the dormant state when it is not in use, thereby saving the resource consumption of the electronic device.
Optionally, in the embodiment of the present application, before the step 301, the touch operation method of the application provided in the embodiment of the present application may further include the following step F1 and step F2:
step F1: and the applied touch control operation device receives a fourth input for setting the target three-dimensional control by the user.
Step F2: and responding to the fourth input, and setting the attribute information of the target three-dimensional control by the applied touch control operation device.
For example, the attribute information of the target three-dimensional control may include at least one of the following: the application corresponding to each operation surface, the prompt information corresponding to each application, the application identifier corresponding to each application, and the function information corresponding to each application.
In an example, when the attribute information is the application identifier corresponding to each application, the application identifier may be the application's own identifier or one customized by the user. For example, the user may draw a picture with the electronic device's own drawing tool and set it as the application identifier, or set a photo stored on the electronic device as the application identifier.
In an example, when the attribute information is the function information corresponding to each application, the function information may be the most commonly used function of that application. By laying out the function information on an operation surface of the target three-dimensional control, the touch operation device of the application can receive, on that surface, the user's input for performing function adjustment on the corresponding application. This enriches the application range of the target three-dimensional control and improves the efficiency with which the user operates the application indicated by the operation surface.
Illustratively, the fourth input is used to set the attribute information of the target three-dimensional control. For example, the fourth input may be an input for opening a setting application that sets the attribute information of the target three-dimensional control.
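The attribute information set by the fourth input could be modeled as one record per operation surface; the field names and the `apply_fourth_input` helper are illustrative assumptions, not the patent's data model.

```python
from dataclasses import dataclass

# Sketch of per-surface attribute information (step F2): the bound
# application, the feedback form, the identifier (stock icon or a
# user-chosen picture/photo), and the function laid out on the surface.
@dataclass
class SurfaceAttributes:
    app: str          # application the operation surface indicates
    prompt: str       # feedback form, e.g. "long-vibration"
    identifier: str   # application's own icon or a user-customized image
    function: str     # most commonly used function laid out on the surface

def apply_fourth_input(control: dict, surface: int, attrs: SurfaceAttributes):
    """Record the user's settings for one operation surface."""
    control[surface] = attrs
    return control

control = {}
apply_fourth_input(control, 0,
                   SurfaceAttributes("music", "long-vibration",
                                     "user-photo.png", "volume"))
```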
The beneficial effects of the various implementation manners in this embodiment may be specifically referred to the beneficial effects of the corresponding implementation manners in the foregoing method embodiment, and in order to avoid repetition, the description is omitted here.
It should be noted that, in the touch operation method of an application provided in the embodiments of the present application, the execution body may be an application touch operation device, or a control module of the application touch operation device for executing the touch operation method of the application. In the embodiment of the application, a touch operation method of executing an application by using the touch operation device of the application is taken as an example, and the touch operation device of the application provided in the embodiment of the application is described.
Fig. 8 is a schematic diagram of a possible structure of a touch operation device for implementing the application provided in the embodiments of the present application. As shown in fig. 8, the touch operation device 600 of the above application includes a receiving module 601 and an executing module 602. The receiving module 601 is configured to receive a first input from a user to a target three-dimensional control, where the target three-dimensional control includes N operation surfaces, and different operation surfaces indicate different applications. The executing module 602 is configured to, in response to the first input received by the receiving module 601, feed back first prompt information and control the target three-dimensional control to rotate to a target operation surface. The target operation surface is the focus operation surface of the rotated target three-dimensional control, the focus operation surface is the operation surface with the largest operation area, and the first prompt information is used for indicating the application corresponding to the target operation surface.
In the touch operation device of the application provided by the embodiment of the present application, after receiving the user's input on the target three-dimensional control including N operation surfaces (different operation surfaces indicating different applications), the device feeds back to the user prompt information indicating the application corresponding to the target operation surface, and controls the target three-dimensional control to rotate to the operation surface with the largest operation area, namely the target operation surface. The user can thus quickly find the target application to be operated through the prompt information fed back by the target three-dimensional control and operate that application through the control, which reduces the steps the user takes to browse for applications in the electronic device and saves the user's viewing time.
Optionally, in the embodiment of the present application, the apparatus 600 further includes a feedback module 603; the feedback module 603 is configured to feedback at least one first prompt message in response to the first input received by the receiving module 601; and feeding back the first prompt information once every time the focus operation surface of the target three-dimensional control is replaced, wherein the first prompt information comprises at least one of the following items: voice information, texture haptic information.
Optionally, in the embodiment of the present application, the receiving module 601 is further configured to receive a second input from a user to the target operation surface; the executing module 602 is further configured to execute a target operation of the application corresponding to the target operation surface in response to the second input received by the receiving module 601, and feed back second prompting information, where the second prompting information is used to prompt a function corresponding to the target operation.
Optionally, in this embodiment of the present application, when each operation surface of the target three-dimensional control includes M function adjustment areas, the executing module 602 is specifically configured to, in response to the second input received by the receiving module 601, execute the target operation corresponding to the target function adjustment area on the target operation surface and feed back third prompt information, where the third prompt information is used to prompt the function corresponding to the target function adjustment area, and M is a positive integer.
Optionally, in an embodiment of the present application, the apparatus 600 further includes a start module 604; the receiving module 601 is further configured to receive a third input from the user on the first interface; the start module 604 is configured to start the target three-dimensional control in response to the third input received by the receiving module 601.
Optionally, in an embodiment of the present application, the apparatus 600 further includes a start module 604; the receiving module 601 is further configured to receive a fourth input from the user for setting the target three-dimensional control; the start module 604 is further configured to set the attribute information of the N operation surfaces of the target three-dimensional control in response to the fourth input received by the receiving module 601.
The touch operation device applied in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a cell phone, tablet computer, notebook computer, palm computer, vehicle-mounted electronic device, wearable device, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook or personal digital assistant (personal digital assistant, PDA), etc., and the non-mobile electronic device may be a server, network attached storage (Network Attached Storage, NAS), personal computer (personal computer, PC), television (TV), teller machine or self-service machine, etc., and the embodiments of the present application are not limited in particular.
The touch operation device applied in the embodiments of the present application may be a device with an operating system. The operating system may be an Android operating system, an ios operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
The touch operation device of the application provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 7, and in order to avoid repetition, a detailed description is omitted here.
It should be noted that, as shown in fig. 8, modules that are necessarily included in the applied touch operation device 600 are illustrated by solid line boxes, such as a receiving module 601; the modules that may or may not be included in the applied touch operation device 600 are indicated by dashed boxes, such as the feedback module 603.
Optionally, as shown in fig. 9, the embodiment of the present application further provides an electronic device 800, including a processor 801, a memory 802, and a program or an instruction stored in the memory 802 and capable of running on the processor 801, where the program or the instruction implements each process of the touch operation method embodiment of the application when executed by the processor 801, and the process can achieve the same technical effect, and for avoiding repetition, a detailed description is omitted herein.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 10 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110. The user input unit 107 includes a touch panel 1071 and other input devices 1072, the display unit 106 includes a display panel 1061, and the input unit 104 includes a graphics processor 1041 and a microphone 1042; the memory 109 can be used to store software programs (e.g., an operating system and application programs needed for at least one function) and various data.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for powering the various components, and that the power source may be logically coupled to the processor 110 via a power management system to perform functions such as managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than shown, or may combine certain components, or may be arranged in different components, which are not described in detail herein.
The user input unit 107 receives a first input of the user to the target three-dimensional control, where the target three-dimensional control includes N operation surfaces, and different operation surfaces indicate different applications; the processor 110 is configured to respond to the first input received by the user input unit 107, feed back first prompt information, and control the target three-dimensional control to rotate to a target operation surface; the target operation surface is a focus operation surface of the rotated target three-dimensional control, the focus operation surface is an operation surface with the largest operation area, and the first prompt information is used for indicating an application corresponding to the target operation surface.
In the touch operation method of the application provided by the embodiment of the present application, after receiving the user's input on the target three-dimensional control including N operation surfaces (different operation surfaces indicating different applications), the electronic device feeds back to the user prompt information indicating the application corresponding to the target operation surface, and controls the target three-dimensional control to rotate to the operation surface with the largest operation area, namely the target operation surface. The user can thus quickly find the target application to be operated through the prompt information fed back by the target three-dimensional control and operate that application through the control, which reduces the steps the user takes to browse for applications in the electronic device and saves the user's viewing time.
Optionally, the processor 110 is further configured to feed back at least one first prompt message in response to the first input received by the user input unit 107; and feeding back the first prompt information once every time the focus operation surface of the target three-dimensional control is replaced, wherein the first prompt information comprises at least one of the following items: voice information, texture haptic information.
Optionally, the user input unit 107 is further configured to receive a second input from the user on the target operation surface; the processor 110 is further configured to, in response to the second input received by the user input unit 107, execute the target operation of the application corresponding to the target operation surface and feed back second prompt information, where the second prompt information is used to prompt the function corresponding to the target operation.
Optionally, when each operation surface of the target three-dimensional control includes M function adjustment areas, the processor 110 is specifically configured to, in response to the second input, execute the target operation corresponding to the target function adjustment area on the target operation surface and feed back third prompt information, where the third prompt information is used to prompt the function corresponding to the target function adjustment area, and M is a positive integer.
Optionally, the user input unit 107 is further configured to receive a third input from the user at the first interface; the processor 110 is further configured to activate the target three-dimensional control in response to the third input received by the user input unit 107.
Optionally, the user input unit 107 is further configured to receive a fourth input from the user for setting the target three-dimensional control; the processor 110 is further configured to set the attribute information of the N operation surfaces of the target three-dimensional control in response to the fourth input received by the user input unit 107.
It should be appreciated that in embodiments of the present application, the input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042, the graphics processor 1041 processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein. Memory 109 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 110 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the application further provides a readable storage medium, on which a program or an instruction is stored, where the program or the instruction implements each process of the touch operation method embodiment of the application when executed by a processor, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium such as a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is used for running a program or an instruction, so as to implement each process of the touch operation method embodiment of the application, and achieve the same technical effect, so that repetition is avoided, and no redundant description is provided here.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above embodiments, which are merely illustrative and not restrictive. Many variations may be made by those of ordinary skill in the art in light of the present application without departing from the spirit of the present application and the scope of the claims, and such variations also fall within the protection of the present application.
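As noted above, the described method may be realized in software on a general-purpose platform. The following Python sketch is an illustration only, not the patented implementation: the patent discloses no source code, and every name in it (`Cube3DControl`, `first_input`, `second_input`, the sample applications and the "volume" adjustment area) is hypothetical. The returned strings stand in for the voice or texture-haptic prompt information that the embodiments describe.

```python
# Illustrative sketch only, under assumed names: a three-dimensional control
# whose N operation surfaces each indicate a different application.

class Cube3DControl:
    def __init__(self, apps):
        self.surfaces = list(apps)   # N operation surfaces, one application per surface
        self.focus = 0               # index of the current focus operation surface

    def first_input(self, steps):
        """First input: rotate the control. First prompt information is fed
        back each time the focus operation surface changes."""
        prompts = []
        for _ in range(steps):
            self.focus = (self.focus + 1) % len(self.surfaces)
            prompts.append(f"focus: {self.surfaces[self.focus]}")  # e.g. voice/haptic cue
        return prompts

    def second_input(self, region=None):
        """Second input on the target operation surface: execute the target
        operation, optionally dispatched to one of M function adjustment areas."""
        app = self.surfaces[self.focus]
        if region is None:
            return f"second prompt: operating {app}"
        # Touching a function adjustment area yields third prompt information.
        return f"third prompt: {app} adjusting {region}"

control = Cube3DControl(["music", "camera", "messages"])
print(control.first_input(2)[-1])      # rotate past two surfaces
print(control.second_input("volume"))  # touch a function adjustment area
```

The one-prompt-per-face-change loop mirrors the claim that first prompt information is fed back once each time the focus operation surface changes, which lets a user locate an application by feel or sound without looking at the screen.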

Claims (10)

1. A touch operation method for an application, characterized by comprising:
receiving a first input of a user to a target three-dimensional control, wherein the target three-dimensional control comprises N operation surfaces, and different operation surfaces indicate different applications;
in response to the first input, feeding back first prompt information, and controlling the target three-dimensional control to rotate to a target operation surface;
wherein the target operation surface is the focus operation surface of the target three-dimensional control after rotation, the focus operation surface is the operation surface with the largest operation area, and the first prompt information is used to indicate the application corresponding to the target operation surface;
after the target three-dimensional control is rotated to the target operation surface, the method further comprises:
receiving a second input of the user to the target operation surface;
in response to the second input, executing a target operation of the application corresponding to the target operation surface, and feeding back second prompt information, wherein the second prompt information is used to prompt a function corresponding to the target operation;
in a case where each operation surface of the target three-dimensional control comprises M function adjustment areas, the executing the target operation of the application corresponding to the target operation surface and feeding back the second prompt information comprises:
in response to the second input, executing the target operation corresponding to a target function adjustment area in the target operation surface, and feeding back third prompt information, wherein the third prompt information is used to prompt a function corresponding to the target function adjustment area, and M is a positive integer; and
after the third prompt information is fed back, simultaneously displaying the target three-dimensional control and a function adjustment interface corresponding to the target function adjustment area.
2. The method of claim 1, wherein after the receiving the first input of the user to the target three-dimensional control, the method further comprises:
in response to the first input, feeding back the first prompt information at least once;
wherein the first prompt information is fed back once each time the focus operation surface of the target three-dimensional control changes, and the first prompt information comprises at least one of the following: voice information and texture haptic information.
3. The method of claim 1, wherein prior to receiving the first input by the user to the target three-dimensional control, the method further comprises:
receiving a third input of the user on a first interface;
in response to the third input, launching the target three-dimensional control.
4. The method of claim 1, wherein prior to receiving the first input by the user to the target three-dimensional control, the method further comprises:
receiving a fourth input of the user for setting the target three-dimensional control;
in response to the fourth input, setting attribute information of the N operation surfaces of the target three-dimensional control.
5. A touch operation apparatus for an application, characterized by comprising a receiving module and an execution module, wherein:
the receiving module is configured to receive a first input of a user to a target three-dimensional control, the target three-dimensional control comprises N operation surfaces, and different operation surfaces indicate different applications;
the execution module is configured to, in response to the first input received by the receiving module, feed back first prompt information and control the target three-dimensional control to rotate to a target operation surface;
the target operation surface is the focus operation surface of the target three-dimensional control after rotation, the focus operation surface is the operation surface with the largest operation area, and the first prompt information is used to indicate the application corresponding to the target operation surface;
the receiving module is further configured to receive a second input of the user to the target operation surface;
the execution module is further configured to, in response to the second input received by the receiving module, execute a target operation of the application corresponding to the target operation surface and feed back second prompt information, wherein the second prompt information is used to prompt a function corresponding to the target operation;
in a case where each operation surface of the target three-dimensional control comprises M function adjustment areas,
the execution module is specifically configured to, in response to the second input received by the receiving module, execute the target operation corresponding to a target function adjustment area in the target operation surface and feed back third prompt information, wherein the third prompt information is used to prompt a function corresponding to the target function adjustment area, and M is a positive integer; and
after the third prompt information is fed back, the target three-dimensional control and a function adjustment interface corresponding to the target function adjustment area are displayed simultaneously.
6. The apparatus of claim 5, further comprising a feedback module;
the feedback module is configured to, in response to the first input received by the receiving module, feed back the first prompt information at least once;
wherein the first prompt information is fed back once each time the focus operation surface of the target three-dimensional control changes, and the first prompt information comprises at least one of the following: voice information and texture haptic information.
7. The apparatus of claim 5, further comprising a starting module;
the receiving module is further configured to receive a third input of the user on a first interface;
the starting module is configured to, in response to the third input received by the receiving module, launch the target three-dimensional control.
8. The apparatus of claim 5, further comprising a setting module;
the receiving module is further configured to receive a fourth input of the user for setting the target three-dimensional control;
the setting module is configured to, in response to the fourth input received by the receiving module, set attribute information of the N operation surfaces of the target three-dimensional control.
9. An electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the touch operation method for an application according to any one of claims 1-4.
10. A readable storage medium having a program or instructions stored thereon, wherein the program or instructions, when executed by a processor, implement the steps of the touch operation method for an application according to any one of claims 1-4.
CN202110335730.7A 2021-03-29 2021-03-29 Touch operation method and device for application and electronic equipment Active CN113157180B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110335730.7A CN113157180B (en) 2021-03-29 2021-03-29 Touch operation method and device for application and electronic equipment

Publications (2)

Publication Number Publication Date
CN113157180A CN113157180A (en) 2021-07-23
CN113157180B true CN113157180B (en) 2024-03-12

Family

ID=76885413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110335730.7A Active CN113157180B (en) 2021-03-29 2021-03-29 Touch operation method and device for application and electronic equipment

Country Status (1)

Country Link
CN (1) CN113157180B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102054134A (en) * 2009-10-29 2011-05-11 宏碁股份有限公司 Multidirectional input method and electronic system
CN103777743A (en) * 2012-10-23 2014-05-07 联想(北京)有限公司 Information processing method and electronic device
CN108170501A (en) * 2017-12-19 2018-06-15 阿里巴巴集团控股有限公司 Using exchange method and device
CN110523085A (en) * 2019-08-30 2019-12-03 腾讯科技(深圳)有限公司 Control method, device, terminal and the storage medium of virtual objects
CN111782030A (en) * 2020-05-19 2020-10-16 上海传英信息技术有限公司 Setting method of three-dimensional icon, electronic device and readable storage medium
CN112099691A (en) * 2020-09-21 2020-12-18 珠海格力电器股份有限公司 Display method and device of application program icon, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101608764B1 (en) * 2009-07-14 2016-04-04 엘지전자 주식회사 Mobile terminal and method for controlling display thereof
US9575560B2 (en) * 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant