CN107765987A - A kind of user interaction approach and device - Google Patents
- Publication number: CN107765987A
- Application number: CN201711072743.XA
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the invention provide a user interaction method, applied to the field of computer technology. The method includes: detecting a trigger operation of a user; and, when the trigger operation is detected, executing the trigger operation and displaying special effect information corresponding to the trigger operation. The embodiments of the invention provide a user interaction method and device for displaying a corresponding special effect while a trigger operation is performed, thereby increasing the enjoyment of performing the trigger operation and improving the user experience.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a user interaction method and device.
Background
With the development of intelligent terminals, their use has become increasingly widespread. Intelligent terminals bring convenience to people's lives, and the application software available on them is increasingly rich. Ordinary application software can meet users' daily needs, and users operate the software to carry out specific operations.
In the prior art, a specific operation is carried out simply by operating the application software. This process is tedious and gives the user no corresponding enjoyment, so the user's interest while operating the application software is low and the user experience is poor.
Disclosure of Invention
In order to overcome the above technical problems or at least partially solve the above technical problems, the following technical solutions are proposed:
according to a first aspect, an embodiment of the present invention provides a user interaction method, including:
detecting a trigger operation of a user;
and when the trigger operation of the user is detected, executing the trigger operation and displaying special effect information corresponding to the trigger operation.
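The detect-execute-display flow above can be pictured as a short sketch. All names here (the `EFFECTS` table, the effect strings, `handle_trigger`) are illustrative assumptions, not terms from the patent:

```python
# Minimal sketch of the first-aspect flow: detect a trigger operation,
# execute it, and return the special effect information mapped to it.
EFFECTS = {
    "touch": "ripple animation",
    "slide": "trail animation",
    "click": "click sound",
}

def handle_trigger(operation):
    """Execute the trigger operation and return the effect to display."""
    effect = EFFECTS.get(operation, "no effect")
    # ... a real terminal device would execute the operation here ...
    return effect
```

A caller would invoke `handle_trigger("touch")` when a touch is detected and pass the returned effect to whatever display component is in use.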
Further, before the step of displaying the special effect information corresponding to the trigger operation, the method further includes:
and determining special effect information corresponding to the trigger operation.
Specifically, when the trigger operation is a trigger operation based on an external object, determining the special effect information corresponding to the trigger operation includes the following steps:
determining trigger position information of a current trigger operation based on an external object on a screen as first position information;
and determining special effect information corresponding to the current trigger operation based on the external object based on the corresponding relation between the trigger position information and the special effect information and the first position information.
Further, before the step of determining special effect information corresponding to the current trigger operation based on the external object based on the corresponding relationship between the trigger position information and the special effect information and the first position information, the method further includes:
and configuring the corresponding relation between the trigger position information and the special effect information.
Specifically, when the external-object-based trigger operation is a touch operation, the steps of determining the trigger position information of the current trigger operation on the screen as the first position information and of determining the corresponding special effect information include:
determining a touch position of the current touch operation on the touch screen;
and determining special effect information corresponding to the current touch operation according to the corresponding relation between the touch position and the special effect information and the touch position of the current touch operation on the touch screen.
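The touch-position lookup described above can be sketched as a configured table mapping screen regions to effects. The rectangles and effect names below are invented for illustration; the patent does not specify a data layout:

```python
# Hypothetical correspondence between touch positions (virtual-key
# rectangles, in pixels) and special effect information.
POSITION_EFFECTS = [
    # (x0, y0, x1, y1, effect)
    (0, 0, 100, 100, "snowflake animation"),
    (100, 0, 200, 100, "running-water sound"),
]

def effect_for_touch(x, y):
    """Return the effect configured for the region containing (x, y)."""
    for x0, y0, x1, y1, effect in POSITION_EFFECTS:
        if x0 <= x < x1 and y0 <= y < y1:
            return effect
    return None  # no effect configured for this position
```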
Specifically, when the external-object-based trigger operation is a touch operation, the steps of determining the trigger position information of the current trigger operation on the screen as the first position information and of determining the corresponding special effect information may also include:
determining a relative position relation between a touch position of the current touch operation on the touch screen and a first reference position;
and determining special effect information corresponding to the touch operation according to the corresponding relation between the relative position relation and the special effect information and the relative position relation between the current touch position and the first reference position.
The special effect information is a special effect animation comprising direction indication information, and the relative position relation and the direction indication information of the special effect animation keep the same direction.
Specifically, when the trigger operation is a sliding operation, the steps of determining the trigger position information of the current trigger operation on the screen as the first position information and of determining the corresponding special effect information include:
determining the sliding direction of the current sliding operation;
and determining special effect information corresponding to the current sliding operation according to the sliding direction of the current sliding operation and the corresponding relation between the sliding direction and the special effect information.
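The sliding case above can be sketched by deriving a coarse direction from the start and end points of the slide and looking up a configured effect. The direction convention (screen y grows downward) and the effect names are assumptions made for this sketch:

```python
# Hypothetical correspondence between sliding directions and effects.
SLIDE_EFFECTS = {
    "up": "rising bubbles",
    "down": "falling snow",
    "left": "page-flip left",
    "right": "page-flip right",
}

def slide_direction(start, end):
    """Classify a slide from start=(x, y) to end=(x, y) into a direction."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy > 0 else "up"  # screen y grows downward

def effect_for_slide(start, end):
    return SLIDE_EFFECTS[slide_direction(start, end)]
```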
Specifically, when the trigger operation is a mouse click operation, the steps of determining the trigger position information of the current trigger operation on the screen as the first position information and of determining the corresponding special effect information include:
determining the click position of the current click operation on the screen;
and determining special effect information corresponding to the current click operation according to the corresponding relation between the click position and the special effect information and the click position of the current click operation on the screen.
Specifically, when the trigger operation is a mouse click operation, the steps of determining the trigger position information of the current trigger operation on the screen as the first position information and of determining the corresponding special effect information may also include:
determining a relative position relation between a click position of the current click operation on the screen and a second reference position;
and determining the special effect information corresponding to the current click operation according to the corresponding relation between the relative position relation and the special effect information and the relative position relation between the click position of the current click operation on the screen and the second reference position.
Specifically, when the trigger operation is a sound-based trigger operation, determining the special effect information corresponding to the trigger operation includes the following steps:
determining sound source position information corresponding to the current sound-based trigger operation as second position information;
and determining special effect information corresponding to the current trigger operation based on the sound according to the corresponding relation between the sound source position information and the special effect information and the second position information.
Further, before the step of determining special effect information corresponding to the current sound-based trigger operation according to the correspondence between the sound source position information and the special effect information and the second position information, the method further includes:
and configuring the corresponding relation between the sound source position information and the special effect information.
Specifically, the steps of determining the sound source position information corresponding to the current sound-based trigger operation as the second position information and of determining the corresponding special effect information include:
determining a sound source direction corresponding to the current trigger operation based on sound;
and determining the special effect information corresponding to the current sound-based trigger operation according to the correspondence between sound source directions and special effect information and the sound source direction corresponding to the current sound-based trigger operation.
Specifically, the steps of determining the sound source position information corresponding to the current sound-based trigger operation as the second position information and of determining the corresponding special effect information may also include:
determining a relative distance between a sound source position corresponding to the current sound-based trigger operation and the terminal equipment;
and determining the special effect information corresponding to the current sound-based trigger operation according to the correspondence between relative distances and special effect information and the relative distance between the sound source position corresponding to the current sound-based trigger operation and the terminal device.
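The distance-based variant above amounts to bucketing the estimated source-to-device distance into configured ranges. The thresholds and effect names below are assumptions for the sketch; the patent leaves them unspecified:

```python
# Hypothetical correspondence between the relative distance of the
# sound source (in meters) and special effect information, ordered
# from nearest range to farthest.
DISTANCE_EFFECTS = [
    (0.5, "large ripple animation"),
    (2.0, "medium ripple animation"),
    (float("inf"), "small ripple animation"),
]

def effect_for_distance(distance_m):
    """Return the effect configured for the range containing distance_m."""
    for threshold, effect in DISTANCE_EFFECTS:
        if distance_m <= threshold:
            return effect
```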
Wherein the special effect information includes at least one of:
special effect information related to animation; special effect information related to sound; special effect information related to vibration.
An embodiment of the present invention further provides, according to a second aspect, a user interaction apparatus, including:
the detection module is used for detecting the triggering operation of a user;
the execution module is used for executing the triggering operation when the detection module detects the triggering operation of the user;
and the display module is used for displaying the special effect information corresponding to the trigger operation.
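The three-module split above (detection, execution, display) can be pictured as one cooperating class. The class name, the shape of the effect table, and the method names are invented for illustration:

```python
class UserInteractionApparatus:
    """Sketch of the second-aspect apparatus: detect, execute, display."""

    def __init__(self, effect_table):
        self.effect_table = effect_table  # trigger operation -> effect info
        self.displayed = []               # effects shown so far

    def detect(self, event):
        # Detection module: is this event a known trigger operation?
        return event in self.effect_table

    def execute(self, event):
        # Execution module: perform the operation and show its effect.
        if self.detect(event):
            self.display(self.effect_table[event])
            return True
        return False

    def display(self, effect):
        # Display module: record (stand-in for rendering) the effect.
        self.displayed.append(effect)
```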
Further, the apparatus further comprises: a determination module;
and the determining module is used for determining special effect information corresponding to the triggering operation.
Specifically, the determining module is specifically configured to determine, as the first position information, trigger position information of a current trigger operation based on an external object on a screen;
the determining module is specifically further configured to determine, based on the corresponding relationship between the trigger position information and the special effect information and the first position information, special effect information corresponding to a current trigger operation based on the external object.
Further, the apparatus further comprises: a configuration module;
and the configuration module is used for configuring the corresponding relation between the trigger position information and the special effect information.
Specifically, when the external-object-based trigger operation is a touch operation,
the determining module is specifically used for determining the touch position of the current touch operation on the touch screen;
the determining module is specifically further configured to determine special effect information corresponding to the current touch operation according to the corresponding relationship between the touch position and the special effect information and the touch position of the current touch operation on the touch screen.
Specifically, when the external-object-based trigger operation is a touch operation,
the determining module is specifically used for determining a relative position relation between a touch position of the current touch operation on the touch screen and a first reference position;
the determining module is specifically further configured to determine special effect information corresponding to the touch operation according to a corresponding relationship between the relative position relationship and the special effect information and a relative position relationship between the current touch position and the first reference position.
The special effect information is a special effect animation comprising direction indication information, and the relative position relation and the direction indication information of the special effect animation keep the same direction.
Specifically, when the trigger operation is a sliding operation,
the determining module is specifically used for determining the sliding direction of the current sliding operation;
the determining module is specifically further configured to determine special effect information corresponding to the current sliding operation according to the sliding direction of the current sliding operation and a corresponding relationship between the sliding direction and the special effect information.
Specifically, when the trigger operation is a mouse click operation,
the determining module is specifically used for determining the clicking position of the current clicking operation on the screen;
the determining module is specifically further configured to determine special effect information corresponding to the current click operation according to the corresponding relationship between the click position and the special effect information and the click position of the current click operation on the screen.
Specifically, when the trigger operation is a mouse click operation,
the determining module is specifically used for determining a relative position relation between a click position of the current click operation on the screen and a second reference position;
the determining module is specifically configured to determine special effect information corresponding to the current click operation according to a corresponding relationship between the relative position relationship and the special effect information and a relative position relationship between a click position of the current click operation on the screen and a second reference position.
Specifically, when the trigger operation is a sound-based trigger operation,
the determining module is specifically used for determining sound source position information corresponding to the current trigger operation based on the sound as second position information;
the determining module is specifically further configured to determine special effect information corresponding to the current trigger operation based on the sound according to the correspondence between the sound source position information and the special effect information and the second position information.
Further, the configuration module is further configured to configure a corresponding relationship between the sound source position information and the special effect information.
Specifically, the determining module is specifically configured to determine a sound source direction corresponding to a current sound-based trigger operation;
the determining module is specifically further configured to determine special effect information corresponding to the current sound-based triggering operation according to a correspondence between the sound source direction and the special effect information and the sound source direction corresponding to the current sound-based triggering operation.
Specifically, the determining module is further configured to determine a relative distance between a sound source position corresponding to the current sound-based triggering operation and the terminal device;
the determining module is specifically further configured to determine, according to a correspondence between the relative distance and the special effect information and a relative distance between a sound source position corresponding to the current sound-based triggering operation and the terminal device, special effect information corresponding to the current sound-based triggering operation.
Wherein the special effect information includes at least one of:
special effect information related to animation; special effect information related to sound; special effect information related to vibration.
Embodiments of the present invention also provide, according to a third aspect, a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
According to a fourth aspect, an embodiment of the present invention further provides a terminal device, including a processor, a memory, and a communication interface, which communicate with one another through a communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the corresponding operation of the user interaction method according to the first aspect.
Compared with the prior art, in which a specific operation is carried out simply by operating application software, the user interaction method and device provided by the embodiments of the present invention detect a user's trigger operation and, when it is detected, execute the trigger operation and display the special effect information corresponding to it. Because the special effect corresponding to the trigger operation is displayed while the operation is executed, the user can recognize the current trigger operation from the displayed effect, which increases the user's enjoyment of the operation process and improves the user experience.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a user interaction method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a user interaction apparatus according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of another user interaction device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative only and should not be construed as limiting the invention.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As will be appreciated by those skilled in the art, a "terminal" as used herein includes both devices having only a wireless signal receiver without transmit capability and devices having receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device with a single-line display, a multi-line display, or no multi-line display; a PCS (Personal Communications Service) device, which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; or a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. As used herein, a "terminal device" may also be a communication terminal, a web terminal, or a music/video playing terminal, such as a PDA, an MID (Mobile Internet Device), and/or a mobile phone with a music/video playing function, or a smart TV, a set-top box, etc.
Example one
An embodiment of the present invention provides a user interaction method, as shown in fig. 1, including:
step 101, detecting a trigger operation of a user.
For the embodiment of the invention, the terminal device can detect the user's trigger operation in real time. In the embodiment of the present invention, the trigger operation may be at least one of the following: a touch operation; a sliding operation; a mouse click operation; a sound-based trigger operation.
And 102, when the trigger operation of the user is detected, executing the trigger operation, and displaying special effect information corresponding to the trigger operation.
Wherein the special effect information includes at least one of: special effect information related to the animation; special effect information relating to sound; effect information relating to vibrations.
In a specific embodiment of the present invention, if the user touches a virtual key for playing a video, the video to be played is displayed and a "running water" sound is produced.
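A minimal sketch of this concrete example, in which every function and argument name is a placeholder rather than an API from the patent; the display and speaker are stand-in lists:

```python
def on_virtual_key_touched(key, shown, sounds):
    """Execute the trigger operation and emit its paired sound effect."""
    if key == "play":
        shown.append("video to be played")     # execute: show the video
        sounds.append("running-water sound")   # effect: play the sound
        return True
    return False  # not a recognized trigger; nothing executed
```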
Compared with the prior art, in which a specific operation is carried out simply by operating application software, the user interaction method provided by the embodiment of the invention detects a user's trigger operation and, when it is detected, executes the trigger operation and displays the special effect information corresponding to it. The embodiment of the invention displays the special effect corresponding to the trigger operation while the operation is executed, and the user can recognize the current trigger operation from the displayed effect, which increases the user's interest in the operation process and improves the user experience.
Example two
Another possible implementation manner of the embodiment of the present invention further includes, on the basis of the operation shown in the first embodiment, the operation shown in the second embodiment, wherein,
before the step of displaying the special effect information corresponding to the trigger operation, the method further comprises the following steps: and determining special effect information corresponding to the trigger operation.
For the embodiment of the invention, the corresponding special effect information is different according to different types of the trigger operation.
EXAMPLE III
Another possible implementation manner of the embodiment of the present invention further includes, on the basis of the operation shown in the second embodiment, the operation shown in the third embodiment, wherein,
when the trigger operation comprises: when the trigger operation based on the external object is performed, the method for determining the special effect information corresponding to the trigger operation comprises the following steps: determining trigger position information of a current trigger operation based on an external object on a screen as first position information; and determining special effect information corresponding to the current trigger operation based on the external object based on the corresponding relation between the trigger position information and the special effect information and the first position information.
For embodiments of the invention, the external object may comprise at least one of: a finger; a stylus; a mouse. The present invention is not limited to the embodiments.
Further, before the step of determining special effect information corresponding to the current trigger operation based on the external object based on the corresponding relationship between the trigger position information and the special effect information and the first position information, the method further includes: and configuring the corresponding relation between the trigger position information and the special effect information.
For the embodiment of the invention, the correspondence between trigger position information and special effect information may be configured by the user or by the terminal device. The present invention is not limited to the embodiments.
Example four
Another possible implementation manner of the embodiment of the present invention further includes, on the basis of the operation shown in the third embodiment, the operation shown in the fourth embodiment, wherein,
when the external object-based triggering operation includes: determining trigger position information of the current trigger operation based on the external object on a screen as first position information during touch operation; the step of determining special effect information corresponding to the current trigger operation based on the external object based on the corresponding relationship between the trigger position information and the special effect information and the first position information includes:
determining a touch position of the current touch operation on the touch screen; and determining special effect information corresponding to the current touch operation according to the corresponding relation between the touch position and the special effect information and the touch position of the current touch operation on the touch screen.
For the embodiment of the present invention, the touch operation includes: finger-based touch operations and stylus-based touch operations. The present invention is not limited to the embodiments.
For the embodiment of the present invention, before the step of determining the special effect information according to the correspondence between touch positions and special effect information and the touch position of the current touch operation on the touch screen, the method further includes: configuring the correspondence between touch positions and special effect information. In the embodiment of the present invention, the touch position may be the position of a virtual key on the screen.
For example, the special effect information is a special effect animation resembling "eyes", in which the "eyeball" part serves as a control and can move. Suppose four virtual keys are displayed on the current screen: virtual key 1, virtual key 2, virtual key 3 and virtual key 4. The four virtual keys and the special effect information are all located in a display area at the bottom of the screen, with virtual keys 1 and 2 on the left side of the special effect information and virtual keys 3 and 4 on the right side. The special effect information corresponding to virtual key 1 (position 1) is that the "eyeball" control moves upwards; for virtual key 2 (position 2), the "eyeball" control moves leftwards; for virtual key 3 (position 3), the "eyeball" control moves downwards; and for virtual key 4 (position 4), the "eyeball" control moves rightwards.
For the embodiment of the present invention, the corresponding relationship between the touch position and the special effect information may be configured by the user, or may be configured by the terminal device. The present invention is not limited to the embodiments.
For the embodiment of the invention, a plurality of virtual keys can be displayed on the current screen, and different virtual keys correspond to different positions on the screen. In the embodiment of the invention, when a user touches the virtual key on the screen to trigger corresponding operation, the terminal device determines special effect information corresponding to the current trigger operation according to the position of the currently triggered virtual key and the corresponding relation between the touch position and the special effect information.
For example, when the user triggers the virtual key 1 and the terminal device executes the operation corresponding to the virtual key 1, the displayed special effect information is that the "eyeball" control moves upwards.
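The virtual-key correspondence in this example can be sketched as a simple lookup table. This is an illustrative sketch only; the dictionary and function names are assumptions, not part of the disclosure.

```python
# Correspondence between virtual-key positions and "eyeball" effects,
# following the example above (positions 1-4); names are illustrative.
KEY_EFFECTS = {
    1: "move_up",     # virtual key 1 (position 1): eyeball moves upwards
    2: "move_left",   # virtual key 2 (position 2): eyeball moves leftwards
    3: "move_down",   # virtual key 3 (position 3): eyeball moves downwards
    4: "move_right",  # virtual key 4 (position 4): eyeball moves rightwards
}

def effect_for_key(key_position):
    """Return the special effect configured for the touched virtual key,
    or None if no correspondence is configured for that position."""
    return KEY_EFFECTS.get(key_position)
```

When the user touches virtual key 1, `effect_for_key(1)` returns the "move upwards" effect, matching the displayed behavior described above.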
When the external object-based trigger operation is a touch operation, the trigger position information of the current external-object-based trigger operation on the screen is determined as first position information, and the step of determining the special effect information corresponding to the current external-object-based trigger operation according to the corresponding relation between the trigger position information and the special effect information and the first position information may also include:
determining a relative position relation between a touch position of the current touch operation on the touch screen and a first reference position; and determining special effect information corresponding to the touch operation according to the corresponding relation between the relative position relation and the special effect information and the relative position relation between the current touch position and the first reference position.
Here, the special effect information is a special effect animation comprising direction indication information, and the direction indication information of the special effect animation is kept consistent with the relative position relation.
For the embodiment of the present invention, before the step of determining the special effect information corresponding to the touch operation according to the corresponding relationship between the relative position relationship and the special effect information and the relative position relationship between the current touch position and the first reference position, the method further includes: and configuring the corresponding relation between the relative position relation and the special effect information.
For the embodiment of the present invention, the corresponding relationship between the relative position relationship and the special effect information may be configured by the user, or may be configured by the terminal device. The present invention is not limited to the embodiments.
For the embodiment of the present invention, the relative position relationship may be the direction in which the touch position is located relative to the first reference position; it may also be the distance of the touch position relative to the first reference position. The present invention is not limited to the embodiments.
For example, the special effect information is a special effect animation resembling "eyes" and is disposed in a display area at the bottom of the screen; the first reference position may be the position of the special effect information on the screen; virtual keys 1 and 2 are located on the left side of the special effect information, and virtual keys 3 and 4 on the right side. If the relative position relationship is the direction in which the touch position is located relative to the first reference position, a touch position on the left side of the special effect information is configured to correspond to the "eyeball" control moving to the left; that is, when virtual key 1 or virtual key 2 is touched, the "eyeball" control moves to the left. Likewise, a touch position on the right side of the special effect information is configured to correspond to the "eyeball" control moving to the right; that is, when virtual key 3 or virtual key 4 is touched, the "eyeball" control moves to the right.
For another example, the special effect information is a special effect animation resembling "eyes" and is disposed in a display area at the bottom of the screen; the first reference position may be the position of the special effect information on the screen; the distances from virtual key 1 and virtual key 3 to the special effect information are both equal to a first relative distance, and the distances from virtual key 2 and virtual key 4 to the special effect information are both equal to a second relative distance. If the relative position relationship is the distance between the touch position and the first reference position, the special effect information corresponding to the first relative distance is configured to be that the "eyes" control moves upwards; that is, when virtual key 1 or virtual key 3 is touched, the displayed special effect information is that the "eyes" control moves upwards. The special effect information corresponding to the second relative distance is configured to be that the "eyes" control moves downwards; that is, when virtual key 2 or virtual key 4 is touched, the displayed special effect information is that the "eyes" control moves downwards.
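Both relative-position variants above (direction-based and distance-based) reduce to computing a relation between the touch position and the first reference position and then consulting a configured table. A minimal sketch, with assumed coordinate tuples and illustrative names:

```python
import math

def relative_direction(touch, reference):
    """Side of the first reference position on which the touch falls
    (left/right classification, as in the direction-based example).
    Positions are (x, y) tuples in screen coordinates."""
    return "left" if touch[0] < reference[0] else "right"

def relative_distance(touch, reference):
    """Euclidean distance between the touch position and the reference."""
    return math.dist(touch, reference)

# Direction-based correspondence from the example: keys on the left of the
# special effect move the "eyeball" control left; keys on the right move it right.
DIRECTION_EFFECTS = {"left": "move_left", "right": "move_right"}

def effect_for_touch(touch, reference):
    return DIRECTION_EFFECTS[relative_direction(touch, reference)]
```

The distance-based variant would use `relative_distance` with a table keyed by the configured first and second relative distances instead.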
When the trigger operation is a sliding operation, the trigger position information of the current external-object-based trigger operation on the screen is determined as first position information, and the step of determining the special effect information corresponding to the current external-object-based trigger operation according to the corresponding relation between the trigger position information and the special effect information and the first position information includes:
determining the sliding direction of the current sliding operation; and determining special effect information corresponding to the current sliding operation according to the sliding direction of the current sliding operation and the corresponding relation between the sliding direction and the special effect information.
For the embodiment of the present invention, before the step of determining the special effect information corresponding to the current sliding operation according to the sliding direction of the current sliding operation and the corresponding relationship between the sliding direction and the special effect information, the method further includes: and configuring the corresponding relation between the sliding direction and the special effect information.
For the embodiment of the present invention, the corresponding relationship between the sliding direction and the special effect information may be configured by the user, or may be configured by the terminal device. The present invention is not limited to the embodiments.
For example, when the user slides left on the screen, the displayed special effect may be that the "eyes" control moves left; when the user slides to the right on the screen, the displayed special effect can be that the 'eyes' control moves to the right; when the user slides upwards on the screen, the displayed special effect can be that the 'eyes' control moves upwards; when the user slides down the screen, the displayed special effect may be a downward movement of the "eyes" control.
As another example, when the user slides to the left on the screen, the displayed special effect may be "snowflake"; when the user slides to the right, the displayed special effect may be "lightning"; when the user slides upwards, the displayed special effect may be "wave"; and when the user slides downwards, the displayed special effect may be "fallen leaves".
For the embodiment of the invention, by determining the sliding direction of the current sliding operation, the special effect information corresponding to the current sliding operation can be determined according to that sliding direction and the corresponding relation between the sliding direction and the special effect information, so that the special effect corresponding to the sliding operation is displayed while the sliding operation is executed. This makes operating the terminal device more enjoyable for the user. In addition, the user can judge from the displayed special effect information whether the current sliding operation is a misoperation, which reduces misoperations and improves the user experience.
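Determining the sliding direction can be sketched by comparing the start and end points of the slide; the classification rule and effect names below follow the second example above, while the function names and the dominant-axis rule are assumptions for illustration.

```python
def slide_direction(start, end):
    """Classify a sliding operation by its dominant axis of motion.
    Assumed screen coordinates: x grows rightwards, y grows downwards."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Direction-to-effect correspondence taken from the second example above.
SLIDE_EFFECTS = {
    "left": "snowflake",
    "right": "lightning",
    "up": "wave",
    "down": "fallen leaves",
}

def effect_for_slide(start, end):
    return SLIDE_EFFECTS[slide_direction(start, end)]
```

A slide from (0, 0) to (-50, 5) is dominated by leftward motion, so the "snowflake" effect would be displayed.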
When the trigger operation is a mouse click, the trigger position information of the current external-object-based trigger operation on the screen is determined as first position information, and the step of determining the special effect information corresponding to the current external-object-based trigger operation according to the corresponding relation between the trigger position information and the special effect information and the first position information includes:
determining the click position of the current click operation on the screen; and determining special effect information corresponding to the current click operation according to the corresponding relation between the click position and the special effect information and the click position of the current click operation on the screen.
For the embodiment of the present invention, before the step of determining the special effect information corresponding to the current click operation according to the corresponding relationship between the click position and the special effect information and the click position of the current click operation on the screen, the method further includes: and configuring the corresponding relation between the click position and the special effect information.
For the embodiment of the invention, the corresponding relation between the click position and the special effect information can be configured by a user or terminal equipment. The present invention is not limited to the embodiments.
When the trigger operation is a mouse click, the trigger position information of the current external-object-based trigger operation on the screen is determined as first position information, and the step of determining the special effect information corresponding to the current external-object-based trigger operation according to the corresponding relation between the trigger position information and the special effect information and the first position information may also include:
determining a relative position relation between a click position of the current click operation on the screen and a second reference position; and determining the special effect information corresponding to the current click operation according to the corresponding relation between the relative position relation and the special effect information and the relative position relation between the click position of the current click operation on the screen and the second reference position.
For the embodiment of the present invention, before the step of determining the special effect information corresponding to the current click operation according to the corresponding relationship between the relative position relationship and the special effect information and the relative position relationship between the click position of the current click operation on the screen and the second reference position, the method further includes: and configuring the corresponding relation between the relative position relation and the special effect information.
For the embodiment of the present invention, the corresponding relationship between the relative position relationship and the special effect information may be configured by the user, or may be configured by the terminal device. The present invention is not limited to the embodiments.
For the embodiment of the present invention, the relative position relationship may be the direction in which the click position is located relative to the second reference position; it may also be the distance of the click position relative to the second reference position. The present invention is not limited to the embodiments.
Example five
In another possible implementation of the embodiment of the present invention, the operations shown in the fifth embodiment are performed in addition to those shown in the second embodiment, wherein:
When the trigger operation is a sound-based trigger operation, determining the special effect information corresponding to the trigger operation includes the following steps:
determining sound source position information corresponding to the current sound-based trigger operation as second position information; and determining special effect information corresponding to the current trigger operation based on the sound according to the corresponding relation between the sound source position information and the special effect information and the second position information.
For the embodiment of the present invention, the sound source position information corresponding to the sound-based trigger operation includes: the direction of the sound source position relative to the terminal device and the distance of the sound source position relative to the terminal device.
For the embodiment of the invention, it is determined whether the current sound-based trigger operation is an operation triggered by the user; if so, the special effect information corresponding to the current sound-based trigger operation is determined, and the corresponding special effect information is displayed while the trigger operation is executed.
For the embodiment of the invention, when a sound-based trigger operation is detected, the initiator of the current trigger operation is determined by detecting the tone and/or a segment of the sound, and the corresponding special effect information is displayed.
For the embodiment of the invention, a segment of voice is recorded for each user, the tone and/or the segment of that voice is detected, and the tone and/or the segment is associated with the corresponding special effect information.
For example, the special effect information corresponding to the trigger operation from the local user may be configured as "shake", and the special effect information corresponding to the trigger operation of the other user may be set as no special effect.
For the embodiment of the invention, when a sound-based trigger operation is detected, the initiator of the current trigger operation can be determined by detecting the tone and/or a segment of the sound, and the corresponding special effect information is displayed. In other words, even in a noisy environment, whether the current sound-based trigger operation comes from the local user can be determined from the displayed special effect information, which further reduces misoperations and improves the user experience.
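The enroll-then-match idea above can be sketched very crudely as follows. This is a deliberately simplified illustration: each user pre-records a sample, a "tone" feature is extracted, and an incoming sound is attributed to the nearest enrolled user before an effect is chosen. The feature used here (mean absolute amplitude) and all names are assumptions; real speaker identification uses far richer features.

```python
def tone_feature(samples):
    """A toy 'tone' feature: mean absolute amplitude of the samples.
    Purely illustrative; stands in for real tone/segment detection."""
    return sum(abs(s) for s in samples) / len(samples)

def identify_initiator(samples, enrolled):
    """enrolled maps user name -> pre-recorded tone feature; the incoming
    sound is attributed to the closest enrolled user."""
    f = tone_feature(samples)
    return min(enrolled, key=lambda user: abs(enrolled[user] - f))

# Effect correspondence from the example: the local user's trigger operation
# shows "shake"; other users' operations show no effect.
USER_EFFECTS = {"local_user": "shake", "other_user": None}
```

Under this sketch, an incoming sound whose feature is closest to the local user's enrollment yields the "shake" effect, and anything else yields no effect.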
Example six
In another possible implementation of the embodiment of the present invention, the operations shown in the sixth embodiment are performed in addition to those shown in the fifth embodiment, wherein:
before the step of determining the special effect information corresponding to the current trigger operation based on the sound according to the corresponding relationship between the sound source position information and the special effect information and the second position information, the method further comprises the following steps:
and configuring the corresponding relation between the sound source position information and the special effect information.
For the embodiment of the present invention, the corresponding relationship between the sound source position information and the special effect information may be configured by the user, or may be configured by the terminal device. The present invention is not limited to the embodiments.
Example seven
In another possible implementation of the embodiment of the present invention, the operations shown in the seventh embodiment are performed in addition to those shown in the fifth embodiment or the sixth embodiment, wherein:
The sound source position information corresponding to the current sound-based trigger operation is determined as second position information, and the step of determining the special effect information corresponding to the current sound-based trigger operation according to the corresponding relation between the sound source position information and the special effect information and the second position information includes:
determining a sound source direction corresponding to the current sound-based trigger operation; and determining the special effect information corresponding to the current sound-based trigger operation according to the corresponding relation between the sound source direction and the special effect information and the sound source direction corresponding to the current sound-based trigger operation.
For an embodiment of the invention, the sound source directions comprise: the sound source is located to the left of the terminal device, the sound source is located to the right of the terminal device, the sound source is located below the terminal device, and the sound source is located above the terminal device. The present invention is not limited to the embodiments.
For the embodiment of the present invention, before the step of determining the special effect information corresponding to the current trigger operation based on sound according to the correspondence between the sound source direction and the special effect information and the sound source direction corresponding to the current trigger operation based on sound, the method further includes: the correspondence between the sound source direction corresponding to the sound-based trigger operation and the special effect information is configured.
For example, the special effect information corresponding to the sound source located on the left side of the terminal device is playing 'running water' sound; the special effect information corresponding to the sound source positioned at the right side of the terminal equipment is playing 'wave' sound; the special effect information corresponding to the sound source positioned below the terminal equipment is the sound for playing the tea frying; the special effect information corresponding to the sound source positioned above the terminal equipment is playing 'radiation' sound.
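The direction-to-effect correspondence in this example is again a lookup table. A minimal sketch (names are illustrative; how the direction itself is estimated, e.g. from microphones, is out of scope and assumed given):

```python
# Correspondence between the sound source direction relative to the terminal
# device and the played sound effect, taken from the example above.
SOURCE_DIRECTION_EFFECTS = {
    "left": "running water",
    "right": "wave",
    "below": "stir-frying tea",
    "above": "radiation",
}

def effect_for_source_direction(direction):
    """Return the sound effect configured for the given source direction,
    or None if no correspondence is configured."""
    return SOURCE_DIRECTION_EFFECTS.get(direction)
```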
The sound source position information corresponding to the current sound-based trigger operation is determined as second position information, and the step of determining the special effect information corresponding to the current sound-based trigger operation according to the corresponding relation between the sound source position information and the special effect information and the second position information may also include:
determining a relative distance between the sound source position corresponding to the current sound-based trigger operation and the terminal device; and determining the special effect information corresponding to the current sound-based trigger operation according to the corresponding relation between the relative distance and the special effect information and the relative distance between the sound source position corresponding to the current sound-based trigger operation and the terminal device.
For the embodiment of the invention, the relative distance between the sound source position and the terminal equipment represents the distance between the sound source position and the terminal equipment.
For the embodiment of the present invention, before the step of determining the special effect information corresponding to the current sound-based trigger operation according to the corresponding relation between the relative distance and the special effect information and the relative distance between the sound source position corresponding to the current sound-based trigger operation and the terminal device, the method further includes: configuring the corresponding relation between the relative distance and the special effect information.
For example, when the relative distance between the sound source position corresponding to the sound-based trigger operation and the terminal device is 0-10 centimeters (cm), the corresponding special effect information is displaying "snowflake"; when the relative distance is 10-20 cm, the corresponding special effect information is displaying "lightning"; and when the relative distance is greater than 20 cm, the corresponding special effect information is displaying "lightning".
For the embodiment of the invention, by determining the relative distance between the sound source position corresponding to the current sound-based trigger operation and the terminal device, the special effect information corresponding to the current sound-based trigger operation can be determined according to the corresponding relation between the relative distance and the special effect information and that relative distance. In other words, even in a noisy environment, the user can determine from the special effect information whether the current sound-based trigger operation is a valid operation, which reduces the possibility of misoperation and improves the user experience.
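The distance bands of the example can be sketched as a simple binning function. The function name is illustrative; the example text maps both the 10-20 cm band and distances greater than 20 cm to "lightning", and that mapping is reproduced here as given.

```python
def effect_for_source_distance(distance_cm):
    """Distance-to-effect correspondence reproduced from the example above."""
    if distance_cm <= 10:
        return "snowflake"   # 0-10 cm band
    if distance_cm <= 20:
        return "lightning"   # 10-20 cm band
    return "lightning"       # > 20 cm, as given in the example
```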
An embodiment of the present invention provides a user interaction apparatus, as shown in fig. 2, the apparatus includes: a detection module 21, an execution module 22 and a display module 23; wherein,
the detection module 21 is used for detecting the trigger operation of the user;
an executing module 22, configured to execute a triggering operation when the detecting module 21 detects the triggering operation of the user;
and the display module 23 is configured to display special effect information corresponding to the trigger operation.
Wherein the special effect information includes at least one of:
special effect information related to animation; special effect information related to sound; special effect information related to vibration.
Further, as shown in fig. 3, the apparatus further includes: a determination module 31;
and the determining module 31 is configured to determine special effect information corresponding to the trigger operation.
Specifically, the determining module 31 is specifically configured to determine, as the first position information, trigger position information of a trigger operation on the screen currently based on the external object.
The determining module 31 is further specifically configured to determine, based on the corresponding relationship between the trigger position information and the special effect information and the first position information, special effect information corresponding to a current trigger operation based on the external object.
Further, as shown in fig. 3, the apparatus further includes: a configuration module 32;
and the configuration module 32 is configured to configure a corresponding relationship between the trigger position information and the special effect information.
Specifically, when the external object-based trigger operation is a touch operation, the determining module 31 is specifically further configured to determine the touch position of the current touch operation on the touch screen. The determining module 31 is further specifically configured to determine the special effect information corresponding to the current touch operation according to the corresponding relation between the touch position and the special effect information and the touch position of the current touch operation on the touch screen.
Specifically, when the external object-based trigger operation is a touch operation, the determining module 31 is specifically further configured to determine the relative position relationship between the touch position of the current touch operation on the touch screen and the first reference position. The determining module 31 is further specifically configured to determine the special effect information corresponding to the touch operation according to the corresponding relation between the relative position relationship and the special effect information and the relative position relationship between the current touch position and the first reference position.
The special effect information is a special effect animation comprising direction indication information, and the relative position relation and the direction indication information of the special effect animation keep the same direction.
Specifically, when the trigger operation is a sliding operation, the determining module 31 is specifically configured to determine the sliding direction of the current sliding operation; the determining module 31 is further specifically configured to determine the special effect information corresponding to the current sliding operation according to the sliding direction of the current sliding operation and the corresponding relation between the sliding direction and the special effect information.
Specifically, when the trigger operation is a mouse click, the determining module 31 is specifically configured to determine the click position of the current click operation on the screen; the determining module 31 is further specifically configured to determine the special effect information corresponding to the current click operation according to the corresponding relation between the click position and the special effect information and the click position of the current click operation on the screen.
Specifically, when the trigger operation is a mouse click, the determining module 31 is specifically configured to determine the relative position relationship between the click position of the current click operation on the screen and the second reference position; the determining module 31 is further specifically configured to determine the special effect information corresponding to the current click operation according to the corresponding relation between the relative position relationship and the special effect information and the relative position relationship between the click position of the current click operation on the screen and the second reference position.
Specifically, when the trigger operation is a sound-based trigger operation, the determining module 31 is specifically configured to determine, as second position information, the sound source position information corresponding to the current sound-based trigger operation; the determining module 31 is further specifically configured to determine the special effect information corresponding to the current sound-based trigger operation according to the corresponding relation between the sound source position information and the special effect information and the second position information.
Further, the configuration module 32 is further configured to configure a corresponding relationship between the sound source position information and the special effect information.
Specifically, the determining module 31 is specifically configured to determine a sound source direction corresponding to the current sound-based triggering operation; the determining module 31 is further specifically configured to determine special effect information corresponding to the current sound-based triggering operation according to the correspondence between the sound source direction and the special effect information and the sound source direction corresponding to the current sound-based triggering operation.
Specifically, the determining module 31 is further configured to determine a relative distance between a sound source position corresponding to the current sound-based triggering operation and the terminal device; the determining module 31 is further specifically configured to determine the special effect information corresponding to the current sound-based triggering operation according to the corresponding relationship between the relative distance and the special effect information and the relative distance between the sound source position corresponding to the current sound-based triggering operation and the terminal device.
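The module structure described above (detection, execution, determination, display, configuration) can be sketched as a single class. This is a minimal illustration of the data flow between the modules of Fig. 2 / Fig. 3; the class and method names are assumptions, not from any real framework.

```python
class UserInteractionApparatus:
    """Sketch of the apparatus: detection triggers execution, the
    determination module consults the configured correspondence table,
    and the display module shows the resulting special effect."""

    def __init__(self, effect_table):
        self.effect_table = effect_table  # role of the configuration module (32)
        self.displayed = []               # stands in for the display module (23)

    def on_trigger(self, trigger):        # detection module (21)
        self.execute(trigger)             # execution module (22)
        effect = self.determine(trigger)  # determination module (31)
        if effect is not None:
            self.display(effect)

    def execute(self, trigger):
        pass  # the triggered operation itself would be carried out here

    def determine(self, trigger):
        return self.effect_table.get(trigger)

    def display(self, effect):            # display module (23)
        self.displayed.append(effect)
```

For instance, configuring `{"slide_left": "snowflake"}` and detecting a "slide_left" trigger would execute the operation and display the "snowflake" effect, while an unconfigured trigger displays nothing.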
Compared with the prior art that specific operation is realized only by operating application software, the user interaction device provided by the embodiment of the invention can execute the trigger operation and display special effect information corresponding to the trigger operation by detecting the trigger operation of the user. The embodiment of the invention displays the special effect corresponding to the trigger operation while executing the trigger operation, and the user can know the current trigger operation according to the displayed special effect, so that the interest of the user in the operation process can be improved, and the user experience is further improved.
The user interaction device provided in the embodiment of the present invention can implement the method embodiment provided above, and for specific function implementation, reference is made to the description in the method embodiment, which is not described herein again.
The embodiment of the invention provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program realizes the method of any one of the first embodiment to the seventh embodiment.
Compared with the prior art, in which a specific operation is performed simply by operating application software, this embodiment of the invention detects a user's trigger operation and, when it is detected, executes the trigger operation and displays the special effect information corresponding to it. Because the special effect is displayed while the trigger operation is executed, the user can identify the current trigger operation from the displayed effect, which makes the operation process more engaging and further improves the user experience.
The computer-readable storage medium provided in the embodiments of the present invention can implement the method embodiments provided above, and for specific function implementation, reference is made to the description in the method embodiments, which is not repeated herein.
An embodiment of the present invention provides a terminal device, including a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with one another through the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the user interaction method of any one of the first through seventh embodiments.
Compared with the prior art, in which a specific operation is performed simply by operating application software, the terminal device provided by this embodiment of the invention detects a user's trigger operation, executes it, and displays the special effect information corresponding to it. Because the special effect corresponding to the trigger operation is displayed while the operation is executed, the user can identify the current trigger operation from the displayed effect, which makes the operation process more engaging and further improves the user experience.
The terminal device provided in the embodiment of the present invention may implement the method embodiment provided above, and for specific function implementation, reference is made to the description in the method embodiment, which is not described herein again.
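Purely as an illustrative sketch (not the claimed implementation), the detection/execution/display flow of the device embodiments above, together with the determining module's lookup, could be arranged as follows. The class name, method names, and effect table contents are hypothetical assumptions:

```python
# Hypothetical sketch of the detection -> determination -> execution ->
# display flow described for the user interaction device. All names and
# the example effect table are illustrative assumptions.

class UserInteractionDevice:
    def __init__(self, effect_table: dict):
        # correspondence between trigger operations and special effect info
        self.effect_table = effect_table

    def determine_effect(self, trigger: str) -> str:
        # determining module: look up the effect info for this trigger operation
        return self.effect_table.get(trigger, "default_effect")

    def handle(self, trigger: str) -> tuple:
        # execution module performs the operation; the display module shows
        # the corresponding special effect at the same time
        result = f"executed:{trigger}"
        effect = self.determine_effect(trigger)
        return result, effect

device = UserInteractionDevice({"tap": "ripple", "voice": "sparkle"})
```

The key point the embodiments emphasize is that the effect lookup and the operation's execution happen together, so the displayed effect always reflects the operation just performed.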
Those skilled in the art will appreciate that the present invention includes apparatus for performing one or more of the operations described in this application. Such apparatus may be specially designed and manufactured for the required purposes, or may comprise known devices in a general-purpose computer, selectively activated or reconfigured by computer programs stored therein. Such a computer program may be stored in a device-readable (e.g., computer-readable) medium, including, but not limited to, any type of disk (floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROMs (read-only memories), RAMs (random access memories), EPROMs (erasable programmable read-only memories), EEPROMs (electrically erasable programmable read-only memories), flash memory, magnetic cards, or optical cards, or any other type of medium suitable for storing electronic instructions, each coupled to a bus. That is, a readable medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer).
It will be understood by those skilled in the art that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the features specified in the block or blocks of the block diagrams and/or flowchart illustrations of the present disclosure.
Those skilled in the art will appreciate that the various operations, methods, steps, acts, and solutions discussed in this application may be interchanged, modified, rearranged, decomposed, combined, or deleted. Likewise, the steps, measures, and schemes in the operations, methods, and procedures disclosed in the prior art and in the present invention may be interchanged, modified, rearranged, decomposed, combined, or deleted.
The foregoing describes only some embodiments of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.
Claims (10)
1. A method of user interaction, comprising:
detecting a trigger operation of a user;
and when the trigger operation of the user is detected, executing the trigger operation and displaying special effect information corresponding to the trigger operation.
2. The method according to claim 1, wherein before the step of displaying the special effect information corresponding to the trigger operation, the method further comprises:
and determining special effect information corresponding to the trigger operation.
3. The method according to claim 2, wherein, when the trigger operation comprises a trigger operation based on an external object, the step of determining the special effect information corresponding to the trigger operation comprises:
determining trigger position information of a current trigger operation based on an external object on a screen as first position information;
and determining the special effect information corresponding to the current trigger operation based on the external object according to the correspondence between trigger position information and special effect information and the first position information.
4. The method according to claim 3, wherein, before the step of determining special effect information corresponding to the current trigger operation based on the external object based on the corresponding relationship between the trigger position information and the special effect information and the first position information, the method further comprises:
and configuring the corresponding relation between the trigger position information and the special effect information.
5. The method according to claim 3 or 4, wherein, when the trigger operation based on the external object comprises a touch operation, the step of determining the special effect information corresponding to the current trigger operation based on the external object according to the correspondence between the trigger position information and the special effect information and the first position information comprises:
determining a touch position of the current touch operation on the touch screen;
and determining special effect information corresponding to the current touch operation according to the corresponding relation between the touch position and the special effect information and the touch position of the current touch operation on the touch screen.
6. The method according to claim 3 or 4, wherein, when the trigger operation based on the external object comprises a touch operation, the step of determining the special effect information corresponding to the current trigger operation based on the external object according to the correspondence between the trigger position information and the special effect information and the first position information comprises:
determining a relative positional relationship between the touch position of the current touch operation on the touch screen and a first reference position;
and determining the special effect information corresponding to the touch operation according to the correspondence between relative positional relationships and special effect information and the relative positional relationship between the current touch position and the first reference position.
7. A user interaction device, comprising:
the detection module is used for detecting the triggering operation of a user;
the execution module is used for executing the trigger operation when the detection module detects the trigger operation of the user;
and the display module is used for displaying the special effect information corresponding to the trigger operation.
8. The apparatus of claim 7, further comprising: a determination module;
the determining module is configured to determine special effect information corresponding to the trigger operation.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the method of any one of claims 1-6.
10. A terminal device, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the operation corresponding to the user interaction method of any one of claims 1-6.
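As an illustrative sketch only (not part of the claims), the two touch-position lookups of claims 5 and 6 could be realized along these lines. The region boundaries, the reference position, the distance threshold, and the effect names are all hypothetical assumptions:

```python
# Hypothetical sketch of the touch-position lookups in claims 5 and 6.
# Region boundaries, reference position, threshold, and effect names
# are illustrative assumptions, not taken from the patent.

def effect_for_touch(x: float, y: float, screen_w: float, screen_h: float) -> str:
    """Claim 5 style: choose effect info from the absolute touch position."""
    if y < screen_h / 2:
        return "snow" if x < screen_w / 2 else "rain"
    return "bubbles" if x < screen_w / 2 else "fireworks"

def effect_for_relative_position(x: float, y: float,
                                 ref: tuple = (0.0, 0.0)) -> str:
    """Claim 6 style: choose effect info from the position relative to a reference."""
    dx, dy = x - ref[0], y - ref[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "near_glow" if distance < 100 else "far_trail"
```

The difference between the two claims is only which key drives the lookup: the absolute touch coordinates (claim 5) versus the positional relationship to a first reference position (claim 6).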
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711072743.XA CN107765987A (en) | 2017-11-03 | 2017-11-03 | A kind of user interaction approach and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107765987A (en) | 2018-03-06 |
Family
ID=61272145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711072743.XA Pending CN107765987A (en) | 2017-11-03 | 2017-11-03 | A kind of user interaction approach and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107765987A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110208524A1 (en) * | 2010-02-25 | 2011-08-25 | Apple Inc. | User profiling for voice input processing |
US20140115510A1 (en) * | 2012-10-23 | 2014-04-24 | Lenovo (Beijing) Co., Ltd. | Information Processing Method And Electronic Device |
CN104407764A (en) * | 2013-10-29 | 2015-03-11 | 贵阳朗玛信息技术股份有限公司 | Method and device of presenting scene effect |
CN106296777A (en) * | 2016-07-29 | 2017-01-04 | 青岛海信电器股份有限公司 | The implementation method of a kind of broken animation and device |
CN106453859A (en) * | 2016-09-23 | 2017-02-22 | 维沃移动通信有限公司 | Voice control method and mobile terminal |
CN106569658A (en) * | 2016-10-19 | 2017-04-19 | 武汉悦然心动网络科技股份有限公司 | Multimedia theme configuration method and multimedia theme configuration system of input method |
CN106570921A (en) * | 2016-11-18 | 2017-04-19 | 广东小天才科技有限公司 | Expression display method and system for cartoon characters |
CN106878825A (en) * | 2017-01-09 | 2017-06-20 | 腾讯科技(深圳)有限公司 | Based on live sound effect methods of exhibiting and device |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110134480A (en) * | 2019-05-20 | 2019-08-16 | 北京字节跳动网络技术有限公司 | Processing method, device, electronic equipment and the storage medium of user's trigger action |
CN110134480B (en) * | 2019-05-20 | 2023-06-13 | 抖音视界有限公司 | User trigger operation processing method and device, electronic equipment and storage medium |
CN110688046A (en) * | 2019-09-24 | 2020-01-14 | 腾讯音乐娱乐科技(深圳)有限公司 | Song playing method and device and storage medium |
CN110688046B (en) * | 2019-09-24 | 2022-02-25 | 腾讯音乐娱乐科技(深圳)有限公司 | Song playing method and device and storage medium |
CN112148188A (en) * | 2020-09-23 | 2020-12-29 | 北京市商汤科技开发有限公司 | Interaction method and device in augmented reality scene, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11320959B2 (en) | Mobile terminal and method of controlling the same | |
US9928028B2 (en) | Mobile terminal with voice recognition mode for multitasking and control method thereof | |
EP2637086B1 (en) | Mobile terminal | |
CN107621922B (en) | Screen splitting operation method and device | |
CN102238282B (en) | Mobile terminal capable of providing multiplayer game and operating method thereof | |
US10042596B2 (en) | Electronic device and method for controlling the same | |
WO2021115194A1 (en) | Application icon display method and electronic device | |
CN107741820B (en) | Input method keyboard display method and mobile terminal | |
EP2693321B1 (en) | Mobile terminal and control method thereof | |
EP2669784A1 (en) | Mobile terminal and control method thereof | |
CN107908952B (en) | Method and device for identifying real machine and simulator and terminal | |
EP2402898A1 (en) | Displaying advertisements on a mobile terminal | |
CN107102806A (en) | A kind of split screen input method and mobile terminal | |
US11138956B2 (en) | Method for controlling display of terminal, storage medium, and electronic device | |
KR20140143556A (en) | Portable terminal and method for user interface in the portable terminal | |
CN107423018B (en) | A kind of multi-screen display method and terminal | |
CN105144068A (en) | Application program display method and terminal | |
CN107786906A (en) | The method and apparatus that a kind of browser plays video in independent window | |
CN110795007A (en) | Method and device for acquiring screenshot information | |
CN107765987A (en) | A kind of user interaction approach and device | |
KR20090121033A (en) | Mobile terminal using of proximity touch and wallpaper controlling method therefor | |
CN104375756A (en) | Touch operation method and touch operation device | |
US20140189554A1 (en) | Mobile terminal and control method thereof | |
KR101855938B1 (en) | Mobile terminal and operation control method thereof | |
KR20110055057A (en) | Mobile terminal and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180306 |