CN110420457B - Suspension operation method, suspension operation device, terminal and storage medium - Google Patents


Publication number
CN110420457B
CN110420457B (application CN201811161371.2A)
Authority
CN
China
Prior art keywords
virtual
user interface
graphical user
controlling
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811161371.2A
Other languages
Chinese (zh)
Other versions
CN110420457A (en)
Inventor
黄华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201811161371.2A
Publication of CN110420457A
Application granted
Publication of CN110420457B


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets

Abstract

Embodiments of the invention provide a hover operation method, device, terminal, and storage medium. The method is applied to a terminal that can display a graphical user interface on a display screen, where the content displayed by the graphical user interface includes at least a virtual scene and a virtual character. The method includes: detecting a non-contact operation in a preset range area in front of the graphical user interface; and controlling the virtual character to perform a virtual shooting operation according to the distance between the non-contact operation and the graphical user interface. On the one hand, this reduces dependence on virtual controls, so fewer virtual controls are needed, less display area is occupied, and an immersive user experience is achieved; on the other hand, because the user operates in space, there are more operation dimensions and larger differences between operations, so the accuracy of recognizing user operations can be improved and the probability of misoperation reduced.

Description

Suspension operation method, suspension operation device, terminal and storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a suspension operation method, apparatus, terminal, and storage medium.
Background
With the development of technology, terminals such as mobile phones and tablet computers have higher and higher utilization rates in various aspects of work, study, daily communication and the like.
The user often installs virtual social application, 3D map, game and other applications in the terminal to meet the demands of user work, entertainment, social contact and the like.
Wherein the game may provide a virtual scene, such as a street, farm, island, etc., in which virtual characters are presented, and for shooting-type games, the user may control the virtual characters to perform virtual shooting operations.
Currently, in a terminal, a virtual shooting operation is generally performed using the following two modes:
In the first mode, the application provides a virtual control, and the user clicks the virtual control to trigger the virtual shooting operation corresponding to that control.
As shown in fig. 1A, taking a shooting game as an example, a user may click on a virtual control to perform a virtual shooting operation, turn on/off a magnifier, or the like.
In the second mode, the application cancels the virtual control, and the user directly performs touch operation on an interface of the application, and triggers corresponding virtual shooting operation by identifying the touch operation.
As shown in fig. 1B, taking a certain shooting game as an example, the user may shoot by tapping or long-pressing directly on the interface: for example, long-pressing turns on the magnifier, and releasing the finger performs the virtual shooting operation.
For the first mode, because the display screen of a terminal is relatively small, the virtual controls occupy a certain display area and affect the visual effect.
In addition, in applications such as games where user operations are frequent, the user needs to be familiar with the positions of the virtual controls, which makes operation cumbersome.
For the second mode, the user operates on a plane, so the operation dimensions are few and the operation gestures are similar; the accuracy of recognizing user operations is therefore low, misoperation occurs easily, and user experience suffers.
Taking a shooting game as an example, a user may slide sideways intending to move the controlled virtual character, but the gesture may be recognized as a shooting operation, causing a hidden character to expose its position.
Disclosure of Invention
The embodiments of the invention provide a hover operation method, device, terminal, and storage medium, to solve the problems that virtual controls occupy display area and that touch operations are recognized with low accuracy, easily causing misoperation.
In a first aspect, an embodiment of the present invention provides a hover operation method, applied to a terminal capable of presenting a graphical user interface on a display screen, where the content presented by the graphical user interface includes at least a virtual scene and a virtual character, and the method includes:
detecting a non-contact operation in a preset range area in front of the graphical user interface;
and controlling the virtual character to execute virtual shooting operation according to the distance between the non-contact operation and the graphical user interface.
Optionally, the controlling the virtual character to execute the virtual shooting operation according to the distance between the non-contact operation and the graphical user interface includes:
detecting a movement track of the non-contact operation;
controlling visual field presentation of the virtual scene in the graphical user interface according to the movement track;
and controlling the virtual character to execute virtual shooting operation towards a preset position in the visual field presentation according to the distance.
Optionally, the method further comprises:
detecting a sliding operation acting on the graphical user interface;
and controlling the visual field presentation of the virtual scene in the graphical user interface according to the sliding operation.
Optionally, the controlling the virtual character to execute the virtual shooting operation according to the distance between the non-contact operation and the graphical user interface includes:
and if the distance between the non-contact operation and the graphical user interface is within a preset first threshold range, controlling the virtual character to execute single shooting operation.
Optionally, the controlling the virtual character to execute the virtual shooting operation according to the distance between the non-contact operation and the graphical user interface includes:
and if the distance between the non-contact operation and the graphical user interface is within a preset second threshold range, controlling the virtual character to execute continuous shooting operation.
In a second aspect, an embodiment of the present invention provides a hover operation device applied to a terminal capable of presenting a graphical user interface in a display screen, where content presented by the graphical user interface includes at least a virtual scene and a virtual character, the device includes:
the non-contact operation detection module is used for detecting non-contact operation in a preset range area in front of the graphical user interface;
and the virtual shooting operation control module is used for controlling the virtual character to execute virtual shooting operation according to the distance between the non-contact operation and the graphical user interface.
Optionally, the virtual shooting operation control module includes:
a movement track detection sub-module for detecting the movement track of the non-contact operation;
a movement track control sub-module for controlling the visual field presentation of the virtual scene in the graphical user interface according to the movement track;
and the visual field shooting sub-module is used for controlling the virtual character to execute virtual shooting operation towards a preset position in the visual field presentation according to the distance.
Optionally, the device further comprises:
a sliding operation detection module for detecting a sliding operation acting on the graphical user interface;
and the visual field presentation control module is used for controlling visual field presentation of the virtual scene in the graphical user interface according to the sliding operation.
Optionally, the virtual shooting operation control module includes:
and the single shooting operation execution sub-module is used for controlling the virtual character to execute the single shooting operation if the distance between the non-contact operation and the graphical user interface is within a preset first threshold range.
Optionally, the virtual shooting operation control module includes:
and the continuous shooting operation execution sub-module is used for controlling the virtual character to execute continuous shooting operation if the distance between the non-contact operation and the graphical user interface is within a preset second threshold range.
In a third aspect, an embodiment of the present invention provides a terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the hover operation method.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the hover operation method.
In the embodiments of the invention, a non-contact operation within a preset range in front of the graphical user interface is detected, and the virtual character is controlled to perform a virtual shooting operation according to the distance between the non-contact operation and the graphical user interface, so that the virtual shooting operation is controlled by a hover operation. On the one hand, dependence on virtual controls is reduced: fewer virtual controls are needed, less display area is occupied, and an immersive user experience is achieved. On the other hand, because the user operates in space, there are more operation dimensions and larger differences between operations, so the accuracy of recognizing user operations can be improved and the probability of misoperation reduced.
Drawings
Fig. 1A and 1B are exemplary diagrams of a conventional operation mode.
FIG. 2 is a flow chart of an embodiment of a hover operation method of the present invention.
Fig. 3 is an exemplary diagram of a hover operation of the present invention.
FIG. 4 is a flow chart of another embodiment of a hover operation method of the present invention.
Fig. 5A-5B are exemplary diagrams of a movement track of the present invention.
Fig. 6 is a block diagram of a hover operation device according to the present invention.
Fig. 7 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Referring to fig. 2, a flowchart of an embodiment of a hover operation method of the present invention is shown, which may specifically include the following steps:
step 201, detecting a non-contact operation in a preset range area in front of the graphical user interface.
In a specific implementation, the embodiment of the present invention may be applied to a terminal that can present a graphical user interface on a display screen, for example, a mobile phone, a tablet computer, or a wearable device (such as VR (Virtual Reality) glasses, a VR headset, or a smart watch); the embodiment of the present invention is not limited in this respect.
Further, the display screen may be a touch screen, i.e. a touch operation may be provided in addition to displaying a graphical user interface.
In addition, the terminal supports a hover operation above the display screen, i.e., operation without contact with the display screen.
In one embodiment, infrared emitters and photodiodes are used as sensors in the terminal: an infrared signal is emitted by the emitter and, after being reflected by an object above the screen, is captured by the photodiode. By deploying a plurality of such sensors, fingers, palms, and other objects hovering above the display screen can be detected as hovering objects for controlling applications.
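The sensing scheme above can be illustrated with a small sketch. This is not the patent's implementation, which would operate at the driver level on raw sensor hardware; the inverse-square reflection model, the `noise_floor`, and the constant `k` are illustrative assumptions.

```python
# Illustrative sketch: estimating hover presence and distance from the
# reflected-intensity readings of several IR emitter/photodiode pairs.
# The sensor model and all constants are assumptions, not from the patent.

def detect_hover(intensities, noise_floor=0.05, k=1.0):
    """Return (hovering, distance_cm) from photodiode intensities in [0, 1].

    A reflected IR signal falls off roughly with the square of the distance
    to the reflecting object, so distance is modeled as sqrt(k / intensity).
    """
    peak = max(intensities)
    if peak <= noise_floor:          # nothing reflecting above the screen
        return False, None
    distance_cm = (k / peak) ** 0.5  # assumed inverse-square model
    return True, distance_cm
```

With `k=1.0`, a peak reading of 0.25 maps to a hover distance of 2.0 cm; readings at or below the noise floor report no hovering object.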
The operating system of the terminal may be Android, iOS, Windows Phone, Windows, etc., and may support running various applications, such as virtual social applications, 3D maps, and games.
The graphical user interfaces of these applications may be displayed on a display screen, with the content presented at the graphical user interface including at least virtual scenes and virtual objects.
For games, the virtual scene may include a building scene, a MOBA (Multiplayer Online Battle Arena) scene, a shooting scene, etc., and the virtual objects may include objects that the user can control, such as virtual buildings and virtual characters.
In the embodiment of the invention, the user performs the non-contact operation with one hand in a preset range area above the display screen (which displays the graphical user interface); the non-contact operation in the preset range area in front of the display screen can then be detected.
And 202, controlling the virtual character to execute virtual shooting operation according to the distance between the non-contact operation and the graphical user interface.
In the embodiment of the invention, the distance between the non-contact operation of the user and the display screen (displaying the graphical user interface) can be measured, and the virtual character is controlled to execute the virtual shooting operation according to the distance.
In particular implementations, the virtual shooting operation may include a single shooting operation and a continuous shooting operation.
In one case, if the distance between the non-contact operation and the graphical user interface is within a preset first threshold range, the virtual character is controlled to execute a single shooting operation.
In another case, if the distance between the non-contact operation and the graphical user interface is within a preset second threshold range, the virtual character is controlled to execute continuous shooting operation.
Typically, the values of the second threshold range are smaller than the values of the first threshold range.
As shown in fig. 3, taking a shooting game as an example, the user can hover a hand above the display screen and move it vertically downward in the direction of the arrow, continuously reducing the distance between the hand and the display screen (which displays the graphical user interface); control of the virtual character can thereby switch from a single shooting operation to a continuous shooting operation.
If the user then withdraws the hand so that it leaves the area above the display screen, the virtual character may be controlled to stop performing the virtual shooting operation.
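The two threshold cases, plus the stop condition when the hand leaves the hover area, can be sketched as a single mapping from hover distance to firing action. The numeric ranges below are hypothetical; the description only requires that the continuous-fire range lie nearer the screen than the single-shot range.

```python
# A minimal sketch of the distance-to-action mapping. Threshold values
# are assumptions; the patent only fixes their relative order (the
# continuous range is closer to the screen than the single-shot range).

SINGLE_RANGE = (4.0, 8.0)      # cm: single shooting operation
CONTINUOUS_RANGE = (0.5, 4.0)  # cm: continuous shooting operation

def firing_mode(distance_cm):
    """Map hover distance to 'single', 'continuous', or None (stop firing)."""
    if distance_cm is None:                      # hand left the hover area
        return None
    lo, hi = CONTINUOUS_RANGE
    if lo <= distance_cm < hi:
        return "continuous"
    lo, hi = SINGLE_RANGE
    if lo <= distance_cm < hi:
        return "single"
    return None                                  # outside both ranges
```

As the hand moves downward through these ranges, the returned mode switches from `"single"` to `"continuous"`, matching the behavior described for fig. 3.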
In the embodiments of the invention, a non-contact operation within a preset range in front of the graphical user interface is detected, and the virtual character is controlled to perform a virtual shooting operation according to the distance between the non-contact operation and the graphical user interface, so that the virtual shooting operation is controlled by a hover operation. On the one hand, dependence on virtual controls is reduced: fewer virtual controls are needed, less display area is occupied, and an immersive user experience is achieved. On the other hand, because the user operates in space, there are more operation dimensions and larger differences between operations, so the accuracy of recognizing user operations can be improved and the probability of misoperation reduced.
Referring to fig. 4, a flowchart of another embodiment of a hover operation method of the present invention is shown, which may specifically include the following steps:
step 401, detecting a sliding operation acting on the graphical user interface.
Step 402, controlling the visual field presentation of the virtual scene in the graphical user interface according to the sliding operation.
In the embodiment of the invention, the user places one hand (for example, the left hand) on the display screen to perform a contact operation, i.e., a sliding operation on the graphical user interface.
At this time, the visual field presentation of the virtual scene in the graphical user interface may be controlled according to the sliding operation.
In general, the user can place a hand on the display screen and slide it leftward, rightward, upward, or downward, thereby controlling the displayed view of the virtual scene to move horizontally left, horizontally right, vertically up, or vertically down, respectively.
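Steps 401 and 402 amount to translating a slide delta into a pan of the displayed view. A minimal sketch, assuming a simple linear mapping with a made-up `sensitivity` factor:

```python
# Hypothetical mapping of a touch-slide delta (dx, dy) to a pan of the
# displayed field of view. Sensitivity and coordinate conventions are
# illustrative assumptions, not from the patent text.

def pan_view(view, dx, dy, sensitivity=0.5):
    """view: (x, y) center of the displayed field of view, in scene units."""
    return (view[0] + dx * sensitivity, view[1] + dy * sensitivity)
```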
Step 403, detecting a non-contact operation in a preset range area in front of the graphical user interface.
Step 404, detecting a movement track of the non-contact operation.
In the embodiment of the invention, the user can then lift that hand (for example, the left hand), or hover the other hand (for example, the right hand), above the display screen to perform the non-contact operation, controlling the virtual character to perform the virtual shooting operation and thereby shooting while aiming.
Above the display screen, the position of the non-contact operation can be sampled at a certain frequency, and the series of positions is ordered in time to generate the movement track of the non-contact operation.
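Step 404 (sampling positions at a fixed frequency and ordering them in time) can be sketched as follows; the sampling interface and the `(timestamp, x, y)` representation are assumptions for illustration.

```python
# Sketch of step 404: collecting hover-position samples over time into a
# movement track. The data layout and helper names are hypothetical.

import time

class TrackRecorder:
    def __init__(self):
        self.track = []          # list of (timestamp, x, y) samples

    def sample(self, x, y, t=None):
        """Record one hover-position sample; timestamps keep time order."""
        t = time.monotonic() if t is None else t
        self.track.append((t, x, y))

    def displacement(self):
        """Net (dx, dy) between the first and last sampled positions."""
        if len(self.track) < 2:
            return (0.0, 0.0)
        _, x0, y0 = self.track[0]
        _, x1, y1 = self.track[-1]
        return (x1 - x0, y1 - y0)
```

A caller would invoke `sample()` at the detection frequency and read `displacement()` (or successive deltas) to drive the view, as in step 405.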
And step 405, controlling the visual field presentation of the virtual scene in the graphical user interface according to the movement track.
In a specific implementation, the visual field presentation of the virtual scene in the graphical user interface is controlled to shift in the corresponding direction based on the movement track of the non-contact operation.
Further, a virtual camera is disposed in the virtual scene; by moving the virtual camera along the direction corresponding to the movement track, the effect of a moving field of view over the virtual scene can be displayed.
As shown in fig. 5A, taking a shooting game as an example, the user can hover a hand above the display screen and move it horizontally left and right in the direction of the arrow, thereby controlling the horizontal left-right movement of the field of view of the virtual scene.
As shown in fig. 5B, taking a shooting game as an example, the user can hover a hand above the display screen and move it up and down in the direction of the arrow, thereby controlling the vertical up-down movement of the field of view of the virtual scene.
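The virtual-camera movement described above can be sketched as a mapping from a track displacement to yaw/pitch deltas. The sensitivity value, the sign conventions, and the pitch clamp are illustrative assumptions, not taken from the patent.

```python
# Sketch of step 405: converting a hover-track displacement (dx, dy)
# into virtual-camera angle changes. Constants are assumptions.

def update_camera(yaw, pitch, dx, dy, sensitivity=0.1):
    """Return the new (yaw, pitch) after applying a track displacement."""
    yaw += dx * sensitivity               # horizontal hand movement -> yaw
    pitch += -dy * sensitivity            # vertical hand movement -> pitch
    pitch = max(-89.0, min(89.0, pitch))  # clamp to avoid flipping the view
    return yaw, pitch
```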
And step 406, controlling the virtual character to execute virtual shooting operation towards a preset position in the visual field presentation according to the distance between the non-contact operation and the graphical user interface.
In the embodiment of the invention, a single non-contact operation thus serves two purposes: on the one hand, the visual field presentation of the virtual scene in the graphical user interface is controlled according to the movement track; on the other hand, the virtual character is controlled, according to the distance between the non-contact operation and the graphical user interface, to perform the virtual shooting operation toward a preset position in the visual field presentation, thereby shooting while aiming.
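Putting steps 403 to 406 together, one possible per-frame loop uses the same hover sample both to steer the view and to select the firing mode. All names, thresholds, and sensitivities here are illustrative assumptions.

```python
# Hypothetical per-frame handler for the "aim while shooting" behavior:
# the same hover sample updates the view angles (from the track) and
# selects the firing action (from the distance to the screen).

def process_hover_frame(state, x, y, distance_cm):
    """state: dict with 'yaw', 'pitch', 'last_pos'. Returns the fire action."""
    if state["last_pos"] is not None:
        dx = x - state["last_pos"][0]
        dy = y - state["last_pos"][1]
        state["yaw"] += dx * 0.1          # steer the view from the track
        state["pitch"] -= dy * 0.1
    state["last_pos"] = (x, y)
    if distance_cm < 4.0:                 # nearer range: continuous fire
        return "continuous"
    if distance_cm < 8.0:                 # farther range: single shot
        return "single"
    return None                           # too far: no shooting
```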
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to fig. 6, there is shown a block diagram of a hover operation device of the present invention, applied to a terminal capable of presenting a graphical user interface on a display screen, where the content presented by the graphical user interface includes at least a virtual scene and a virtual character, and the device may specifically include the following modules:
a non-contact operation detection module 601, configured to detect a non-contact operation in a preset range area in front of the graphical user interface;
and a virtual shooting operation control module 602, configured to control the virtual character to perform a virtual shooting operation according to the distance between the non-contact operation and the graphical user interface.
In one embodiment of the present invention, the virtual shooting operation control module 602 includes:
a movement track detection sub-module for detecting the movement track of the non-contact operation;
a movement track control sub-module for controlling the visual field presentation of the virtual scene in the graphical user interface according to the movement track;
and the visual field shooting sub-module is used for controlling the virtual character to execute virtual shooting operation towards a preset position in the visual field presentation according to the distance.
In one embodiment of the invention, the device further comprises:
a sliding operation detection module for detecting a sliding operation acting on the graphical user interface;
and the visual field presentation control module is used for controlling visual field presentation of the virtual scene in the graphical user interface according to the sliding operation.
In one embodiment of the present invention, the virtual shooting operation control module 602 includes:
and the single shooting operation execution sub-module is used for controlling the virtual character to execute the single shooting operation if the distance between the non-contact operation and the graphical user interface is within a preset first threshold range.
In one embodiment of the present invention, the virtual shooting operation control module 602 includes:
and the continuous shooting operation execution sub-module is used for controlling the virtual character to execute continuous shooting operation if the distance between the non-contact operation and the graphical user interface is within a preset second threshold range.
The device provided by the embodiment of the present invention can implement each process implemented by the terminal in the method embodiments of fig. 2 to 5; to avoid repetition, details are not repeated here.
In the embodiments of the invention, a non-contact operation within a preset range in front of the graphical user interface is detected, and the virtual character is controlled to perform a virtual shooting operation according to the distance between the non-contact operation and the graphical user interface, so that the virtual shooting operation is controlled by a hover operation. On the one hand, dependence on virtual controls is reduced: fewer virtual controls are needed, less display area is occupied, and an immersive user experience is achieved. On the other hand, because the user operates in space, there are more operation dimensions and larger differences between operations, so the accuracy of recognizing user operations can be improved and the probability of misoperation reduced.
Fig. 7 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
The terminal 700 includes, but is not limited to: radio frequency unit 701, network module 702, audio output unit 703, input unit 704, sensor 705, display unit 706, user input unit 707, interface unit 708, memory 709, processor 710, and power supply 711. It will be appreciated by those skilled in the art that the terminal structure shown in fig. 7 is not limiting of the terminal and that the terminal may include more or fewer components than shown, or may combine certain components, or a different arrangement of components. In the embodiment of the invention, the terminal comprises, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
Wherein the processor 710 is configured to detect a non-contact operation in a predetermined range area in front of the graphical user interface; and controlling the virtual character to execute virtual shooting operation according to the distance between the non-contact operation and the graphical user interface.
In the embodiments of the invention, a non-contact operation within a preset range in front of the graphical user interface is detected, and the virtual character is controlled to perform a virtual shooting operation according to the distance between the non-contact operation and the graphical user interface, so that the virtual shooting operation is controlled by a hover operation. On the one hand, dependence on virtual controls is reduced: fewer virtual controls are needed, less display area is occupied, and an immersive user experience is achieved. On the other hand, because the user operates in space, there are more operation dimensions and larger differences between operations, so the accuracy of recognizing user operations can be improved and the probability of misoperation reduced.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used to receive and transmit signals during information transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 710 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 701 may also communicate with networks and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 702, such as helping the user to send and receive e-mail, browse web pages, access streaming media, etc.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the terminal 700. The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a graphics processor (Graphics Processing Unit, GPU) 7041 and a microphone 7042; the graphics processor 7041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 can receive sound and process it into audio data. In a telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 701.
The terminal 700 also includes at least one sensor 705, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 7061 and/or the backlight when the terminal 700 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in various directions (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to recognize terminal posture (such as switching between portrait and landscape, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 705 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described again here.
The display unit 706 is used to display information input by a user or information provided to the user. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by the user on or near the touch panel 7071 using any suitable object or accessory such as a finger or a stylus). The touch panel 7071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 710, and receives and executes commands sent from the processor 710. In addition, the touch panel 7071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 7071 may be overlaid on the display panel 7061. When the touch panel 7071 detects a touch operation on or near it, the operation is transmitted to the processor 710 to determine the type of the touch event, and the processor 710 then provides a corresponding visual output on the display panel 7061 according to that type. Although in Fig. 7 the touch panel 7071 and the display panel 7061 are two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the terminal, which is not limited here.
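The detection-device → touch-controller → processor pipeline described above can be sketched in software roughly as follows. All names, the 12-bit raw coordinate range, and the screen resolution are illustrative assumptions; the passage describes hardware components, not an API.

```python
# Rough software analogy of the touch pipeline in the passage:
# detection device -> touch controller (raw signal to coordinates)
# -> processor (classify the event, choose a visual response).
# The 4096-unit (12-bit) raw range and screen size are assumptions.

def touch_controller(raw_signal, screen_w=1080, screen_h=1920):
    """Convert a raw detection signal into touch-point coordinates."""
    raw_x, raw_y = raw_signal                # e.g. 12-bit ADC readings
    x = raw_x * screen_w // 4096
    y = raw_y * screen_h // 4096
    return x, y

def processor(coords, pressed):
    """Classify the touch event and describe a visual response."""
    event = "tap" if pressed else "release"
    return {"event": event, "at": coords}

coords = touch_controller((2048, 1024))
print(processor(coords, pressed=True))
# prints {'event': 'tap', 'at': (540, 480)}
```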
The interface unit 708 is an interface through which external devices are connected to the terminal 700. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information or power) from an external device and transmit the received input to one or more elements within the terminal 700, or may be used to transmit data between the terminal 700 and an external device.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a program storage area and a data storage area; the program storage area may store an operating system and application programs required for at least one function (such as a sound playing function or an image playing function), while the data storage area may store data created according to the use of the handset (such as audio data or a phonebook). In addition, the memory 709 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is the control center of the terminal. It connects the various parts of the entire terminal using various interfaces and lines, and performs the terminal's functions and processes data by running or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby monitoring the terminal as a whole. The processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which primarily handles the operating system, user interface, applications, etc., with a modem processor, which primarily handles wireless communication. It will be appreciated that the modem processor need not be integrated into the processor 710.
The terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components, and the power supply 711 may preferably be logically coupled to the processor 710 via a power management system, such as to perform charge, discharge, and power management functions via the power management system.
In addition, the terminal 700 includes some functional modules, which are not shown, and will not be described herein.
Preferably, an embodiment of the present invention further provides a terminal, which includes a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710. When executed by the processor 710, the computer program implements the respective processes of the above suspension operation method embodiment and can achieve the same technical effects; to avoid repetition, a detailed description is omitted here.
An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the processes of the above suspension operation method embodiment and can achieve the same technical effects; to avoid repetition, the description is omitted here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or alternatively by hardware, although in many cases the former is preferred. Based on such an understanding, the technical solution of the present invention, or the part of it contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present invention, those of ordinary skill in the art may devise many further forms without departing from the spirit of the present invention and the scope of the claims, all of which fall within the protection of the present invention.

Claims (8)

1. A hover operation method, applied to a terminal capable of presenting a graphical user interface in a display screen, wherein content presented by the graphical user interface at least includes a virtual scene and a virtual character, the method comprising:
detecting a non-contact operation within a preset range area in front of the graphical user interface;
controlling the virtual character to execute virtual shooting operation according to the distance between the non-contact operation and the graphical user interface;
wherein, according to the distance between the non-contact operation and the graphical user interface, controlling the virtual character to execute virtual shooting operation includes:
if the distance between the non-contact operation and the graphical user interface is within a preset first threshold range, controlling the virtual character to execute single shooting operation;
and if the distance between the non-contact operation and the graphical user interface is within a preset second threshold range, controlling the virtual character to execute continuous shooting operation.
2. The method of claim 1, wherein controlling the virtual character to perform a virtual shooting operation according to a distance between the non-contact operation and the graphical user interface comprises:
detecting a movement track of the non-contact operation;
controlling visual field presentation of the virtual scene in the graphical user interface according to the movement track;
and controlling the virtual character to execute virtual shooting operation towards a preset position in the visual field presentation according to the distance.
3. The method according to claim 1, wherein the method further comprises:
detecting a sliding operation acting on the graphical user interface;
and controlling the visual field presentation of the virtual scene in the graphical user interface according to the sliding operation.
4. A hover operational device for a terminal capable of presenting a graphical user interface on a display, the graphical user interface presenting content including at least a virtual scene and a virtual character, the device comprising:
the non-contact operation detection module is used for detecting non-contact operation in a preset range area in front of the graphical user interface;
the virtual shooting operation control module is used for controlling the virtual character to execute virtual shooting operation according to the distance between the non-contact operation and the graphical user interface;
wherein, virtual shooting operation control module includes:
a single shooting operation execution sub-module, configured to control the virtual character to execute a single shooting operation if a distance between the non-contact operation and the graphical user interface is within a preset first threshold range;
and the continuous shooting operation execution sub-module is used for controlling the virtual character to execute continuous shooting operation if the distance between the non-contact operation and the graphical user interface is within a preset second threshold range.
5. The apparatus of claim 4, wherein the virtual shooting operation control module comprises:
a movement track detection sub-module for detecting the movement track of the non-contact operation;
a movement track control sub-module for controlling the visual field presentation of the virtual scene in the graphical user interface according to the movement track;
and the visual field shooting sub-module is used for controlling the virtual character to execute virtual shooting operation towards a preset position in the visual field presentation according to the distance.
6. The apparatus of claim 5, wherein the apparatus further comprises:
a sliding operation detection module for detecting a sliding operation acting on the graphical user interface;
and the visual field presentation control module is used for controlling visual field presentation of the virtual scene in the graphical user interface according to the sliding operation.
7. A terminal comprising a processor, a memory and a computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps of the hover operation method according to any of claims 1-3.
8. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the steps of the hover operation method of any of claims 1 to 3.
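The threshold logic of claims 1 and 2 can be sketched as a small dispatcher. The concrete distance ranges and the `VirtualCharacter` API below are illustrative assumptions; the patent specifies only that two preset threshold ranges select between single and continuous shooting.

```python
# Illustrative sketch of the claimed hover-shooting dispatch (claims 1-2).
# The threshold values and class/method names are assumptions for
# demonstration; the patent does not fix concrete numbers or names.

SINGLE_SHOT_RANGE = (3.0, 5.0)   # cm above the screen (hypothetical)
BURST_RANGE = (0.5, 3.0)         # cm above the screen (hypothetical)

class VirtualCharacter:
    def __init__(self):
        self.actions = []

    def single_shot(self):
        self.actions.append("single")

    def continuous_shot(self):
        self.actions.append("burst")

def on_hover(character, distance_cm):
    """Map the hover distance to a virtual shooting operation."""
    lo, hi = SINGLE_SHOT_RANGE
    if lo <= distance_cm <= hi:           # first threshold range
        character.single_shot()
        return
    lo, hi = BURST_RANGE
    if lo <= distance_cm < hi:            # second threshold range
        character.continuous_shot()

c = VirtualCharacter()
on_hover(c, 4.0)   # within the first range  -> single shot
on_hover(c, 1.0)   # within the second range -> continuous fire
on_hover(c, 10.0)  # outside both ranges     -> no action
print(c.actions)   # prints ['single', 'burst']
```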
CN201811161371.2A 2018-09-30 2018-09-30 Suspension operation method, suspension operation device, terminal and storage medium Active CN110420457B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811161371.2A CN110420457B (en) 2018-09-30 2018-09-30 Suspension operation method, suspension operation device, terminal and storage medium


Publications (2)

Publication Number Publication Date
CN110420457A CN110420457A (en) 2019-11-08
CN110420457B true CN110420457B (en) 2023-09-08

Family

ID=68407284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811161371.2A Active CN110420457B (en) 2018-09-30 2018-09-30 Suspension operation method, suspension operation device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110420457B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061360B (en) * 2019-11-12 2023-08-22 北京字节跳动网络技术有限公司 Control method and device based on user head motion, medium and electronic equipment
CN110908571B (en) * 2019-11-28 2022-06-28 腾讯科技(深圳)有限公司 Control method, device and equipment of sliding control and storage medium
CN112274919A (en) * 2020-11-20 2021-01-29 网易(杭州)网络有限公司 Information processing method and device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467344A (en) * 2010-11-17 2012-05-23 索尼公司 System and method for display proximity based control of a touch screen user interface
CN103399669A (en) * 2013-08-14 2013-11-20 惠州Tcl移动通信有限公司 Non-contact operated mobile terminal and non-contact operation method thereof
CN106774907A * 2016-12-22 2017-05-31 腾讯科技(深圳)有限公司 Method and mobile terminal for adjusting a virtual object viewing area in a virtual scene
CN108052202A * 2017-12-11 2018-05-18 深圳市星野信息技术有限公司 3D interaction method and device, computer equipment and storage medium



Similar Documents

Publication Publication Date Title
EP4047940A1 (en) Screencast control method and electronic device
CN108182019B (en) Suspension control display processing method and mobile terminal
CN109499061B (en) Game scene picture adjusting method and device, mobile terminal and storage medium
CN108920059B (en) Message processing method and mobile terminal
CN109240577B (en) Screen capturing method and terminal
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
CN110874147B (en) Display method and electronic equipment
CN109032486B (en) Display control method and terminal equipment
CN109710349B (en) Screen capturing method and mobile terminal
CN110531915B (en) Screen operation method and terminal equipment
CN110830363B (en) Information sharing method and electronic equipment
CN107613095B (en) Incoming call processing method and mobile terminal
CN108958593B (en) Method for determining communication object and mobile terminal
CN107809534B (en) Control method, terminal and computer storage medium
CN110519512B (en) Object processing method and terminal
CN111092990A (en) Application program sharing method and electronic equipment
CN110420457B (en) Suspension operation method, suspension operation device, terminal and storage medium
CN110442279B (en) Message sending method and mobile terminal
CN110990172A (en) Application sharing method, first electronic device and computer-readable storage medium
CN111124706A (en) Application program sharing method and electronic equipment
CN110764675A (en) Control method and electronic equipment
CN110795402B (en) Method and device for displaying file list and electronic equipment
CN110941469B (en) Application splitting creation method and terminal equipment thereof
CN110413363B (en) Screenshot method and terminal equipment
CN111124569A (en) Application sharing method, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant