CN110955377A - Control method of virtual object and related device - Google Patents

Control method of virtual object and related device

Info

Publication number
CN110955377A
CN110955377A
Authority
CN
China
Prior art keywords
virtual object
touch screen
function
area
screen area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911185755.2A
Other languages
Chinese (zh)
Inventor
鲍慧翡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911185755.2A
Publication of CN110955377A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041 Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04108 Touchless 2D digitiser, i.e. a digitiser detecting the X/Y position of the input means (finger or stylus) even when it does not touch, but is proximate to, the digitiser's interaction surface, without distance measurement in the Z direction

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a control method for a virtual object and a related device. A function of a first virtual object is started in response to a trigger operation on the first virtual object, the first virtual object being located in a response area; a spaced (hover) touch screen area is determined according to the response area; a function of a second virtual object is then started in the spaced touch screen area in response to a spaced touch screen operation above that area; and the second virtual object is controlled in the spaced touch screen area. Control of a plurality of virtual objects within a limited response area is thereby realized, the virtual objects do not interfere with one another, the accuracy of the control process is improved, and the user experience is improved.

Description

Control method of virtual object and related device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and a related apparatus for controlling a virtual object.
Background
With the development of mobile terminal technologies, more and more smart devices appear in people's lives, and controlling virtual objects through these smart devices is especially prominent.
Generally, a virtual object is controlled by touching the screen, that is, by responding to a user's slide or tap operation on the screen of the smart device; two-handed control is adopted when a plurality of functions must be controlled at the same time.
However, the functions of virtual objects in smart devices are becoming more and more abundant, and in some scenes a user needs to control multiple elements in the same area. First, limited by user operation efficiency, multiple virtual objects cannot be controlled in the same screen area at the same time; second, even if automatically triggered virtual objects are used, the problem of mutual interference in the control of the multiple virtual objects remains, which affects the accuracy of the control process and reduces the user experience.
Disclosure of Invention
In view of this, the present application provides a method for controlling a virtual object, which can improve the accuracy of controlling multiple virtual objects.
A first aspect of the present application provides a method for controlling a virtual object, which may be applied to a system or a program for controlling a virtual object through a touch screen, and specifically includes: in response to a trigger operation on a first virtual object, starting a function of the first virtual object, wherein the first virtual object is located in a response area;
determining a spaced touch screen area according to the response area;
in response to a spaced touch screen operation above the spaced touch screen area, starting a function of a second virtual object in the spaced touch screen area, wherein the second virtual object is located outside the spaced touch screen area, and the spaced touch screen area is associated with the second virtual object through the spaced touch screen operation;
and controlling the second virtual object in the spaced touch screen area to realize the control of the first virtual object and the second virtual object in the response area.
Preferably, in some possible implementations of the present application, the controlling the second virtual object in the spaced touch screen area includes:
acquiring a moving path of the second virtual object in the spaced touch screen area;
controlling interface content under a function of the second virtual object based on the movement path.
Preferably, in some possible implementations of the present application, the controlling interface content under the function of the second virtual object based on the movement path includes:
determining a starting point and an end point of the moving path;
determining a motion vector according to the starting point and the end point;
and calculating the interface content corresponding to the motion vector according to a first preset algorithm.
Preferably, in some possible implementations of the present application, the controlling interface content under the function of the second virtual object based on the movement path includes:
determining the projection of the moving path in the target direction to obtain a projection path;
and calculating the interface content corresponding to the projection path according to a second preset algorithm.
Preferably, in some possible implementations of the present application, the starting, in response to the spaced touch screen operation above the spaced touch screen area, a function of a second virtual object in the spaced touch screen area includes:
detecting whether a trigger element hovers in the spaced touch screen area;
and if so, starting the function of the second virtual object.
Preferably, in some possible implementations of the present application, the method further includes:
detecting whether a trigger element hovers within a preset range of the spaced touch screen area, wherein the preset range is set based on the distance from the display interface;
and if so, starting the function of the second virtual object.
Preferably, in some possible implementations of the present application, the method further includes:
detecting the dwell time of the trigger element within the preset range;
and if the dwell time meets a preset condition, starting the function of the second virtual object.
Preferably, in some possible implementations of the present application, the method further includes:
and if the trigger element contacts the screen in the response area, closing the function of the second virtual object.
Preferably, in some possible implementations of the present application, the determining a spaced touch screen area according to the response area includes:
and projecting the response area on at least one preset plane to obtain the spaced touch screen area.
Preferably, in some possible implementations of the present application, the method further includes:
responding to a trigger operation on a third virtual object, and starting a function of the third virtual object;
controlling display content based on the function of the first virtual object, the function of the second virtual object, and the function of the third virtual object.
Preferably, in some possible implementations of the present application, the function of the first virtual object is control of a moving direction in a game scene, the function of the second virtual object is control of an observation angle of view in the game scene, and the function of the third virtual object is control of a direction of a line of sight in the game scene.
A second aspect of the present application provides an apparatus for virtual object control, comprising: a starting unit, configured to start a function of a first virtual object in response to a trigger operation on the first virtual object, the first virtual object being located in a response area;
a determining unit, configured to determine a spaced touch screen area according to the response area;
a spaced touch screen unit, configured to start a function of a second virtual object in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area, wherein the second virtual object is located outside the spaced touch screen area, and the spaced touch screen area is associated with the second virtual object through the spaced touch screen operation;
and a control unit, configured to control the second virtual object in the spaced touch screen area, so as to control the first virtual object and the second virtual object in the response area.
Preferably, in some possible implementation manners of the present application, the control unit is specifically configured to acquire a moving path of the second virtual object in the spaced touch screen area;
and is specifically configured to control interface content under the function of the second virtual object based on the movement path.
Preferably, in some possible implementations of the present application, the control unit is specifically configured to determine a starting point and an end point of the moving path;
is specifically configured to determine a motion vector according to the starting point and the end point;
and is specifically configured to calculate the interface content corresponding to the motion vector according to a first preset algorithm.
Preferably, in some possible implementations of the present application, the control unit is specifically configured to determine a projection of the moving path in a target direction to obtain a projection path;
and is specifically configured to calculate the interface content corresponding to the projection path according to a second preset algorithm.
Preferably, in some possible implementations of the present application, the spaced touch screen unit is specifically configured to detect whether a trigger element hovers in the spaced touch screen area;
and is specifically configured to start the function of the second virtual object if so.
Preferably, in some possible implementations of the present application, the spaced touch screen unit is further configured to detect whether a trigger element hovers within a preset range of the spaced touch screen area, where the preset range is set based on the distance from the display interface;
and is further configured to start the function of the second virtual object if so.
The spaced touch screen unit is further configured to detect the dwell time of the trigger element within the preset range;
and is further configured to start the function of the second virtual object if the dwell time meets a preset condition.
Preferably, in some possible implementations of the present application, the spaced touch screen unit is further configured to close the function of the second virtual object if the trigger element contacts the screen in the response area.
Preferably, in some possible implementations of the present application, the determining unit is specifically configured to project the response area on at least one preset plane to obtain the spaced touch screen area.
Preferably, in some possible implementations of the present application, the control unit is further configured to start a function of a third virtual object in response to a trigger operation on the third virtual object;
the control unit is further configured to control display content based on the function of the first virtual object, the function of the second virtual object, and the function of the third virtual object.
A third aspect of the present application provides a computer device, comprising: a memory, a processor, and a bus system; the memory is used for storing program code; the processor is configured to perform, according to instructions in the program code, the method of virtual object control according to the first aspect or any implementation of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the method of virtual object control according to the first aspect or any implementation of the first aspect.
According to the technical scheme, the embodiment of the application has the following advantages:
A function of a first virtual object is started in response to a trigger operation on the first virtual object, the first virtual object being located in a response area; a spaced touch screen area is determined according to the response area; a function of a second virtual object is then started in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area, the spaced touch screen area being associated with the second virtual object through the spaced touch screen operation; and the second virtual object is further controlled in the spaced touch screen area, so that the first virtual object and the second virtual object are controlled in the response area. Control of a plurality of virtual objects within a limited response area is thereby realized, the virtual objects do not interfere with one another, the accuracy of the control process is improved, and the user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only embodiments of the present application, and those skilled in the art can obtain other drawings from the provided drawings without creative effort.
FIG. 1 is a network architecture diagram for the operation of a virtual object control system;
FIG. 2 is a flow framework diagram of virtual object control according to an embodiment of the present application;
FIG. 3 is a flowchart of a method for controlling a virtual object according to an embodiment of the present application;
FIG. 4 is a schematic view of a scene of virtual object control according to an embodiment of the present application;
FIG. 5 is a schematic view of another scene of virtual object control according to an embodiment of the present application;
FIG. 6 is a schematic view of another scene of virtual object control according to an embodiment of the present application;
FIG. 7 is a schematic view of another scene of virtual object control according to an embodiment of the present application;
FIG. 8 is an interface schematic diagram of a method for controlling a virtual object according to an embodiment of the present application;
FIG. 9 is a schematic view of another scene of virtual object control according to an embodiment of the present application;
FIG. 10 is a schematic view of another scene of virtual object control according to an embodiment of the present application;
FIG. 11 is a flowchart of another method for controlling a virtual object according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of a virtual object control apparatus according to an embodiment of the present application;
FIG. 13 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a method and a related device for controlling a virtual object, which can be applied to a system or a program that controls a virtual object through a touch screen. A function of a first virtual object is started in response to a trigger operation on the first virtual object, the first virtual object being located in a response area; a spaced touch screen area is determined according to the response area; a function of a second virtual object is then started in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area, the spaced touch screen area being associated with the second virtual object through the spaced touch screen operation; and the second virtual object is further controlled in the spaced touch screen area, so that the first virtual object and the second virtual object are controlled in the response area. Control of a plurality of virtual objects within a limited response area is thereby realized, the virtual objects do not interfere with one another, the accuracy of the control process is improved, and the user experience is improved.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "corresponding" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that the virtual object control method provided by the present application may be applied to a system or a program that controls a virtual object through a touch screen, for example, a mobile-terminal touch screen game. Specifically, the virtual object control system may operate in the network architecture shown in fig. 1, which is a network architecture diagram of the virtual object control system. As can be seen from the figure, the virtual object control system can provide virtual object control with multiple information sources: in a client, the terminal receives program information sent by the server, executes the scene corresponding to a spaced touch screen operation in the program information, and thereby controls multiple virtual objects. It can be understood that although several terminal devices are shown in fig. 1, in an actual scene more or fewer types of terminal devices may participate in the process of controlling the virtual object; the specific number and types depend on the actual scene and are not limited here. Likewise, fig. 1 shows one server, but in an actual scenario multiple servers may participate, particularly in a scenario of multi-application control interaction; the number of servers depends on the actual scenario.
It should be noted that the virtual object control method provided in this embodiment may also be performed offline, that is, without the participation of a server; in this case, the terminal determines the locally stored program content, performs the related operations, and updates the display interface in response to the operations.
It is understood that the virtual object control system described above may run on a personal mobile terminal, for example as an application such as a touch-screen game; it may also run on a server, or on a third-party device, to provide virtual object control and obtain the control results of the information sources. For example, on a mobile terminal, the game mode is determined first, the corresponding response area is determined, and the spaced touch screen area is determined based on the response area so that another virtual object can be operated. The specific virtual object control system may run in the above devices in the form of a program, may run as a system component in the above devices, or may serve as one of several cloud service programs; the specific operation mode depends on the actual scene and is not limited here.
With the development of mobile terminal technologies, more and more smart devices appear in people's lives, and controlling virtual objects through these smart devices is especially prominent.
Generally, a virtual object is controlled by touching the screen, that is, by responding to a user's slide or tap operation on the screen of the smart device; two-handed control is adopted when a plurality of functions must be controlled at the same time.
However, the functions of virtual objects in smart devices are becoming more and more abundant, and in some scenes a user needs to control multiple elements in the same area. First, limited by user operation efficiency, multiple virtual objects cannot be controlled in the same screen area at the same time; second, even if automatically triggered virtual objects are used, the problem of mutual interference in the control of the multiple virtual objects remains, which affects the accuracy of the control process and reduces the user experience.
In order to solve the above problem, the present application provides a method for controlling a virtual object, which is applied to the flow framework for controlling a virtual object shown in fig. 2, a flow framework diagram provided in an embodiment of the present application. The server issues related program information, for example: game scenes, control strategies, control elements, and the like; the client then runs the related program, and the user operates the program through an operation end. When the relevant switching rule of the control strategy is met, the spaced touch screen operation is triggered in the corresponding screen area, and the program is controlled in combination with ordinary touch screen operations.
It should be understood that a game is described here only as an example; other control scenarios are also possible, and the specific form depends on the actual scenario, which is not limited here.
It can be understood that the method provided by the present application may be a program written as processing logic in a hardware system, or may be a virtual object control apparatus that implements the processing logic in an integrated or external manner. As one implementation, the virtual object control apparatus starts a function of a first virtual object in response to a trigger operation on the first virtual object, the first virtual object being located in a response area; determines a spaced touch screen area according to the response area; then, in response to a spaced touch screen operation above the spaced touch screen area, starts a function of a second virtual object in the spaced touch screen area, the spaced touch screen area being associated with the second virtual object through the spaced touch screen operation; and further controls the second virtual object in the spaced touch screen area, so that the first virtual object and the second virtual object are controlled in the response area. Control of a plurality of virtual objects within a limited response area is thereby realized, the virtual objects do not interfere with one another, the accuracy of the control process is improved, and the user experience is improved.
With reference to the above flow architecture, the method for controlling a virtual object in the present application is described below. Referring to fig. 3, which is a flowchart of a method for controlling a virtual object according to an embodiment of the present application, the embodiment includes at least the following steps:
301. In response to a trigger operation on the first virtual object, start the function of the first virtual object.
In this embodiment, the first virtual object is located in the response area. It can be understood that the response area may be the same size as the first virtual object, or may be an area expanded based on the first virtual object. For example, for a virtual object such as a game direction-control joystick, the response area may be a region expanded outward by a certain range around the joystick, since the user's operation may exceed the range corresponding to the joystick; the specific extension range depends on the actual scene and may be preset or adjusted according to user requirements, which is not limited here.
It can be understood that the trigger operation on the first virtual object may be a click or a drag, for example: clicking a running button in a game, or dragging a virtual joystick in a game to control the moving direction of the related object.
302. Determine a spaced touch screen area according to the response area.
In this embodiment, the spaced touch screen area is a contactless region in which the user can operate the related object without touching the screen; the response area can be projected onto at least one preset plane to obtain the spaced touch screen area.
Specifically, the projection surface may be two-dimensional. As shown in fig. 4, a scene schematic diagram of virtual object control provided in an embodiment of the present application, the scene includes a screen display interface A1, a first virtual object A2, a response area A3, and a spaced touch screen area A4. A user may control the first virtual object A2 within the range of the response area A3; the corresponding spaced touch screen area A4 may be determined based on the response area A3, and the user may operate other virtual objects in the spaced touch screen area A4.
It should be noted that the shape of the spaced touch screen area A4 in fig. 4 is similar to the shape of the response area A3, but in an actual scenario the spaced touch screen area A4 may be a region of any shape or size generated based on the response area A3; this is not limited here.
In another possible scenario, there may be multiple projection surfaces, that is, a three-dimensional space range. As shown in fig. 5, a schematic view of another virtual object control provided in an embodiment of the present application, the view includes a screen display interface B1, a first virtual object B2, a response area B3, a lower boundary surface B4 of the spaced touch screen area, an upper boundary surface B5 of the spaced touch screen area, and a height difference B6 between the upper and lower boundary surfaces. A user may control the first virtual object B2 within the range of the response area B3, and a corresponding spatial region may be determined based on the response area B3; this spatial region consists of the lower boundary surface B4, the upper boundary surface B5, and the height difference B6 between them, and the user may operate other virtual objects in this region.
It should be noted that, in fig. 5, the shapes of the lower boundary surface B4 and the upper boundary surface B5 of the spaced touch screen area are similar to the shape of the response area B3, but in an actual scenario they may be regions of any shape or size generated based on the response area B3, and the height difference B6 between the upper and lower boundary surfaces may be determined according to user settings or program settings, or adjusted according to user or program requirements, which is not limited here.
It can be understood that the three-dimensional space range described above uses two projection surfaces; a three-dimensional space formed by more projection surfaces is also possible, and it may be a regular or an irregular three-dimensional space, where the specific shape depends on the actual scene.
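To make the construction of step 302 concrete, the following is a minimal sketch of how a spaced touch screen area could be derived from a response area. The rectangular geometry, the outward margin, and the 5 mm-20 mm height band are illustrative assumptions; the application itself only requires projecting the response area onto at least one preset plane.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge in screen coordinates (mm)
    y: float  # top edge (mm)
    w: float  # width (mm)
    h: float  # height (mm)

    def expanded(self, margin: float) -> "Rect":
        # Grow the rectangle outward on all sides.
        return Rect(self.x - margin, self.y - margin,
                    self.w + 2 * margin, self.h + 2 * margin)

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

@dataclass
class SpacedTouchArea:
    footprint: Rect  # projection of the response area onto the screen plane
    z_min: float     # lower boundary surface height above the screen (mm)
    z_max: float     # upper boundary surface height above the screen (mm)

    def contains(self, px: float, py: float, pz: float) -> bool:
        # A hover point belongs to the area if it lies inside the projected
        # footprint and within the height band between the two boundaries.
        return self.footprint.contains(px, py) and self.z_min <= pz <= self.z_max

def spaced_area_from_response(response: Rect,
                              margin: float = 5.0,
                              z_min: float = 5.0,
                              z_max: float = 20.0) -> SpacedTouchArea:
    """Project the response area onto planes above the screen (step 302)."""
    return SpacedTouchArea(response.expanded(margin), z_min, z_max)
```

With two boundary surfaces (z_min < z_max) this reproduces the three-dimensional range of fig. 5; collapsing the band to a single plane reduces it to the two-dimensional case of fig. 4.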
303. In response to a spaced touch screen operation above the spaced touch screen area, start a function of a second virtual object in the spaced touch screen area.
In this embodiment, the second virtual object is located outside the spaced touch screen area, and the spaced touch screen area is associated with the second virtual object through the spaced touch screen operation. The association between the spaced touch screen operation and the second virtual object may be preset, or may be modified according to user requirements; that is, the spaced touch screen operation may also be associated with other virtual objects in order to operate those virtual objects.
Optionally, in order to ensure the accuracy of the moment at which the function of the second virtual object is started, it may be detected whether a trigger element hovers in the spaced touch screen area; and if so, the function of the second virtual object is started.
Optionally, since the spaced touch screen area is located at a distance from the screen, it may be detected whether a trigger element hovers within a preset range of the spaced touch screen area, where the preset range is set based on the distance from the display interface; and if so, the function of the second virtual object is started. For example: the region 5 mm-20 mm above the response area is the spaced touch screen area, and if the trigger element hovers within this range, the function of the second virtual object is started.
Optionally, in order to prevent a misjudgment caused by the trigger element merely sliding through the spaced touch screen area, the dwell time of the trigger element within the preset range may also be detected; and if the dwell time meets a preset condition, the function of the second virtual object is started. For example: if the trigger element stays in the region 5 mm-20 mm above the response area for 2 seconds, the function of the second virtual object is started.
It can be understood that the trigger element may be a finger or another operable limb of the user, or an operable physical component such as a stylus.
In addition, the specific trigger condition may be any one or a combination of several of the above implementations; the specific form depends on the actual scene.
It should be noted that, for ending the spaced trigger operation: on one hand, it may be determined whether the trigger element contacts the screen, and if the trigger element contacts the screen in the response area, the function of the second virtual object is closed. On the other hand, it may be determined whether the trigger element is still within the preset range; that is, if the trigger element leaves the spaced touch screen area, the function of the second virtual object is closed. Specifically, this is the case when the height of the trigger element above the screen is greater than the height of the upper boundary of the spaced touch screen area, or lower than the height of the lower boundary.
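Taken together, the optional start and end conditions above amount to a small decision rule: start the function when the trigger element has hovered inside the area long enough, and close it on screen contact or when the element leaves the height band. Below is a hedged sketch of that rule, reusing the SpacedTouchArea type assumed earlier; the 2-second dwell threshold is the example value from the text, and all names are illustrative.

```python
def update_second_object(area: SpacedTouchArea,
                         x: float, y: float, z: float,
                         touching_screen: bool,
                         hover_seconds: float,
                         dwell_threshold: float = 2.0,
                         active: bool = False) -> bool:
    """Return whether the second virtual object's function should be active."""
    # Contact with the screen in the response area ends the spaced operation.
    if touching_screen:
        return False
    # Leaving the spaced touch screen area (footprint or height band) ends it.
    if not area.contains(x, y, z):
        return False
    # Start only after the trigger element has hovered long enough, so that
    # a finger merely sliding through the area does not trigger the function.
    if not active and hover_seconds >= dwell_threshold:
        return True
    return active
```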
304. Controlling the second virtual object in the spaced touch screen area.
In this embodiment, by controlling the second virtual object together with the already-triggered first virtual object, a display interface is presented whose scene is updated based on the function of the first virtual object and the function of the second virtual object.
Optionally, the control process for the second virtual object may be responsive in real time, that is, the scene updates as the second virtual object moves; specifically, the moving distances may be identical, or may be scaled by a certain ratio. The control process of the second virtual object may also be step-wise, with the corresponding scene change determined from the displacement of the second virtual object: the moving path of the second virtual object is acquired in the spaced touch screen area, and the interface content under the function of the second virtual object is then controlled based on the moving path.
Optionally, because the user's moving path during operation may not be a straight line, distance statistics may be performed for such a scene in a curve-fitting manner. Specifically, in a two-dimensional spaced touch screen area, as shown in fig. 6, a schematic view of another virtual object control provided in an embodiment of the present application, the starting point and the end point of the moving path are determined first; a motion vector is then determined from the starting point and the end point; and the interface content corresponding to the motion vector is calculated according to a first preset algorithm, where the first preset algorithm may be based on the change of the motion vector, with the interface content producing a linear scene change as the motion vector changes.
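As one reading of the "first preset algorithm", the sketch below reduces a curved moving path to the motion vector between its start and end points and maps that vector linearly onto an interface change. The gain factor is an illustrative assumption; the text only requires that the interface content change linearly with the motion vector.

```python
def motion_vector(path: list[tuple[float, float]]) -> tuple[float, float]:
    """Reduce a possibly curved path to the vector from its start to its end."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    return (x1 - x0, y1 - y0)

def linear_view_change(path: list[tuple[float, float]],
                       gain: float = 0.5) -> tuple[float, float]:
    """First preset algorithm (sketch): interface change linear in the vector."""
    dx, dy = motion_vector(path)
    return (gain * dx, gain * dy)  # e.g. degrees of view rotation per mm moved
```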
In another possible scenario, the spaced touch screen area is a three-dimensional space. As shown in fig. 7, a schematic view of a scenario of another virtual object control provided in an embodiment of the present application, displacements in different directions, namely a depth displacement C1, a horizontal displacement C2, and a vertical displacement C3, may be calculated separately. Specifically, the projection of the moving path in each target direction is first determined to obtain a projection path, and the interface content corresponding to the projection path is then calculated according to a second preset algorithm. For example, the displacement values of the operation, that is, the depth displacement C1, the horizontal displacement C2, and the vertical displacement C3, correspond respectively to change values of the viewing angle in the interface scene; an operation's displacement value may equal the corresponding change value, or may be enlarged or reduced by a certain ratio.
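For the three-dimensional case, a corresponding sketch of the "second preset algorithm": the moving path is projected onto each target axis (depth C1, horizontal C2, vertical C3), and each projected displacement is scaled into a viewing-angle change. The per-axis gains are assumptions; per the text, a gain of 1.0 makes the angle change equal to the displacement.

```python
def axis_displacements(path: list[tuple[float, float, float]]) -> tuple[float, float, float]:
    """Project the 3D moving path onto the horizontal, vertical and depth axes."""
    (x0, y0, z0), (x1, y1, z1) = path[0], path[-1]
    return (x1 - x0, y1 - y0, z1 - z0)  # C2, C3, C1 in the figure's terms

def view_angle_change(path: list[tuple[float, float, float]],
                      gains: tuple[float, float, float] = (1.0, 1.0, 1.0)):
    """Second preset algorithm (sketch): each projected displacement maps to
    a viewing-angle change, equal to it or scaled by a per-axis ratio."""
    c2, c3, c1 = axis_displacements(path)
    gx, gy, gz = gains
    return (gx * c2, gy * c3, gz * c1)
```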
The foregoing embodiment introduced the operation of two virtual objects; more virtual objects may also be controlled. Specifically, a function of a third virtual object may be started in response to a trigger operation on the third virtual object, so that the display interface performs scene updates based on the function of the first virtual object, the function of the second virtual object, and the function of the third virtual object. In one possible scenario, the function of the first virtual object is movement direction control, the function of the second virtual object is observation angle control, and the function of the third virtual object is line-of-sight direction control.
With reference to the foregoing embodiment: a function of a first virtual object is started in response to a trigger operation on the first virtual object, the first virtual object being located in a response area; a spaced touch screen area is determined according to the response area; a function of a second virtual object is then started in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area, the spaced touch screen area being associated with the second virtual object through the spaced touch screen operation; and the second virtual object is further controlled in the spaced touch screen area, so that the first virtual object and the second virtual object are controlled in the response area. Control of a plurality of virtual objects within a limited response area is thereby realized, the virtual objects do not interfere with one another, the accuracy of the control process is improved, and the user experience is improved.
The foregoing embodiments describe the control process for multiple virtual objects. The method for controlling a virtual object according to the embodiments of the present application is described below with reference to a specific scenario and fig. 8, an interface schematic diagram of the method for controlling a virtual object according to an embodiment of the present application. The diagram includes a display interface D1, a moving direction control object D2 (the first virtual object), an observation angle control object D3 (the second virtual object), a line-of-sight direction control object D4 (the third virtual object), and a target element D5. The user can make the target element D5 move in the scene corresponding to the display interface D1 by controlling the moving direction control object D2, change the user's viewing angle in the scene corresponding to the display interface D1 by controlling the observation angle control object D3, and change the direction in which the target element D5 faces in the display interface D1 by controlling the line-of-sight direction control object D4.
Generally, after controlling the moving direction control object D2 and the observation angle control object D3, the user has no spare finger to operate the line-of-sight direction control object D4, which makes control inconvenient; moreover, the moving direction control object D2 and the observation angle control object D3 are likely to interfere with each other when controlled simultaneously, causing misoperation and affecting the accuracy of the control process.
With the method for controlling a virtual object provided by the present application, a spaced touch screen area is set above the response area of the moving direction control object D2. As shown in fig. 10, another scene schematic diagram of virtual object control provided in an embodiment of the present application, a region is set above the joystick (response area) corresponding to the moving direction control object D2; if a finger hovers there for n seconds, the visual field observation function is triggered, and the function operation corresponding to the observation angle control object D3 is performed in the spaced touch screen area, without affecting the control process of the moving direction control object D2.
Further, as for the operation process of the observation angle control object D3, as shown in fig. 9, another scene schematic diagram of virtual object control provided in an embodiment of the present application, the operation may be dragging a finger within a preset distance and height range, where the plane at each height may be an area corresponding to the whole display interface D1, or an area set based on a preset rule; the specific size depends on the actual scene.
Through spaced control of the observation angle control object D3, the user can also control the line-of-sight direction control object D4 at the same time; that is, after the automatic running function is triggered, the user can control the observation angle control object D3 and the line-of-sight direction control object D4 simultaneously, which improves the user experience and ensures the accuracy of the control process.
Combining the foregoing embodiments yields the determination logic shown in fig. 11, a schematic flow chart of virtual object control provided in an embodiment of the present application. First, the "auto-run" state is triggered, and hover detection of the finger is performed. If the finger is above the corresponding area, it is continuously detected whether the finger is within the preset range; if so, it is further detected whether the finger stays for a preset duration, for example 2 seconds. If the preset duration is reached, the visual field observation function is triggered and the user enters the visual field observation mode, in which the user controls the observation; meanwhile, it is detected in real time whether the finger has left the preset area, and if so, the visual field observation mode is exited. During the visual field observation mode, the user can simultaneously observe in different directions by controlling the line-of-sight direction. In this way, the visual field observation function of the virtual joystick can be activated through a spaced touch screen operation after automatic running is triggered, realizing multi-virtual-object control in which the right hand controls the direction while the left hand adjusts the visual field.
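The determination logic of fig. 11 can be summarized as a small state machine. The sketch below is one possible transcription; the 2-second dwell threshold is the example value from the text, and the in_range and dwell_s inputs stand in for the hover detection described above.

```python
from enum import Enum, auto

class ViewMode(Enum):
    AUTO_RUN = auto()   # "auto-run" triggered, waiting for the finger
    HOVERING = auto()   # finger in the preset range, dwell timer running
    OBSERVING = auto()  # visual field observation mode active

def step(state: ViewMode, in_range: bool, dwell_s: float,
         dwell_threshold: float = 2.0) -> ViewMode:
    """One tick of the fig. 11 decision flow (sketch)."""
    if state is ViewMode.AUTO_RUN:
        return ViewMode.HOVERING if in_range else ViewMode.AUTO_RUN
    if state is ViewMode.HOVERING:
        if not in_range:
            return ViewMode.AUTO_RUN       # finger left before dwell elapsed
        if dwell_s >= dwell_threshold:
            return ViewMode.OBSERVING      # trigger the visual field function
        return ViewMode.HOVERING
    # OBSERVING: leave the mode as soon as the finger leaves the preset area.
    return ViewMode.OBSERVING if in_range else ViewMode.AUTO_RUN
```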
In order to better implement the above-mentioned aspects of the embodiments of the present application, the following also provides related apparatuses for implementing the above-mentioned aspects. Referring to fig. 12, fig. 12 is a schematic structural diagram of a virtual object control apparatus according to an embodiment of the present application, where the virtual object control apparatus 1200 includes:
an initiating unit 1201, configured to initiate a function of a first virtual object in response to a trigger operation on the first virtual object, where the first virtual object is located in a response area;
a determining unit 1202, configured to determine a spaced touch screen area according to the response area;
a spaced touch screen unit 1203, configured to start a function of a second virtual object in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area, where the second virtual object is located outside the spaced touch screen area, and the spaced touch screen area is associated with the second virtual object through the spaced touch screen operation;
a control unit 1204, configured to control the second virtual object in the spaced touch screen area, so as to control the first virtual object and the second virtual object in the response area.
Preferably, in some possible implementations of the present application, the control unit 1204 is specifically configured to obtain a moving path of the second virtual object in the spaced touch screen area;
the control unit 1204 is specifically configured to control interface content under the function of the second virtual object based on the movement path.
Preferably, in some possible implementations of the present application, the control unit 1204 is specifically configured to determine a starting point and an end point of the moving path;
the control unit 1204 is specifically configured to determine a motion vector according to the starting point and the ending point;
the control unit 1204 is specifically configured to calculate interface content corresponding to the motion vector according to a first preset algorithm.
Preferably, in some possible implementations of the present application, the control unit 1204 is specifically configured to determine a projection of the moving path in the target direction to obtain a projection path;
the control unit 1204 is specifically configured to calculate interface content corresponding to the projection path according to a second preset algorithm.
Preferably, in some possible implementations of the present application, the spaced touch screen unit 1203 is specifically configured to detect whether a trigger element hovers in the spaced touch screen area;
the spaced touch screen unit 1203 is specifically configured to start the function of the second virtual object if so.
Preferably, in some possible implementations of the present application, the spaced touch screen unit 1203 is further configured to detect whether a trigger element hovers within a preset range of the spaced touch screen area, where the preset range is set based on the distance from the display interface;
the spaced touch screen unit 1203 is further configured to start the function of the second virtual object if so.
The spaced touch screen unit 1203 is further configured to detect the dwell time of the trigger element within the preset range;
the spaced touch screen unit 1203 is further configured to start the function of the second virtual object if the dwell time meets a preset condition.
Preferably, in some possible implementations of the present application, the spaced touch screen unit 1203 is further configured to close the function of the second virtual object if the trigger element contacts the screen in the response area.
Preferably, in some possible implementations of the present application, the determining unit 1202 is specifically configured to project the response area on at least one preset plane to obtain the spaced touch screen area.
Preferably, in some possible implementations of the present application, the control unit 1204 is further configured to start a function of a third virtual object in response to a trigger operation on the third virtual object;
the control unit 1204 is further configured to control display content based on the function of the first virtual object, the function of the second virtual object, and the function of the third virtual object.
A function of a first virtual object is started in response to a trigger operation on the first virtual object, the first virtual object being located in a response area; a spaced touch screen area is determined according to the response area; a function of a second virtual object is then started in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area, the spaced touch screen area being associated with the second virtual object through the spaced touch screen operation; and the second virtual object is further controlled in the spaced touch screen area, so that the first virtual object and the second virtual object are controlled in the response area. Control of a plurality of virtual objects within a limited response area is thereby realized, the virtual objects do not interfere with one another, the accuracy of the control process is improved, and the user experience is improved.
An embodiment of the present application further provides a terminal device, as shown in fig. 13, a schematic structural diagram of the terminal device provided in an embodiment of the present application. For convenience of description, only the part related to the embodiment of the present application is shown; for undisclosed technical details, please refer to the method part of the embodiments of the present application. The terminal may be any terminal device, including a mobile phone, a tablet computer, a personal digital assistant (PDA), a point of sale (POS) terminal, a vehicle-mounted computer, and the like; the mobile phone is taken as an example:
fig. 13 is a block diagram illustrating a partial structure of a mobile phone related to a terminal provided in an embodiment of the present application. Referring to fig. 13, the handset includes: radio Frequency (RF) circuitry 1310, memory 1320, input unit 1330, display unit 1340, sensor 1350, audio circuitry 1360, wireless fidelity (WiFi) module 1370, processor 1380, and power supply 1390. Those skilled in the art will appreciate that the handset configuration shown in fig. 13 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 13:
RF circuit 1310 may be used for receiving and transmitting signals during a message transmission or a call; in particular, it delivers received downlink information of a base station to processor 1380 for processing and transmits uplink data to the base station. In general, RF circuit 1310 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, RF circuit 1310 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), long term evolution (LTE), email, short message service (SMS), etc.
The memory 1320 may be used to store software programs and modules, and the processor 1380 executes various functional applications and data processing of the cellular phone by operating the software programs and modules stored in the memory 1320. The memory 1320 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1320 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 1330 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1330 may include a touch panel 1331 and other input devices 1332. Touch panel 1331, also referred to as a touch screen, can collect touch operations by a user on or near the touch panel 1331 (e.g., operations by a user on or near touch panel 1331 using any suitable object or accessory such as a finger, a stylus, etc., and spaced touch operations within a certain range on touch panel 1331), and drive corresponding connected devices according to a preset program. Alternatively, the touch panel 1331 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1380, where the touch controller can receive and execute commands sent by the processor 1380. In addition, the touch panel 1331 may be implemented by various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1330 may include other input devices 1332 in addition to the touch panel 1331. In particular, other input devices 1332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 1340 may be used to display information input by the user or information provided to the user, as well as the various menus of the mobile phone. The display unit 1340 may include a display panel 1341; optionally, the display panel 1341 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 1331 can overlay the display panel 1341; when the touch panel 1331 detects a touch operation on or near it, the operation is passed to the processor 1380 to determine the type of the touch event, and the processor 1380 then provides a corresponding visual output on the display panel 1341 according to the type of the touch event. Although in fig. 13 the touch panel 1331 and the display panel 1341 are two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 1331 and the display panel 1341 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1350, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 1341 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 1341 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 1360, the speaker 1361, and the microphone 1362 may provide an audio interface between the user and the mobile phone. The audio circuit 1360 may transmit an electrical signal converted from received audio data to the speaker 1361, where it is converted into a sound signal and output; conversely, the microphone 1362 converts a collected sound signal into an electrical signal, which is received by the audio circuit 1360 and converted into audio data; the audio data is processed by the processor 1380 and then, for example, sent to another mobile phone via the RF circuit 1310, or output to the memory 1320 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 1370, the mobile phone can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 13 shows the WiFi module 1370, it is not an essential component of the mobile phone and may be omitted as needed without changing the essence of the invention.
The processor 1380 is the control center of the mobile phone. It connects the various parts of the whole phone using various interfaces and lines, and performs the phone's functions and processes its data by running or executing the software programs and/or modules stored in the memory 1320 and calling the data stored in the memory 1320, thereby monitoring the mobile phone as a whole. Optionally, the processor 1380 may include one or more processing units; it may also integrate an application processor, which mainly handles the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor need not be integrated into the processor 1380.
The mobile phone also includes a power supply 1390 (e.g., a battery) that powers the various components. Optionally, the power supply may be logically coupled to the processor 1380 through a power management system, so that charging, discharging, and power consumption are managed through that system.
Although not shown, the mobile phone may further include a camera, a Bluetooth module, and the like, which are not described here.
In the embodiment of the present application, the processor 1380 included in the terminal further has the function of performing the respective steps of the virtual object control method described above.
An embodiment of the present application further provides a computer-readable storage medium storing virtual object control instructions which, when run on a computer, cause the computer to execute the steps performed by the virtual object control apparatus in the methods of the foregoing embodiments shown in fig. 2 to 11.
An embodiment of the present application further provides a computer program product including virtual object control instructions which, when run on a computer, cause the computer to perform the steps performed by the virtual object control apparatus in the methods of the foregoing embodiments shown in fig. 2 to 11.
An embodiment of the present application further provides a virtual object control system, which may include the virtual object control apparatus of the embodiment described with reference to fig. 12 or the terminal device described with reference to fig. 13.
It is clear to those skilled in the art that, for convenience and brevity of description, reference may be made, for the specific working processes of the systems, apparatuses, and units described above, to the corresponding processes in the foregoing method embodiments, which are not described again here.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one kind of logical division, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on this understanding, the part of the technical solution of the present application that in essence contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a virtual object control apparatus, or a network device) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, and some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. A method for controlling a virtual object, comprising:
in response to a trigger operation on a first virtual object, starting a function of the first virtual object, wherein the first virtual object is located in a response area;
determining a spaced touch screen area according to the response area;
in response to a first spaced touch screen operation above the spaced touch screen area, starting a function of a second virtual object in the spaced touch screen area, wherein the second virtual object is located outside the spaced touch screen area, and the spaced touch screen area is associated with the second virtual object through the first spaced touch screen operation;
and controlling the second virtual object in the spaced touch screen area to realize the control of the first virtual object and the second virtual object in the response area.
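As a minimal, non-limiting sketch of the flow of claim 1 (hypothetical names throughout; for simplicity, the spaced touch screen area is assumed to coincide with the projection of the response area):

    class VirtualObjectController:
        def __init__(self, response_area):
            self.response_area = response_area  # on-screen rect (x, y, w, h)
            self.spaced_area = None
            self.first_active = False
            self.second_active = False

        def on_trigger(self):
            # Step 1: the trigger operation starts the first virtual
            # object's function.
            self.first_active = True
            # Step 2: determine the spaced touch screen area from the
            # response area.
            self.spaced_area = self.response_area

        def on_spaced_touch(self, x, y):
            # Step 3: a contactless operation above the spaced area
            # starts the second virtual object's function.
            rx, ry, w, h = self.spaced_area
            if rx <= x <= rx + w and ry <= y <= ry + h:
                self.second_active = True

    ctl = VirtualObjectController((0, 600, 200, 200))
    ctl.on_trigger()
    ctl.on_spaced_touch(100, 700)
    print(ctl.first_active, ctl.second_active)  # True True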
2. The method of claim 1, wherein the controlling the second virtual object in the spaced touch screen area comprises:
acquiring a moving path of the second virtual object in the spaced touch screen area;
controlling interface content under a function of the second virtual object based on the movement path.
3. The method of claim 2, wherein the controlling interface content under the function of the second virtual object based on the movement path comprises:
determining a starting point and an end point of the moving path;
determining a motion vector according to the starting point and the end point;
and calculating the interface content corresponding to the movement vector according to a first preset algorithm.
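For illustration only, the start point/end point/motion vector computation of claim 3 might look as follows; the scale factor stands in for the unspecified "first preset algorithm":

    def motion_vector(start, end):
        # The motion vector is determined by the start and end points.
        return (end[0] - start[0], end[1] - start[1])

    def interface_offset(vector, scale=0.5):
        # Hypothetical mapping of the motion vector to interface content,
        # e.g. a camera pan proportional to the vector.
        return (vector[0] * scale, vector[1] * scale)

    v = motion_vector((10, 10), (60, 40))
    print(v, interface_offset(v))  # (50, 30) (25.0, 15.0)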
4. The method of claim 2, wherein the controlling interface content under the function of the second virtual object based on the movement path comprises:
determining the projection of the moving path in the target direction to obtain a projection path;
and calculating the interface content corresponding to the projection path according to a second preset algorithm.
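The projection of claim 4 can be read as the standard scalar projection of the movement path onto a target direction, (v . u) / |u|. A sketch follows; the target direction and the use of scalar projection are assumptions standing in for the "second preset algorithm":

    import math

    def project_onto(vector, direction):
        # Scalar projection of `vector` onto a nonzero `direction`.
        norm = math.hypot(direction[0], direction[1])
        return (vector[0] * direction[0] + vector[1] * direction[1]) / norm

    # Project a diagonal swipe onto the horizontal target direction.
    print(project_onto((50, 30), (1, 0)))  # 50.0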
5. The method of claim 1, wherein the starting a function of a second virtual object in the spaced touch screen area in response to a spaced touch screen operation above the spaced touch screen area comprises:
detecting whether a trigger element is hovering in the spaced touch screen area;
and if so, starting the function of the second virtual object.
6. The method of claim 5, further comprising:
detecting whether a trigger element is hovering within a preset range of the spaced touch screen area, wherein the preset range is set based on the distance from the display interface;
and if so, starting the function of the second virtual object.
7. The method of claim 6, further comprising:
detecting the dwell time of the trigger element within the preset range;
and if the dwell time meets a preset condition, starting the function of the second virtual object.
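Taken together, claims 5-7 describe hover detection gated by a distance range and a dwell time. A non-limiting sketch, in which both thresholds are assumed values:

    HOVER_MAX_DISTANCE_MM = 20.0  # preset range above the display interface
    MIN_DWELL_S = 0.3             # preset dwell-time condition

    class HoverDetector:
        def __init__(self):
            self.hover_since = None
            self.function_started = False

        def update(self, distance_mm, now_s):
            # Claim 6: the trigger element must hover within the preset range.
            if 0.0 < distance_mm <= HOVER_MAX_DISTANCE_MM:
                if self.hover_since is None:
                    self.hover_since = now_s
                # Claim 7: the dwell time must meet the preset condition.
                elif now_s - self.hover_since >= MIN_DWELL_S:
                    self.function_started = True
            else:
                self.hover_since = None

    d = HoverDetector()
    d.update(12.0, 0.00)
    d.update(11.5, 0.35)
    print(d.function_started)  # True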
8. The method according to any one of claims 5-7, further comprising:
and if the trigger element contacts the screen in the response area, closing the function of the second virtual object.
9. The method according to any one of claims 1-7, wherein the determining a spaced touch screen area according to the response area comprises:
and projecting the response area on at least one preset plane to obtain the spaced touch screen area.
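Illustratively, the projection of claim 9 can be modeled as lifting the on-screen response rectangle to a hover plane parallel to the display; the plane height is an assumed parameter:

    def spaced_area_from_response(response_area, plane_height_mm=15.0):
        # Lift the response rect (x, y, w, h) to a preset hover plane.
        x, y, w, h = response_area
        return {"rect": (x, y, w, h), "height_mm": plane_height_mm}

    print(spaced_area_from_response((0, 600, 200, 200)))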
10. The method according to any one of claims 1-7, further comprising:
responding to a trigger operation on a third virtual object, and starting a function of the third virtual object;
controlling display content based on the function of the first virtual object, the function of the second virtual object, and the function of the third virtual object.
11. The method of claim 10, wherein the function of the first virtual object is a movement direction control in a game scene, the function of the second virtual object is a viewing perspective control in a game scene, and the function of the third virtual object is a line-of-sight direction control in a game scene.
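As a non-limiting sketch of the game scenario of claims 10-11, the three functions might be combined as follows; all state names are assumptions:

    class GameControls:
        def __init__(self):
            self.move_dir = (0.0, 0.0)  # first virtual object: movement direction
            self.view_yaw = 0.0         # second virtual object: viewing perspective
            self.aim_pitch = 0.0        # third virtual object: line-of-sight direction

        def apply(self, move, yaw_delta, pitch_delta):
            self.move_dir = move
            self.view_yaw += yaw_delta
            self.aim_pitch += pitch_delta

    g = GameControls()
    g.apply((1.0, 0.0), 5.0, -2.0)  # move right, pan the view, lower the aim
    print(g.move_dir, g.view_yaw, g.aim_pitch)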
12. An apparatus for controlling a virtual object, comprising:
a starting unit, configured to start a function of a first virtual object in response to a trigger operation on the first virtual object, the first virtual object being located in a response area;
a determining unit, configured to determine a spaced touch screen area according to the response area;
a spaced touch screen unit, configured to start a function of a second virtual object in the spaced touch screen area in response to a first spaced touch screen operation above the spaced touch screen area, wherein the second virtual object is located outside the spaced touch screen area, and the spaced touch screen area is associated with the second virtual object through the first spaced touch screen operation;
and a control unit, configured to control the second virtual object in the spaced touch screen area, so as to control the first virtual object and the second virtual object in the response area.
13. A computer device, comprising a processor and a memory:
the memory is used for storing program code; the processor is configured to perform the virtual object control method of any one of claims 1 to 11 according to instructions in the program code.
14. A computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the virtual object control method of any one of claims 1 to 11.
CN201911185755.2A 2019-11-27 2019-11-27 Control method of virtual object and related device Pending CN110955377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911185755.2A CN110955377A (en) 2019-11-27 2019-11-27 Control method of virtual object and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911185755.2A CN110955377A (en) 2019-11-27 2019-11-27 Control method of virtual object and related device

Publications (1)

Publication Number Publication Date
CN110955377A true CN110955377A (en) 2020-04-03

Family

ID=69978635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911185755.2A Pending CN110955377A (en) 2019-11-27 2019-11-27 Control method of virtual object and related device

Country Status (1)

Country Link
CN (1) CN110955377A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110448895A (en) * 2018-09-29 2019-11-15 网易(杭州)网络有限公司 Information processing method and device in game
CN110420456A (en) * 2018-10-11 2019-11-08 网易(杭州)网络有限公司 The method and device of selecting object, computer storage medium, electronic equipment
CN110448898A (en) * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 The control method and device of virtual role, electronic equipment in game
CN110448899A (en) * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 The control method and device of virtual role, electronic equipment in game
CN110448903A (en) * 2019-01-22 2019-11-15 网易(杭州)网络有限公司 Determination method, apparatus, processor and the terminal of control strategy in game

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114063821A (en) * 2021-11-15 2022-02-18 深圳市海蓝珊科技有限公司 Non-contact screen interaction method
CN114995652A (en) * 2022-06-28 2022-09-02 天翼数字生活科技有限公司 Screen control method and user terminal
CN114995652B (en) * 2022-06-28 2024-05-31 天翼数字生活科技有限公司 Screen control method and user terminal


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022167

Country of ref document: HK