CN110935173B - Operation control method, operation control device, storage medium, and electronic device - Google Patents


Info

Publication number
CN110935173B
CN110935173B (application CN201911143167.2A)
Authority
CN
China
Prior art keywords
shooting
target object
adsorption
determining
height value
Prior art date
Legal status
Active
Application number
CN201911143167.2A
Other languages
Chinese (zh)
Other versions
CN110935173A (en)
Inventor
刘智洪
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911143167.2A priority Critical patent/CN110935173B/en
Publication of CN110935173A publication Critical patent/CN110935173A/en
Application granted granted Critical
Publication of CN110935173B publication Critical patent/CN110935173B/en
Current legal status: Active

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/53 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F13/837 — Shooting of targets
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 — Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game
    • A63F2300/8076 — Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an operation control method, an operation control device, a storage medium, and an electronic device. The method comprises the following steps: acquiring a shooting instruction triggered by an operation performed on a human-computer interaction interface, wherein the shooting instruction is used to request that a shooting operation be performed on a target object in a game task; determining, in response to the shooting instruction, the current crosshair position of the shooting operation; and controlling the crosshair position to move toward the target object when the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object. The invention solves the problem of the high control complexity of shooting operations.

Description

Operation control method, operation control device, storage medium, and electronic device
Technical Field
The invention relates to the field of computers, in particular to an operation control method, an operation control device, a storage medium and an electronic device.
Background
In current shooting game applications, accurately hitting a target object usually requires close hand-eye coordination from the user in order to aim at the target object and complete the shooting operation in the game scene.
However, the target object is usually moving in the game scene, so when shooting at a moving target the player must also adjust the aim in real time to ensure a hit. That is, the shooting control methods provided in the related art suffer from high control complexity of the shooting operation.
In view of the above problems, no effective solution has yet been proposed.
Disclosure of Invention
The embodiment of the invention provides an operation control method, an operation control device, a storage medium and an electronic device, and at least solves the technical problem of high control complexity of shooting operation.
According to an aspect of an embodiment of the present invention, there is provided an operation control method including: acquiring a shooting instruction triggered by an operation performed on a human-computer interaction interface, wherein the shooting instruction is used to request that a shooting operation be performed on a target object in a game task; determining, in response to the shooting instruction, the current crosshair position of the shooting operation; and controlling the crosshair position to move toward the target object when the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object.
According to another aspect of the embodiments of the present invention, there is also provided an operation control apparatus including: an acquisition unit, configured to acquire a shooting instruction triggered by an operation performed on a human-computer interaction interface, wherein the shooting instruction is used to request that a shooting operation be performed on a target object in a game task; a first determining unit, configured to determine, in response to the shooting instruction, the current crosshair position of the shooting operation; and a first control unit, configured to control the crosshair position to move toward the target object when the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object.
According to still another aspect of the embodiments of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is configured to execute the above-described operation control method when executed.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the operation control method through the computer program.
In the embodiment of the invention, a shooting instruction triggered by an operation performed on the human-computer interaction interface is acquired, the current crosshair position of the shooting operation is determined in response to the shooting instruction, and the crosshair position is controlled to move toward the target object when it is located in the adsorption area matched with the target object. By setting an adsorption area, this process makes the crosshair position move toward the target object automatically, thereby reducing the control complexity of the shooting operation.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a network environment for a method of operation control according to an embodiment of the present invention;
FIG. 2 is a flow chart of an operation control method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an operation control method in an alternative point adsorption game mode according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating an operation control method in an alternative line segment adsorption game mode according to an embodiment of the present invention;
FIG. 5 is a schematic illustration of a shot in an alternative shooting adsorption mode according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a shot in an alternative point attachment game mode according to an embodiment of the invention;
FIG. 7 is a schematic diagram of a shot in an alternative line segment adsorption game mode according to an embodiment of the invention;
FIG. 8 is a schematic structural diagram of an alternative operational control device in accordance with an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an alternative operational control apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of an alternative electronic device according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of the embodiments of the present invention, an operation control method is provided. As an optional implementation, the operation control method may be applied, but is not limited, to the operation control system in the network environment shown in fig. 1, where the operation control system includes user equipment 102, a network 110, and a server 112. Please refer to S101 to S105 in fig. 1.
S101, the user equipment 102 obtains a shooting instruction triggered by executing operation on the man-machine interaction interface.
S102, the user device 102 sends the shooting instruction to the network 110.
S103, the network 110 sends the shooting instruction to the server 112.
S104, the server 112 determines, in response to the shooting instruction, the current crosshair position of the shooting operation.
S105, the server 112 controls the crosshair position to move toward the target object when the crosshair position is located in the adsorption area matched with the target object.
In the embodiment of the present invention, it is assumed that a client of a game application (such as the shooting game application client shown in fig. 1) is installed in the user equipment 102, where the user equipment 102 includes a memory 104, a processor 106, and a display 108. The display 108 is configured to detect, through the human-computer interaction interface corresponding to the client, a human-computer interaction operation (e.g., a shooting instruction triggered by an operation performed on the interface, where the shooting instruction may be used to request that a shooting operation be performed on a target object in a game task); the processor 106 is configured to respond to the corresponding operation instruction (e.g., to the shooting instruction) according to the human-computer interaction operation; and the memory 104 is used for storing the above operation instructions.
Further, the processor 106 sends the above operation instruction to the server 112 through the network 110. The server 112 includes a database 114 and a processing engine 116, the database 114 being used for storing the above operation instructions. The processing engine 116 is configured to determine, based on the received operation instruction, the current crosshair position of the shooting operation in response to the shooting instruction. When the crosshair position is located in an adsorption area matched with the target object, the adsorption area being an aiming area set for the target object, the crosshair position is controlled to move toward the target object.
In the embodiment of the present invention, as an optional implementation, after acquiring the shooting instruction triggered by an operation performed on the human-computer interaction interface, the user equipment 102 may itself respond to the shooting instruction to determine the current crosshair position of the shooting operation, and control the crosshair position to move toward the target object when it is located in the adsorption area matched with the target object. That is, the operation control method disclosed in the present invention can be performed through data interaction between the user equipment 102 and the server 112, or can be performed by the user equipment 102 independently, but is not limited thereto. The above is merely an example, and this embodiment is not limited in this respect.
Optionally, in this embodiment, the user equipment may be, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a PC, and other computer equipment that supports running an application client. The server and the user equipment may implement data interaction through a network, which may include but is not limited to a wireless network or a wired network. Wherein, this wireless network includes: bluetooth, WIFI, and other networks that enable wireless communication. Such wired networks may include, but are not limited to: wide area networks, metropolitan area networks, and local area networks. The above is merely an example, and this is not limited in this embodiment.
According to an aspect of an embodiment of the present invention, there is provided an operation control method, as shown in fig. 2, including:
S201, a shooting instruction triggered by an operation performed on the human-computer interaction interface is obtained, wherein the shooting instruction is used for requesting that a shooting operation be performed on a target object in the game task.
S202, determining, in response to the shooting instruction, the current crosshair position of the shooting operation.
S203, controlling the crosshair position to move toward the target object when the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object.
It should be noted that the method steps shown in fig. 2 can be applied, but not limited to, in the operation control system shown in fig. 1, and are completed through data interaction between the user device 102 and the server 112, and can also be applied, but not limited to, in the user device 102, and are completed by the user device 102 independently. The above is merely an example, and this is not limited in this embodiment.
Alternatively, in the present embodiment, the operation control method may be applied, but is not limited, to a game application, such as a shooting game application. At the human-computer interaction interface of the game client, a user can trigger a shooting instruction by touch or in similar ways; the game client acquires the shooting instruction, responds to it by determining the current crosshair position of the shooting operation, and controls the crosshair position to move toward the target object when it is located in the adsorption area matched with the target object.
Through the above operation control method, a shooting instruction triggered by an operation performed on the human-computer interaction interface is obtained, the current crosshair position of the shooting operation is determined in response to the shooting instruction, and the crosshair position is controlled to move toward the target object when it is located in the adsorption area matched with the target object. By setting an adsorption area, this process makes the crosshair position move toward the target object automatically, thereby reducing the control complexity of the shooting operation.
In the embodiment of the present invention, the user may trigger the shooting instruction on the human-computer interaction interface by clicking a corresponding virtual key on the screen, or by pressing the screen with a force greater than a preset force, and so on. When the user triggers the shooting instruction, the game client can respond to it by shooting, at which point the current crosshair position of the shooting operation is determined.
As another alternative, controlling the crosshair position to move toward the target object may include the following steps:
S1, determining the shooting adsorption mode set for the game task.
S2, determining, according to the shooting adsorption mode, a shooting position corresponding to the shooting operation in the adsorption area.
S3, controlling the crosshair position to move to the shooting position, wherein the shooting position is located in the object display area corresponding to the target object.
By implementing this optional implementation, the shooting position corresponding to the shooting operation can be determined according to different shooting adsorption modes. For example, in the point adsorption game mode the shooting point may be the center position of the target object, while in the line segment adsorption game mode it may be any position within a preset line segment range. The shooting position is thus controllable, which reduces the complexity of shooting.
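As a minimal Python sketch of how a client might choose the shot position per adsorption mode (the function and parameter names here are illustrative assumptions, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Target:
    cx: float  # center x of the target's object display area
    cy: float  # center y of the target's object display area

def shot_position(mode: str, target: Target, crosshair_y: float,
                  seg_low: float, seg_high: float) -> tuple:
    """Pick the shot position for the active shooting adsorption mode.

    'point'   -> snap to the target's center point (the first position).
    'segment' -> keep the crosshair's height, clamped to a vertical
                 line segment [seg_low, seg_high] through the target
                 (the second position).
    """
    if mode == "point":
        return (target.cx, target.cy)
    if mode == "segment":
        return (target.cx, min(max(crosshair_y, seg_low), seg_high))
    raise ValueError(f"unknown adsorption mode: {mode}")
```

The mode string and the clamp bounds are placeholders for whatever the game task configures.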
As an alternative embodiment, determining, according to the shooting adsorption mode, the shooting position corresponding to the shooting operation in the adsorption area may include:
S1, when the shooting adsorption mode is the point adsorption game mode, determining a first position in the object display area corresponding to the target object in the adsorption area, and taking the first position as the shooting position;
S2, when the shooting adsorption mode is the line segment adsorption game mode, determining a second position according to the height value of the crosshair position and the object height of the target object, and taking the second position as the shooting position.
By implementing this optional implementation, the shooting adsorption mode can be either the point adsorption game mode or the line segment adsorption game mode: the first position is determined and used as the shooting position in the former, and the second position in the latter. Shooting positions can therefore be determined in different modes, making the operation control more complete and comprehensive.
As an optional implementation manner, determining the first position in the object display area corresponding to the target object in the adsorption area may include: the center position in the object display area is determined to be a first position.
In the embodiment of the invention, the first position can be the central point of the object display area; optionally, when the target object is a virtual enemy, the first position can also be the heart area of the virtual enemy, and so on.
By implementing such an embodiment, the center position in the object display area is taken as the first position, so that the shooting position tends toward the center point of the shooting target and the hit rate is higher.
As an alternative embodiment, determining the second position according to the height value of the crosshair position and the object height of the target object includes:
S1, when the height value of the crosshair position is higher than a first height value, determining the position in the object display area matching the first height value as the second position;
S2, when the height value of the crosshair position is lower than a second height value, determining the position in the object display area matching the second height value as the second position;
S3, when the height value of the crosshair position is lower than the first height value and higher than the second height value, determining the position in the object display area matching the height value of the crosshair position as the second position.
By implementing this optional implementation, the second position can be determined according to the height value of the crosshair aimed by the user, which improves the realism of the game experience. When the aimed height exceeds the preset line segment area, it is pulled back to the endpoint of that area, so the range of the second position stays controllable, further improving the game experience.
In the embodiment of the invention, different shooting adsorption modes can be adopted in different game modes to meet the shooting requirements of different game scenes. For example, for a shooting game scene with many enemies, the point adsorption game mode is preferable: a crosshair that falls into the adsorption area is moved toward the shooting position of the target object, correcting the shot and improving shooting accuracy. For a game scene with few enemies and high accuracy requirements on the hit location, the line segment adsorption game mode is preferable: a crosshair that falls into the preset adsorption area is moved toward a preset point on the shooting target, taking the crosshair height into account, so that the hit location is corrected and shooting accuracy improves. Different scores can also be set for different locations, e.g. a higher score for hitting the shooting target's head and a lower score for other parts, meeting a variety of game requirements. Further, when the crosshair reaches the target object, the adsorption process stops; likewise, if the user triggers a stop-shooting instruction while the crosshair is being moved toward the target object according to the current game mode, the adsorption process stops.
In this embodiment of the present invention, controlling the crosshair position to move toward the target object may optionally include controlling it to move at a target speed, so that the crosshair moves at a constant speed; the crosshair may also move toward the target object at a variable speed, which is not limited in this embodiment. Further, the target speed may be a preset moving speed of the crosshair while adsorption is in effect: when the target speed is high, the crosshair reaches the first or second position quickly, so the shooting hit rate is high; when it is low, the crosshair takes longer and the hit rate is lower. Different target speeds can therefore be set according to different requirements on game difficulty, providing more game scenes and meeting the varied game requirements of users. The second position corresponds to a shot location on the target object; specifically, it is determined by the height value of the crosshair and by the target object.
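The constant-speed movement described above can be sketched as follows; this is a hedged illustration (the patent does not prescribe an implementation, and the per-frame stepping is an assumption):

```python
import math

def step_crosshair(pos, dest, target_speed, dt):
    """Advance the crosshair toward the shot position `dest` at a
    constant `target_speed` for one frame of duration `dt`.
    Snaps to `dest` when a full step would overshoot, at which point
    the adsorption process stops."""
    dx, dy = dest[0] - pos[0], dest[1] - pos[1]
    dist = math.hypot(dx, dy)
    step = target_speed * dt
    if dist <= step:
        return dest  # crosshair has reached the target: stop adsorbing
    return (pos[0] + dx / dist * step, pos[1] + dy / dist * step)
```

A larger `target_speed` shortens the time needed to reach the first or second position, which matches the difficulty tuning described above.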
As an alternative embodiment, the following steps may also be performed while controlling the crosshair position to move toward the target object:
S1, when movement of the target object is detected, determining the target position of the moved target object;
S2, updating the adsorption area matched with the target object according to the target position to obtain an updated adsorption area;
S3, controlling the crosshair position to stop moving toward the target object when the crosshair position leaves the updated adsorption area;
and S4, controlling the crosshair position to move toward the target object at the target position when the crosshair position is within the updated adsorption area.
By implementing this optional implementation, the adsorption area matched with the target object is continuously updated while the crosshair position is being moved toward the target object: the crosshair stops moving toward the target when it leaves the adsorption area and keeps moving toward the target while it remains inside, so the crosshair's movement is more responsive in real time.
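Steps S1-S4 above amount to rebuilding the adsorption rectangle around the target's new position each frame and checking whether the crosshair is still inside it. A minimal sketch, with illustrative names (`adsorption_tick`, `dest_of`) that are not from the patent:

```python
def adsorption_tick(crosshair, target_pos, half_w, half_h, dest_of):
    """One frame of the moving-target loop: rebuild the adsorption
    rectangle around the target's updated position, then either keep
    attracting the crosshair (S4) or stop once it has left the area (S3).
    `dest_of` maps the target position to the current shot position.
    Returns (next destination, or the unchanged crosshair; still_adsorbing)."""
    x0, y0 = target_pos[0] - half_w, target_pos[1] - half_h
    x1, y1 = target_pos[0] + half_w, target_pos[1] + half_h
    inside = x0 <= crosshair[0] <= x1 and y0 <= crosshair[1] <= y1
    if not inside:
        return crosshair, False       # S3: crosshair left the updated area
    return dest_of(target_pos), True  # S4: keep moving toward the target
```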
As an alternative embodiment, determining the current crosshair position of the shooting operation may include:
S1, taking the position of the gun in the shooting operation as an endpoint, generating a shooting ray along the current muzzle direction;
S2, determining the position at which the shooting ray points as the current crosshair position.
In the embodiment of the present invention, referring to fig. 5, A (the player) holds a gun to shoot at B (the target object), and a shooting ray is generated along the current muzzle direction with the gun's position as the endpoint (in fig. 5, the muzzle position is used as the endpoint); this is the dotted line in the muzzle direction in fig. 5, and the position it points at is the current crosshair position.
By implementing this optional implementation, the current crosshair position can be determined by generating a shooting ray; the operation is convenient, fast, and accurate, so the crosshair position can be determined more precisely.
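The ray construction of steps S1-S2 can be illustrated as a simple parametric projection. This is a sketch under assumptions: the 3-tuple coordinates, the unit-vector direction, and a fixed ray length are illustrative, not specified by the patent:

```python
def crosshair_position(muzzle, direction, distance):
    """Generate a shooting ray from the muzzle position along the
    current muzzle direction (a unit vector) and return the point the
    ray reaches, i.e. the current crosshair position."""
    return tuple(m + d * distance for m, d in zip(muzzle, direction))
```

For example, a muzzle at the origin pointing along the x-axis with a ray length of 5 yields a crosshair position of (5.0, 0.0, 0.0). In a real engine the distance would come from a hit test against scene geometry.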
In the embodiment of the present invention, the target object at least includes a virtual enemy, and the adsorption area may be an area framed around the virtual enemy. As shown in fig. 5, the preset virtual area is the rectangular frame around B in the figure; in practical applications, other shapes such as a circle may also be used to set the boundary of the preset virtual area. Optionally, the target object may further include a virtual article and the like, which is not limited in the embodiment of the present invention.
As an alternative implementation, the following steps may also be performed:
s1, acquiring the position coordinates of the shooting target, wherein the position coordinates are the coordinates corresponding to the center position of the shooting target;
s2, adding a first preset value to the ordinate of the position coordinate, and adding a second preset value to the abscissa of the position coordinate to obtain a first coordinate;
s3, adding a first preset value to the ordinate of the position coordinate, and subtracting a second preset value from the abscissa of the position coordinate to obtain a second coordinate;
s4, subtracting a first preset value from the ordinate of the position coordinate, and adding a second preset value to the abscissa of the position coordinate to obtain a third coordinate;
s5, subtracting a first preset value from the ordinate of the position coordinate, and subtracting a second preset value from the abscissa of the position coordinate to obtain a fourth coordinate;
and S6, acquiring the adsorption area according to the first coordinate, the second coordinate, the third coordinate and the fourth coordinate.
By implementing this alternative embodiment, a plurality of corner coordinates (the first, second, third, and fourth coordinates above) can be obtained by transforming the coordinates of the shooting target's position, and the adsorption area is obtained from these coordinate points. Only the coordinates of the shooting target's position are needed, so the acquisition is simple and fast, improving the efficiency of obtaining the preset adsorption area.
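Steps S1-S6 can be condensed as follows; the rectangle spanned by the four offset corners is the adsorption area (parameter names `dy`/`dx` for the first and second preset values are illustrative):

```python
def adsorption_area(cx, cy, dy, dx):
    """Offset the shooting target's center (cx, cy) by the first preset
    value `dy` on the ordinate and the second preset value `dx` on the
    abscissa to form the four corner coordinates (S2-S5), then return
    the rectangle they bound as (min_x, min_y, max_x, max_y) (S6)."""
    corners = [(cx + dx, cy + dy), (cx - dx, cy + dy),
               (cx + dx, cy - dy), (cx - dx, cy - dy)]
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    return (min(xs), min(ys), max(xs), max(ys))
```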
As another alternative, before determining the second shot point according to the height value of the crosshair and the shooting target, the following steps can also be carried out:
s1, acquiring the position coordinates of the central point of the shooting target;
and S2, subtracting a third preset value from the ordinate of the central point position coordinate to obtain a preset lowest value, and adding a fourth preset value to the ordinate of the central point position coordinate to obtain a preset highest value.
In an embodiment of the present invention, the crosshair position coordinate may be (x, y), where the crosshair height value is y, and the center point coordinate of the shooting target may be (a, b). The third preset value may be a preset value, e.g. x1, and the fourth preset value may likewise be a preset value, e.g. x2; the preset lowest value is then b - x1 and the preset highest value is b + x2. If y > b + x2, (a, b + x2) is determined as the second shot point; if y < b - x1, (a, b - x1) is determined as the second shot point; if b - x1 ≤ y ≤ b + x2, (a, y) is determined as the second shot point.
By implementing this alternative embodiment, in the line segment adsorption game mode the second shooting point can be adsorbed onto the preset segment area according to the centroid height value, for example onto the segment from (a, b - x1) to (a, b + x2). Different position areas of the shooting target can thus be shot, special shooting operations (such as shooting the head of the target object) can be completed, and the auxiliary shooting requirements of different game scenes can be met.
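The clamping rule above can be sketched as follows (illustrative function and variable names; x1 and x2 follow the example values in the text):

```python
def second_shooting_point(y, target_center, x1, x2):
    """Determine the second shooting point from the centroid height y.

    target_center: (a, b), center point coordinate of the shooting target.
    x1: third preset value (below the center ordinate).
    x2: fourth preset value (above the center ordinate).
    """
    a, b = target_center
    lowest, highest = b - x1, b + x2   # preset lowest / highest values
    if y > highest:
        return (a, highest)            # adsorb to the top of the segment
    if y < lowest:
        return (a, lowest)             # adsorb to the bottom of the segment
    return (a, y)                      # keep the centroid height
```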
Referring to fig. 3, fig. 3 is a schematic flow chart of a point adsorption game mode, please refer to steps S301 to S309.
S301, when the player uses the point adsorption game mode, step S302 is triggered and executed.
In the embodiment of the present invention, when the player plays a shooting game, the point adsorption game mode is one of the shooting adsorption modes; in this mode, shooting adsorption moves the centering position toward a fixed point of the target.
S302, judging whether the player clicks to fire, if so, executing steps S303 to S304, and if not, executing step S301.
In the embodiment of the present invention, whether the player clicks to fire may specifically be determined by judging whether the player clicks the firing key on the human-computer interaction interface. If so, it is determined that the player has clicked to fire, that is, the shooting instruction described in the above embodiment has been triggered, and steps S303 to S304 are executed; if not, it is determined that the player has not triggered the shooting instruction, and the determining step is repeated.
And S303, emitting rays to detect the target.
In the embodiment of the invention, a gun on the human-computer interaction interface can emit rays to detect the target, namely the target object described in the above embodiment, and the target object includes but is not limited to a virtual enemy.
S304, judging whether the adsorption is generated, if so, executing the steps S305 to S307, and if not, executing the step S303.
In the embodiment of the present invention, the specific way of determining whether the adsorption is generated is to determine whether the centroid position is located in the adsorption region, determine that the adsorption is generated when the centroid position is located in the adsorption region, and determine that the adsorption is not generated when the centroid position is not located in the adsorption region.
S305, acquiring the central position of the target.
And S306, controlling the alignment position to move towards the target direction.
In the embodiment of the present invention, the target direction is the direction of the center position of the target acquired in step S305.
S307, judging whether the fire is stopped, if so, executing step S309, and if not, executing step S308.
In the embodiment of the invention, whether to stop firing may specifically be determined by judging whether the player clicks the stop-firing button on the human-computer interaction interface: if the player clicks the stop-firing button, it is determined that firing has stopped; if the player does not click the stop-firing button, it is determined that firing has not stopped.
And S308, judging whether the target center position is reached, if so, executing the step S309, and if not, ending the process.
S309, stopping adsorption.
In the embodiment of the invention, in a case where the player selects the point adsorption game mode, whether the player clicks to fire (that is, triggers the shooting instruction) is detected. If the player clicks to fire, the gun is controlled to emit a ray to detect the target and determine the current centering position, and whether the current centering position is located in the adsorption area is judged; adsorption is generated if it is, and the gun continues to be controlled to emit the ray to detect the target if it is not. After adsorption is generated, the centering position is controlled to move toward the target object. In the process of the centering position moving toward the target object, if the player stops firing (that is, triggers a stop-firing instruction), adsorption is stopped and the process ends; if the player does not stop firing, whether the centering position has reached the target center position (that is, the shooting position) is judged, and when it is judged that the centering position has reached the shooting position, adsorption is stopped and the process ends.
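The point adsorption flow of steps S301 to S309 can be sketched as a simplified simulation (the callback functions, the constant step size, and the helper names are assumptions; a real engine would run one step per frame):

```python
import math


def step_towards(pos, target, speed):
    """Move pos toward target by at most `speed` units per step."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target                      # target center reached
    return (pos[0] + dx / dist * speed, pos[1] + dy / dist * speed)


def point_adsorption(crosshair, target_center, in_area, fire_stopped, speed=1.0):
    """Sketch of steps S304-S309: adsorb only while the centering position
    lies in the adsorption area, then pull it toward the target's center
    until fire stops or the center is reached."""
    if not in_area(crosshair):
        return crosshair                   # S304: no adsorption generated
    while not fire_stopped():              # S307: stop on cease-fire
        crosshair = step_towards(crosshair, target_center, speed)
        if crosshair == target_center:
            break                          # S308-S309: reached, stop adsorption
    return crosshair
```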
Referring to fig. 4, fig. 4 is a flowchart of a line segment adsorption game mode, please refer to steps S401 to S415.
S401, when the player uses the line segment adsorption game mode, the step S402 is triggered and executed.
In the embodiment of the present invention, when a player plays a shooting game, the line segment adsorption game mode is one of the shooting adsorption modes; in this mode, shooting adsorption moves the centering position toward a fixed line segment area.
S402, judging whether the player clicks to fire; if so, executing steps S403 to S404, and if not, repeating step S402.
In the embodiment of the present invention, whether the player clicks to fire may specifically be determined by judging whether the player clicks the firing key on the human-computer interaction interface. If so, it is determined that the player has clicked to fire, that is, the shooting instruction described in the above embodiment has been triggered, and steps S403 to S404 are executed; if not, it is determined that the player has not triggered the shooting instruction, and the determining step is repeated.
And S403, emitting rays to detect the target.
In the embodiment of the invention, a gun on the human-computer interaction interface can emit rays to detect the target, namely the target object described in the above embodiment, and the target object includes but is not limited to a virtual enemy.
S404, judging whether the adsorption is generated, if so, executing the steps S405 to S407, and if not, executing the step S403.
In the embodiment of the present invention, the specific way of determining whether the adsorption is generated is to determine whether the centroid position is located in the adsorption region, determine that the adsorption is generated when the centroid position is located in the adsorption region, and determine that the adsorption is not generated when the centroid position is not located in the adsorption region.
S405, acquiring the center position of the target.
And S406, controlling the alignment position to move towards the target direction.
In the embodiment of the present invention, the target direction is the direction of the center position of the target acquired in step S405.
S407, judging whether the centroid height is within the line segment range, if so, executing steps S408 and S412 to S413, and if not, executing step S409.
In the embodiment of the invention, the line segment range is preset and can be positioned on the target (namely the target object) so as to obtain better shooting experience.
S408, the height of the centering position is kept unchanged, and only the abscissa is changed.
In the embodiment of the invention, if the center height is within the line segment range, the center height is kept unchanged and the abscissa of the centering position is changed, so that the center falls onto the position on the line segment matching the center height.
S409, judging whether the center height value is greater than the highest point; if so, executing steps S411 and S412 to S413, and if not, executing steps S410 and S412 to S413.
In the embodiment of the invention, if the center height value is greater than the highest point, the center is controlled to move to the highest point of the line segment; if the center height value is not greater than the highest value and is also not within the preset line segment range, the center height value must be less than the lowest value, and the center is controlled to move to the lowest point of the line segment.
And S410, moving to the lowest point of the line segment.
And S411, moving to the highest point of the line segment.
S412, the collimation center moves towards the target direction.
In the embodiment of the invention, when the quasi-center height is within the line segment range, the target direction is the position in the line segment range, which is consistent with the quasi-center height; when the quasi-center height is not in the range of the line segment, the target direction of the quasi-center height value which is greater than the highest value is the highest point of the line segment, and the target direction of the quasi-center height value which is less than the lowest value is the lowest point of the line segment.
And S413, judging whether the fire is stopped, if so, executing the step S415, and if not, executing the step S414.
And S414, judging whether the target center position is reached, if so, executing the step S415, and if not, ending the process.
S415, stopping the adsorption.
In the embodiment of the invention, in a case where the player selects the line segment adsorption game mode, whether the player clicks to fire (that is, triggers the shooting instruction) is detected. If the player does not click to fire, whether the player clicks to fire continues to be detected; if the player clicks to fire, the gun is controlled to emit a ray to detect the target and determine the current centering position, and whether the current centering position is located in the adsorption area is judged; adsorption is generated if it is, and the gun continues to be controlled to emit the ray to detect the target if it is not. After adsorption is generated, the center position of the target (that is, the shooting position) is acquired, and the centering position is controlled to move in the direction corresponding to the shooting position. Whether the height value of the centering position is within the preset line segment range is then judged. When the height value is judged to be within the preset line segment range, the abscissa of the centering position is changed while its ordinate is kept, so that the centering position moves in the direction corresponding to the shooting position. When the height value is judged not to be within the preset line segment range, whether it is greater than the maximum value of the preset line segment range is judged; if so, the centering position is controlled to move toward the highest point of the line segment corresponding to the preset line segment range, and otherwise, the height value being less than the minimum value of the preset line segment range, the centering position is controlled to move toward the lowest point of that line segment. In the process of the centering position moving, if the player stops firing (that is, triggers a stop-firing instruction), adsorption is stopped and the process ends; if the player does not stop firing, whether the centering position has reached the target center position (that is, the shooting position) is judged, and when it is judged that the centering position has reached the shooting position, adsorption is stopped and the process ends.
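The line segment adsorption flow of steps S401 to S415 can be sketched similarly (a simplified, assumed implementation: the pull target is recomputed each step by clamping the centering height to the preset segment range, then approached at a constant rate):

```python
import math


def _step(pos, target, speed):
    """Constant-rate step toward target (same idea as in the point mode)."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (pos[0] + dx / dist * speed, pos[1] + dy / dist * speed)


def segment_adsorption(crosshair, target_x, seg_low, seg_high,
                       fire_stopped, speed=1.0):
    """Sketch of steps S407-S415: within the segment range only the
    abscissa changes; above or below it, the centering position is
    pulled to the segment's highest or lowest point."""
    while not fire_stopped():                       # S413: stop on cease-fire
        goal_y = min(max(crosshair[1], seg_low), seg_high)  # S407-S411
        goal = (target_x, goal_y)
        crosshair = _step(crosshair, goal, speed)   # S412: move toward goal
        if crosshair == goal:
            break                                   # S414-S415: reached, stop
    return crosshair
```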
Referring to fig. 5, fig. 5 is a schematic shooting diagram in a shooting adsorption mode. As shown in fig. 5, a user A aims at a target object B using a gun; the aiming position is the centering position, and the dashed box around the target object is the adsorption area. When the centering position is located in the adsorption area matched with the target object, the centering position is controlled to move toward the target object (the target object B shown in fig. 5), so that adsorption in the shooting process is achieved and shooting complexity is reduced. If the centering position is outside the adsorption area, that is, outside the dashed box around the target object B in fig. 5, no adsorption occurs. Further, in a case where the centering position is located in the adsorption area matched with the target object, if movement of the user A or of the target object B causes the centering position to leave the adsorption area, the adsorption effect is stopped; if the centering position then enters the adsorption area again, adsorption is generated again and the centering position is controlled to move toward the target object. Note that the adsorption area moves as the target object B moves.
Referring to fig. 6, fig. 6 is a schematic shooting diagram in the point adsorption game mode. As shown in fig. 6, a user A aims at a target object B using a gun to complete the shot, and the adsorption area is located around the target object B. Here, the centering position with which the user A aims at the target object B falls into the adsorption area, and the centering position is controlled to move toward a certain point of the target object B (for example, the middle part of the body of the shooting target B). In the point adsorption game mode, the centering position may also be controlled to move toward the head of the target object, or alternatively toward the heart of the target object, which is not limited in the embodiment of the present invention. In the process of the centering position moving toward the target object, the centering position may be controlled to move at a constant speed at the target speed, or may be controlled to move at a variable speed.
Referring to fig. 7, fig. 7 is a schematic shooting diagram in the line segment adsorption game mode. As shown in fig. 7, a user A aims at a target object B using a gun to complete the shot, and the figure shows three centering position scenarios. When the centering position is the uppermost position in the figure, since its height value exceeds the preset highest value (the height of the broken line extending from the top of B's head in fig. 7), the center is controlled to move toward the part of the shooting target B matching the preset highest value, that is, the top-of-head position in fig. 7. When the centering position is the middle position in the figure, since its height value is greater than the preset lowest value (the height of the broken line extending from B's soles in fig. 7) and less than the preset highest value, the center is controlled to move toward the part of the shooting target B matching the center height, that is, the middle part of the shooting target B in fig. 7. When the centering position is the lowest position in the figure, since its height value is less than the preset lowest value, the center is controlled to move toward the part of the shooting target B matching the preset lowest value, that is, the sole position in fig. 7. It should be noted that the preset highest value may also be set lower than the top of the target object's head, and the preset lowest value may also be set higher than the target object's soles. In the process of the centering position moving toward the target object, the centering position may be controlled to move at a constant speed at the target speed, or may be controlled to move at a variable speed.
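The variable-speed alternative mentioned above can be realized, for example, with an ease-out step, where each frame covers a fixed fraction of the remaining distance (the fraction is an assumed tuning parameter, not specified by the patent):

```python
def move_eased(pos, target, fraction=0.25):
    """One variable-speed step: the centering position covers a fixed
    fraction of the remaining distance, so it decelerates as it nears
    the target."""
    return (pos[0] + (target[0] - pos[0]) * fraction,
            pos[1] + (target[1] - pos[1]) * fraction)
```

Applied once per frame, this gives fast initial adsorption and a soft landing on the shooting position, in contrast to the constant-speed option.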
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
According to another aspect of the embodiments of the present invention, there is also provided an operation control apparatus for implementing the above-described operation control method. As shown in fig. 8, the apparatus includes:
an obtaining unit 801, configured to obtain a shooting instruction triggered by performing an operation on a human-computer interaction interface, where the shooting instruction is used to request a target object in a game task to perform a shooting operation;
a first determining unit 802, configured to determine, in response to a shooting instruction, a current alignment position of a shooting operation;
a first control unit 803 for controlling the isocenter position to move toward the target object in a case where the isocenter position is located in a suction area matched with the target object, wherein the suction area is a sighting area set for the target object.
By implementing the operation control device shown in fig. 8, a shooting instruction triggered by an operation performed on the human-computer interaction interface is acquired, the current centering position of the shooting operation is determined in response to the shooting instruction, and the centering position is controlled to move toward the target object in a case where the centering position is located in the adsorption area matched with the target object. By setting the adsorption area, this process enables a centering position located in the adsorption area to automatically move toward the target object, thereby reducing the control complexity of the shooting operation.
As an alternative embodiment, please refer to fig. 9, fig. 9 is a schematic structural diagram of another operation control apparatus, and as shown in fig. 9, the first control unit 803 includes:
a first determination module 8031 configured to determine a shooting adsorption mode set for the game task;
a second determining module 8032, configured to determine, according to the shooting adsorption mode, a shooting position corresponding to the shooting operation in the adsorption area;
a control module 8033 configured to control the centering position to move to a shooting position, where the shooting position is located in an object display area corresponding to the target object.
Through the implementation of the optional implementation mode, the shooting position corresponding to the shooting operation can be determined according to different shooting adsorption modes, and the operation control is more complete and comprehensive.
As an alternative embodiment, the second determining module 8032 includes:
a first determining submodule 80321 for determining a first position in an object display area corresponding to the target object in the adsorption area, in a case where the shooting adsorption mode is the point adsorption game mode; taking the first position as a shooting position;
a second determining sub-module 80322 for determining a second position according to the height value of the isocenter position and the object height of the target object in the case where the shooting adsorption mode is the line segment adsorption game mode; the second position is taken as the shooting position.
By implementing this alternative embodiment, the shooting adsorption mode may be the point adsorption game mode or the line segment adsorption game mode; the first position can be determined according to the point adsorption game mode and used as the shooting position, or the second position can be determined according to the line segment adsorption game mode and used as the shooting position, so that shooting position determination in different modes is realized and the operation control is more complete and comprehensive.
As an optional implementation manner, the manner that the first determining submodule 80321 is used to determine the first position in the object display area corresponding to the target object in the adsorption area is specifically:
the first determining sub-module 80321 is configured to determine that the center position in the object display area is the first position.
By implementing such an embodiment, the center position in the object display area can be taken as the first position, so that the shooting position tends to the center point of the shooting target, and the hit rate is higher.
As an optional implementation manner, the manner for determining the second position according to the height of the isocenter position and the object height of the target object by the second determining sub-module 80322 is specifically as follows:
a second determination sub-module 80322 for determining a position in the object display area that matches the first height value as a second position in the case where the height value of the isocenter position is higher than the first height value; determining a position in the object display area that matches the second height value as a second position in the case where the height value of the isocenter position is lower than the second height value; in a case where the height value of the isocenter position is lower than the first height value and higher than the second height value, a position in the object display area that matches the height value of the isocenter position is determined as the second position.
By implementing this alternative embodiment, the second position can be determined according to the height value of the centering position aimed by the user, which improves the realism of the user's game experience; moreover, a center height value that exceeds the preset line segment area is returned to an endpoint of the preset line segment area, so that the range of the second position is controllable, further improving the user's game experience.
As an optional implementation manner, the manner for the first control unit 803 to control the movement of the isocenter position to the target object is specifically:
a first control unit 803 for controlling the isocenter position to move to the target object at the target speed.
As an optional implementation, the apparatus may further include:
a second determining unit 804, configured to determine, in a process of controlling the center of gravity position to move to the target object, a target position after the target object moves when the position of the target object is detected to move;
an obtaining unit 805, configured to update, according to the target position, an adsorption area matched with the target object, to obtain an updated adsorption area;
a second control unit 806 configured to control the isocenter position to stop moving to the target object when the isocenter position is separated from the updated adsorption area;
a third control unit 807 for controlling the isocenter position to move to the target object located on the target position in the case where the isocenter position is located in the updated adsorption area.
By implementing the optional implementation mode, the adsorption area matched with the target object can be continuously updated in the process of controlling the movement of the alignment position to the target object, the alignment position is controlled to stop moving to the target object under the condition that the alignment position is separated from the adsorption area, and the alignment position is controlled to move to the target object under the condition that the alignment position is located in the adsorption area, so that the movement real-time performance of the alignment position is higher.
As an alternative embodiment, the determining of the current alignment position of the shooting operation by the first determining unit 802 may include:
the first determination unit 802 generates a shooting ray in the current muzzle direction with the position of the gun branch in the shooting operation as an end point.
The first determination unit 802 determines a position at which the shooting ray is directed as a current isocenter position.
By implementing the optional implementation mode, the current centroid position can be determined by generating the shooting ray, the operation is convenient and fast, the accuracy is high, and the centroid position can be determined more accurately.
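The intersection test behind the ray detection is not specified by the patent; a common way to check whether the shooting ray (from the firearm position along the muzzle direction) hits a target's bounds is the axis-aligned slab test, sketched here as an assumed implementation:

```python
def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: return the distance t along the ray at which it first
    enters the axis-aligned box, or None if the ray misses the box."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:                 # ray parallel to this axis
            if o < lo or o > hi:
                return None                # outside the slab: no hit
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:
                return None                # slab intervals disjoint: miss
    return t_near
```

If the ray hits, origin + t * direction gives the point the ray is directed at, which can then serve as the current centering position.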
As an optional implementation manner, the first determining unit 802 is further configured to obtain a position coordinate where the shooting target is located, where the position coordinate is a coordinate corresponding to a center position of the shooting target; adding a first preset value to the ordinate of the position coordinate, and adding a second preset value to the abscissa of the position coordinate to obtain a first coordinate; adding a first preset value to the ordinate of the position coordinate, and subtracting a second preset value from the abscissa of the position coordinate to obtain a second coordinate; subtracting a first preset value from the ordinate of the position coordinate, and adding a second preset value to the abscissa of the position coordinate to obtain a third coordinate; subtracting a first preset value from the ordinate of the position coordinate, and subtracting a second preset value from the abscissa of the position coordinate to obtain a fourth coordinate; and obtaining the adsorption area according to the first coordinate, the second coordinate, the third coordinate and the fourth coordinate.
By implementing the optional implementation mode, the adsorption area can be obtained only by obtaining the coordinates of the position where the shooting target is located, the obtaining mode is simple and quick, and the efficiency of obtaining the preset adsorption area is improved.
As another alternative, the second determining sub-module 80322 is further configured to obtain coordinates of the center point position of the shooting target before determining the second position according to the height of the isocenter position and the object height of the target object; and subtracting a third preset value from the ordinate in the central point position coordinate to obtain a preset lowest value, and adding a fourth preset value to the ordinate in the central point position coordinate to obtain a preset highest value.
By implementing the optional implementation mode, the second shooting point can be adsorbed to the preset segment area according to the difference of the quasi-center height values in the segment adsorption game mode, some special shooting operations (such as shooting the head of a target object) can be completed, and the auxiliary shooting requirements of different game scenes are met.
According to yet another aspect of the embodiments of the present invention, there is also provided an electronic device for implementing the operation control method, as shown in fig. 10, the electronic device includes a memory 1002 and a processor 1004, the memory 1002 stores therein a computer program, and the processor 1004 is configured to execute the steps in any one of the method embodiments by the computer program.
Optionally, in this embodiment, the electronic apparatus may be located in at least one network device of a plurality of network devices of a computer network.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, acquiring a shooting instruction triggered by the operation executed on the human-computer interaction interface, wherein the shooting instruction is used for requesting to execute the shooting operation on the target object in the game task;
s2, responding to the shooting instruction, and determining the current centering position of the shooting operation;
and S3, controlling the quasi-center position to move towards the target object under the condition that the quasi-center position is located in a suction area matched with the target object, wherein the suction area is an aiming area set for the target object.
Alternatively, it can be understood by those skilled in the art that the structure shown in fig. 10 is only an illustration, and the electronic device may also be a terminal device such as a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palm computer, a Mobile Internet Device (MID), a PAD, and the like. Fig. 10 does not limit the structure of the electronic device. For example, the electronic device may also include more or fewer components (e.g., network interfaces, etc.) than shown in fig. 10, or have a different configuration from that shown in fig. 10.
The memory 1002 may be used to store software programs and modules, such as program instructions/modules corresponding to the target-adsorption-based shooting method and apparatus in the embodiment of the present invention, and the processor 1004 executes various functional applications and data processing, i.e., implements the operation control method described above, by executing the software programs and modules stored in the memory 1002. The memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 1002 may further include memory located remotely from the processor 1004, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. The memory 1002 may be used to store information such as operation instructions, but is not limited to this. As an example, as shown in fig. 10, the memory 1002 may include, but is not limited to, an obtaining unit 801, a first determining unit 802, and a first control unit 803 in the operation control device, and may further include, but is not limited to, other module units in the operation control device, which is not described in detail in this example.
Optionally, the above-mentioned transmission device 1006 is used for receiving or sending data via a network. Examples of the network may include a wired network and a wireless network. In one example, the transmission device 1006 includes a Network adapter (NIC) that can be connected to a router via a Network cable and other Network devices so as to communicate with the internet or a local area Network. In one example, the transmission device 1006 is a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
In addition, the electronic device further includes: a display 1008 for displaying a human-computer interaction interface; and a connection bus 1010 for connecting the respective module parts in the above-described electronic apparatus.
According to a further aspect of embodiments of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above-mentioned method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring a shooting instruction triggered by an operation performed on a human-computer interaction interface, wherein the shooting instruction is used for requesting that a shooting operation be performed on a target object in a game task;
S2, in response to the shooting instruction, determining the current crosshair position of the shooting operation;
and S3, controlling the crosshair position to move toward the target object in a case where the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object.
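The adsorption behaviour of steps S1 to S3, together with the point and line-segment adsorption modes described later, can be sketched as follows. This is a minimal 2D screen-space illustration only, not code from the patent: `Rect`, `shooting_position`, `step_crosshair`, and every parameter name are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Hypothetical axis-aligned adsorption area set around the target."""
    left: float
    right: float
    bottom: float
    top: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.bottom <= y <= self.top


def shooting_position(mode: str, crosshair_y: float, area: Rect,
                      target_center: tuple) -> tuple:
    """Pick the position the crosshair is pulled toward.

    "point" mode: the center of the target's display area.
    "segment" mode: keep the crosshair's own height, clamped to the
    target's vertical extent (the first/second height values).
    """
    cx, cy = target_center
    if mode == "point":
        return (cx, cy)
    y = min(max(crosshair_y, area.bottom), area.top)  # clamp the height
    return (cx, y)


def step_crosshair(crosshair: tuple, target_center: tuple, area: Rect,
                   mode: str = "point", speed: float = 0.2) -> tuple:
    """One frame of aim assist: if the crosshair lies inside the
    adsorption area, move it a fraction `speed` of the way toward the
    shooting position; otherwise leave it unchanged."""
    x, y = crosshair
    if not area.contains(x, y):
        return crosshair
    sx, sy = shooting_position(mode, y, area, target_center)
    return (x + speed * (sx - x), y + speed * (sy - y))
```

With `speed=1.0` the crosshair snaps to the shooting position in one frame; smaller values give the gradual pull at a target speed that the embodiments also describe.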
Alternatively, in this embodiment, a person skilled in the art may understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing the relevant hardware of the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to execute all or part of the steps of the methods according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division into units is merely a division by logical function, and an actual implementation may use another division: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or take another form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (11)

1. An operation control method characterized by comprising:
acquiring a shooting instruction triggered by an operation performed on a human-computer interaction interface, wherein the shooting instruction is used for requesting that a shooting operation be performed on a target object in a game task;
in response to the shooting instruction, determining the current crosshair position of the shooting operation;
controlling the crosshair position to move toward the target object in a case where the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object;
the controlling the crosshair position to move toward the target object comprises: determining a shooting adsorption mode set for the game task; determining, according to the shooting adsorption mode, a shooting position corresponding to the shooting operation in the adsorption area; and controlling the crosshair position to move to the shooting position, wherein the shooting position is located in an object display area corresponding to the target object;
the determining, according to the shooting adsorption mode, a shooting position corresponding to the shooting operation in the adsorption area comprises:
determining a first position in the object display area corresponding to the target object in the adsorption area when the shooting adsorption mode is a point adsorption game mode; taking the first position as the shooting position;
determining a second position according to the height value of the crosshair position and the object height of the target object in a case where the shooting adsorption mode is a line segment adsorption game mode; and taking the second position as the shooting position.
2. The method of claim 1, wherein the determining a first position in the object display area corresponding to the target object in the adsorption area comprises:
and determining that the central position in the object display area is the first position.
3. The method of claim 1, wherein the determining a second position according to the height value of the crosshair position and the object height of the target object comprises:
determining a position in the object display area that matches a first height value as the second position in a case where the height value of the crosshair position is higher than the first height value;
determining a position in the object display area that matches a second height value as the second position in a case where the height value of the crosshair position is lower than the second height value;
and determining a position in the object display area that matches the height value of the crosshair position as the second position in a case where the height value of the crosshair position is lower than the first height value and higher than the second height value.
4. The method of claim 1, wherein the controlling the crosshair position to move toward the target object comprises:
and controlling the crosshair position to move toward the target object at a target speed.
5. The method according to any one of claims 1 to 4, wherein the controlling the crosshair position to move toward the target object further comprises:
in a case where it is detected that the position of the target object moves, determining a target position of the target object after the movement;
updating the adsorption area matched with the target object according to the target position to obtain an updated adsorption area;
controlling the crosshair position to stop moving toward the target object in a case where the crosshair position is outside the updated adsorption area; and
controlling the crosshair position to move toward the target object located at the target position in a case where the crosshair position is located in the updated adsorption area.
6. An operation control device characterized by comprising:
an acquisition unit, configured to acquire a shooting instruction triggered by an operation performed on a human-computer interaction interface, wherein the shooting instruction is used for requesting that a shooting operation be performed on a target object in a game task;
a first determining unit, configured to determine, in response to the shooting instruction, the current crosshair position of the shooting operation;
a first control unit, configured to control the crosshair position to move toward the target object in a case where the crosshair position is located in an adsorption area matched with the target object, wherein the adsorption area is an aiming area set for the target object;
the first control unit includes:
a first determining module, configured to determine a shooting adsorption mode set for the game task;
a second determining module, configured to determine, according to the shooting adsorption mode, a shooting position corresponding to the shooting operation in the adsorption area;
a control module, configured to control the crosshair position to move to the shooting position, wherein the shooting position is located in an object display area corresponding to the target object;
the second determining module includes:
a first determination submodule configured to determine a first position in the object display area corresponding to the target object in the adsorption area, when the shooting adsorption mode is a point adsorption game mode; taking the first position as the shooting position;
a second determining submodule, configured to determine a second position according to the height value of the crosshair position and the object height of the target object in a case where the shooting adsorption mode is a line segment adsorption game mode, and take the second position as the shooting position.
7. The apparatus according to claim 6, wherein the first determining submodule is configured to determine the first position in the object display area corresponding to the target object in the adsorption area by:
the first determining submodule is configured to determine that a central position in the object display area is the first position.
8. The apparatus of claim 6, wherein the second determining submodule is configured to determine the second position according to the height value of the crosshair position and the object height of the target object by:
the second determining submodule is configured to: determine a position in the object display area that matches a first height value as the second position in a case where the height value of the crosshair position is higher than the first height value; determine a position in the object display area that matches a second height value as the second position in a case where the height value of the crosshair position is lower than the second height value; and determine a position in the object display area that matches the height value of the crosshair position as the second position in a case where the height value of the crosshair position is lower than the first height value and higher than the second height value.
9. The apparatus according to claim 6, wherein the first control unit is configured to control the crosshair position to move toward the target object by:
the first control unit being configured to control the crosshair position to move toward the target object at a target speed.
10. A storage medium comprising a stored program, wherein the program when executed performs the method of any of claims 1 to 5.
11. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to execute the method of any of claims 1 to 5 by means of the computer program.
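The moving-target behaviour of claim 5 (rebuilding the adsorption area after the target moves, then either continuing or stopping the pull) can be sketched as follows. This is an illustrative assumption, not the claimed method itself: `make_area`, `in_area`, `track_target`, and the box half-sizes are all hypothetical names and values.

```python
def make_area(center, half_w=3.0, half_h=2.0):
    """Hypothetical: rebuild the adsorption area as an axis-aligned
    box around the target's new position."""
    cx, cy = center
    return (cx - half_w, cx + half_w, cy - half_h, cy + half_h)


def in_area(area, x, y):
    """Test whether point (x, y) lies inside the (left, right, bottom, top) box."""
    left, right, bottom, top = area
    return left <= x <= right and bottom <= y <= top


def track_target(crosshair, new_center, speed=0.5):
    """After the target moves: update the adsorption area at the new
    position; stop the pull if the crosshair falls outside the updated
    area, otherwise keep moving the crosshair toward the target."""
    area = make_area(new_center)
    x, y = crosshair
    if not in_area(area, x, y):
        return crosshair          # crosshair detached from the area: stop
    tx, ty = new_center
    return (x + speed * (tx - x), y + speed * (ty - y))
```

Called once per frame with the target's latest position, this reproduces the two branches of claim 5: continue pulling while the crosshair stays inside the updated adsorption area, stop as soon as it leaves.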
CN201911143167.2A 2019-11-20 2019-11-20 Operation control method, operation control device, storage medium, and electronic device Active CN110935173B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911143167.2A CN110935173B (en) 2019-11-20 2019-11-20 Operation control method, operation control device, storage medium, and electronic device


Publications (2)

Publication Number Publication Date
CN110935173A CN110935173A (en) 2020-03-31
CN110935173B true CN110935173B (en) 2021-09-10

Family

ID=69907072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911143167.2A Active CN110935173B (en) 2019-11-20 2019-11-20 Operation control method, operation control device, storage medium, and electronic device

Country Status (1)

Country Link
CN (1) CN110935173B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112764654B (en) * 2021-01-29 2022-10-25 北京达佳互联信息技术有限公司 Component adsorption operation method and device, terminal and storage medium
CN115721925A (en) * 2021-08-30 2023-03-03 网易(杭州)网络有限公司 Interaction method and device of virtual objects, storage medium and electronic equipment

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150375110A1 (en) * 2014-06-30 2015-12-31 AlternativaPlatform Ltd. Systems and methods for shooting in online action games using automatic weapon aiming
CN105148520A (en) * 2015-08-28 2015-12-16 上海甲游网络科技有限公司 Method and device for automatic aiming of shooting games
CN107029428B (en) * 2016-02-04 2020-06-19 网易(杭州)网络有限公司 Control system, method and terminal for shooting game
CN110147159B (en) * 2017-09-21 2022-07-29 腾讯科技(深圳)有限公司 Target positioning method and device in virtual interaction scene and electronic equipment
CN109529327B (en) * 2017-09-21 2022-03-04 腾讯科技(深圳)有限公司 Target positioning method and device in virtual interaction scene and electronic equipment
CN107885417B (en) * 2017-11-03 2021-02-02 腾讯科技(深圳)有限公司 Target positioning method, device and computer readable storage medium in virtual environment
CN108404407B (en) * 2018-01-05 2021-05-04 网易(杭州)网络有限公司 Auxiliary aiming method and device in shooting game, electronic equipment and storage medium
CN110075521A (en) * 2019-05-22 2019-08-02 努比亚技术有限公司 Pressure rifle householder method, device, mobile terminal and the storage medium of shooting game
CN110404251A (en) * 2019-08-29 2019-11-05 网易(杭州)网络有限公司 The method and device aimed at is assisted in game



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code; Ref country code: HK; Ref legal event code: DE; Ref document number: 40022503; Country of ref document: HK
GR01 Patent grant