WO2022061712A1 - UAV battle method, UAV battle control device, UAV, and storage medium - Google Patents

UAV battle method, UAV battle control device, UAV, and storage medium

Info

Publication number
WO2022061712A1
WO2022061712A1 PCT/CN2020/117741 CN2020117741W
Authority
WO
WIPO (PCT)
Prior art keywords
battle
target
combat
processor
drone
Prior art date
Application number
PCT/CN2020/117741
Other languages
English (en)
French (fr)
Inventor
翁松伟
付洁
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/117741 priority Critical patent/WO2022061712A1/zh
Priority to CN202080008657.6A priority patent/CN113395999A/zh
Publication of WO2022061712A1 publication Critical patent/WO2022061712A1/zh

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/843 Special adaptations for executing a specific game genre or game mode involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/307 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for displaying an additional window with a view from the top of the game field, e.g. radar screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 Driving on land or water; Flying

Definitions

  • The invention relates to the technical field of UAV competitive games, and in particular to a UAV battle method, a UAV battle control device, a UAV, and a storage medium, wherein the UAV battle control device may be a terminal device used for battle control, such as smart glasses, a smartphone, a tablet, or a remote controller.
  • the model drones are also easily damaged in simulated air combat games, resulting in a great waste of resources.
  • An embodiment of the present invention provides a drone battle scheme that uses existing drones for game competition without adding hardware, thereby broadening the application fields and scope of use of drones and increasing their entertainment value and audience.
  • an embodiment of the present invention provides a drone battle method, which is suitable for a scenario in which a drone is used to play an aerial combat game, and the method includes:
  • a target tracking map is generated and output for display according to the first battle parameter information and the second battle parameter information, wherein the target tracking map displays real-time azimuth information of the battle target.
  • an embodiment of the present invention provides a drone battle control device for controlling the drone to perform an air combat competitive game, which includes:
  • processor for executing executable instructions stored in a memory that, when executed by the processor, cause the processor to:
  • the target tracking map is generated according to the first battle parameter information and the second battle parameter information, and is output and displayed by the display module, wherein the target tracking map displays the real-time orientation information of the battle target.
  • an embodiment of the present invention provides an unmanned aerial vehicle, which can be used in an air combat competitive game, including:
  • Airborne positioning module
  • processor for executing executable instructions stored in a memory that, when executed by the processor, cause the processor to:
  • a target tracking map is output at the operating terminal, wherein the target tracking map displays real-time orientation information of the battle target.
  • An embodiment of the present invention provides an electronic device, which includes at least one processor and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the steps of the above-described method.
  • the present invention provides a storage medium on which a computer program is stored, which implements the steps of the above method when the program is executed by a processor.
  • The beneficial effects of the embodiments of the present invention are as follows: according to the solution provided by the present invention, existing UAV equipment can be brought into a competitive game mode by mutually transmitting battle parameter information with the battle target, so that the competitive game function is realized between the UAVs participating in the mutual transmission of battle parameters without any hardware modification of the existing drone.
  • The solution of the present invention can also generate and output a virtual target tracking map of the drone based on the opponent's and its own battle parameter information; the virtual target tracking map shows the azimuth difference between the UAV and the battle target, which provides the user with flight guidance, so that the user does not need to judge and control the flight direction by direct line of sight, avoiding blind flying.
  • Furthermore, displaying the orientation of the battle target to the user in real time by outputting the virtual target tracking map can also improve the user's sense of participation in competitive games using the drone and enhance the fun of the game.
  • FIG. 1 is a schematic block diagram of a drone battle control device according to an embodiment of the present invention.
  • FIG. 2 is a flow chart of a UAV combat method according to an embodiment of the present invention.
  • FIG. 3 is a virtual radar display effect diagram according to an embodiment of the present invention.
  • FIG. 4 is a flow chart of a UAV combat method according to another embodiment of the present invention.
  • FIG. 5 is an effect diagram showing the locked state of the drone battle simulation according to an embodiment of the present invention.
  • FIG. 6 is a flow chart of a UAV combat method according to another embodiment of the present invention.
  • FIG. 7 is a schematic block diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of an embodiment of an electronic device of the present invention.
  • the invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, elements, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including storage devices.
  • module refers to relevant entities applied to a computer, such as hardware, a combination of hardware and software, software or software in execution, and the like.
  • an element may be, but is not limited to, a process running on a processor, a processor, an object, an executable element, a thread of execution, a program, and/or a computer.
  • For example, both an application or script program running on a server and the server itself can be a component.
  • One or more elements may reside within a process and/or thread of execution, and an element may be localized on one computer and/or distributed between two or more computers and may be executed from various computer-readable media.
  • Elements may also communicate by way of local and/or remote processes, for example through a signal having one or more data packets, such as a signal from one element interacting with another element in a local system or distributed system, and/or interacting with other systems across a network such as the Internet.
  • The UAV battle method and device in the embodiments of the present invention can be used in scenarios where a UAV is controlled to play competitive games. They can be configured on a terminal device, where the terminal device has a display screen or can project a display interface that is used to display the corresponding screens and interact with the user, so that the terminal device can be used to control the drone in a competitive game.
  • Such terminal devices can be, for example, smart hardware such as a smartphone, a tablet computer, a PC, smart glasses, or a remote controller.
  • Of course, the drone battle method and device in the embodiments of the present invention can also be configured directly on the drone, with battle control operations for the drone competitive game performed by the processor on the drone together with the operation terminal paired with the drone; this is not limited in the present invention.
  • FIG. 1 schematically shows a drone battle control device according to an embodiment of the present invention.
  • the control device in this embodiment at least includes a display module 10, a memory 20 for storing executable instructions, and a processor 30 connected in communication with the display module 10 and the memory 20, respectively, for executing executable instructions stored in the memory.
  • The drone battle control device may be a smartphone, a tablet computer, smart glasses, or a remote controller, so that the processor 30 can execute the corresponding executable instructions to realize battle control of the drone in an air combat game.
  • the executable instructions may specifically be program instructions corresponding to the UAV combat method below.
  • FIG. 2 schematically shows the flow of the UAV battle method according to an embodiment of the present invention.
  • the execution body of the battle method is the processor 30 of the UAV battle control device shown in FIG. 1 .
  • the method of this embodiment includes the following steps:
  • Step S201: Acquire the first battle parameter information in real time, and transmit the first battle parameter information to the battle target in real time.
  • Step S202: Receive the second battle parameter information from the battle target in real time.
  • Step S203: Generate a target tracking map according to the first battle parameter information and the second battle parameter information and output it for display, wherein the target tracking map displays real-time orientation information of the battle target.
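  • As a purely illustrative, non-limiting sketch of how steps S201 to S203 could be arranged on the battle control device, the following Python pseudocode assumes hypothetical telemetry, link, and display helpers (read_own_parameters, send_to_opponent, receive_from_opponent, render_tracking_map) that are not defined by this disclosure:

```python
import time

def battle_loop(link, telemetry, display, period_s=0.1):
    """Repeat steps S201-S203: exchange battle parameters and redraw the tracking map."""
    while True:
        # S201: acquire own (first) battle parameter information and send it to the battle target.
        first_params = telemetry.read_own_parameters()   # position, model, altitude, speed, status...
        link.send_to_opponent(first_params)

        # S202: receive the battle target's (second) battle parameter information.
        second_params = link.receive_from_opponent(timeout_s=period_s)

        # S203: generate and output the target tracking map showing the target's real-time bearing.
        if second_params is not None:
            display.render_tracking_map(own=first_params, target=second_params)

        time.sleep(period_s)
```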
  • the UAV generally establishes a communication relationship with the paired operation terminal, so that the user can control the corresponding UAV through the operation terminal, such as operation control of the flight direction.
  • The UAV battle control device in the embodiments of the present invention can establish a communication connection with the corresponding UAV using existing technology, so as to realize flight control of the UAV and obtain data returned by the UAV through the image transmission function, for example live images.
  • the determination of the battle target is also achieved through the communication connection relationship between the UAV battle control devices for controlling the UAV.
  • A communication relationship is established in advance between the UAV battle control devices participating in the battle, and based on this communication relationship data is mutually transmitted between them, including, in step S201, obtaining the first battle parameter information and sending it to the battle target, and, in step S202, receiving the second battle parameter information sent by the battle target.
  • The terms first battle parameter information and second battle parameter information in the embodiments of the present invention merely distinguish whether the battle parameter information belongs to the own UAV or to the battle target: the first battle parameter information refers to the battle parameter information of the own UAV, and the second battle parameter information refers to the battle parameter information of the battle target. The naming only distinguishes the corresponding UAV entities.
  • The contents of the first battle parameter information and the second battle parameter information may be the same or different, which is not limited in this embodiment of the present invention. Since real-time data transmission is performed between the UAV battle control devices participating in the battle, each device, while sending its own battle parameter information (i.e., the first battle parameter information) to the other party, also receives the real-time battle parameter information (i.e., the second battle parameter information) sent by the other party.
  • The real-time battle parameter information in the embodiments of the present invention refers to the data corresponding to the parameter items used to describe the battle, including but not limited to the location information of the UAV (such as longitude and latitude), the model of the UAV, the flight information of the UAV (such as flight altitude and flight speed), and the use of special combat skills by the UAV (such as whether a smoke-interference command has been issued to hide).
  • The combat status information of the UAV (such as whether it is locked or whether it has been destroyed) can also be carried in the real-time battle parameter information sent to the other party, or it can be sent directly to the other party in real time as needed.
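  • By way of illustration only, the battle parameter information described above could be carried in a structure such as the following sketch; the field names are assumptions made for the example and are not prescribed by this disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BattleParameters:
    """One party's real-time battle parameter information, exchanged in steps S201/S202."""
    latitude: float                 # location information of the UAV
    longitude: float
    altitude_m: float               # flight information: flight altitude
    speed_mps: float                # flight information: flight speed
    uav_model: str                  # model of the UAV (used later for object detection)
    smoke_hidden: bool = False      # special combat skill: smoke interference / hidden state
    locked: bool = False            # combat status: whether the opponent is currently locked
    destroyed: bool = False         # combat status: whether this UAV has been destroyed
    timestamp_s: Optional[float] = None
```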
  • Establishing a communication relationship in advance between the UAV battle control devices participating in the battle can be implemented as follows: when the UAV is to be used for a competitive game, before steps S201 and S202 are performed, the user manually establishes a target matching relationship between the drones that are to participate in the competition, so as to determine, through target matching, the target drone to compete against (hereinafter collectively referred to as the battle target in the embodiments of the present invention).
  • The battle target may be matched through a network, or through a close-range channel.
  • A smartphone generally includes a communication module, so network matching can be realized through the communication module of the smartphone to determine the battle target.
  • The network matching using the communication module of the smartphone may be performed as follows: the user information of the desired battle target, such as a user ID and the corresponding network information, is entered on the corresponding user interface, so as to establish a communication relationship with the designated user's smartphone through the smartphone's network.
  • The smart glasses in this embodiment of the present invention further include a network module that supports Internet access through a SIM card, so network matching can be realized through the network module of the smart glasses to determine the battle target.
  • The network matching using the SIM-card-capable network module of the smart glasses may be performed as follows: the user information of the desired battle target, such as a user ID and the corresponding network information, is entered on the corresponding input interface of the smart glasses, so as to establish a communication relationship with the designated user's smart glasses through the network provided by the SIM card.
  • The remote controller in this embodiment of the present invention can also communicate with a smartphone, so as to use the smartphone's network signal to realize network matching and determine the battle target.
  • The network matching using the remote controller may be performed as follows: first, a connection is established between the remote controller and the smartphone through Bluetooth, WiFi, or similar functions; then the user information of the desired battle target, such as a user ID and the corresponding network information, is entered through the remote controller and sent via the smartphone to the designated user's UAV battle control device, so as to establish a communication relationship with that device.
  • The smart glasses in the embodiments of the present invention can also realize close-range channel matching through the Bluetooth module of the smart glasses to determine the battle target; specifically, a communication relationship is established with the designated user's drone battle control device through Bluetooth.
  • The smart glasses in the embodiments of the present invention can also realize close-range channel matching through a physical connection between the smart glasses. Specifically, the control terminals (smart glasses) corresponding to the drones participating in the battle are connected to each other through a physical connection, so as to establish a communication relationship between the corresponding UAV battle control devices through that physical connection.
  • The communication relationship between the UAV battle control devices participating in the battle can also be established through other communication methods; the purpose is simply to enable data communication between the UAV battle control devices participating in the battle so that the data mutual-transmission operations of step S201 and step S202 can be performed between them.
  • The way of establishing the communication relationship between the UAV battle control devices is therefore not limited, as long as data can be exchanged between the UAV battle control devices participating in the battle.
  • For step S201 and step S202, it is preferable that the communication between the UAV battle control devices participating in the battle uses a communication protocol different from the one used between a UAV battle control device and its UAV: that is, if a first communication protocol is used between the UAV and the UAV battle control device, the UAV battle control devices participating in the battle preferably adopt a second communication protocol different from the first communication protocol.
  • Of course, the first communication protocol can also be used for communication between the UAV battle control devices participating in the battle to achieve the purpose of the embodiments of the present invention.
  • the target tracking map in step S203 can be implemented as a virtual radar map
  • the first battle parameter information includes the position information of the own drone
  • the second battle parameter information includes the position information of the battle target.
  • A virtual radar map that displays the real-time relative position of the battle target, centered on the position of the own UAV, can be drawn and output, so that the user can see the relative position of the battle target in real time through the virtual radar map, making it convenient to purposefully control the flight direction of the own drone, accurately and quickly pursue the target, and avoid blind flight.
  • the position information of the combat drones can be obtained according to the positioning module loaded on the drones participating in the combat.
  • The battle control device of each drone can obtain the real-time GPS positioning information of its own UAV from the UAV's onboard GPS through the communication relationship with its own UAV, and send this positioning information to the battle target through the communication relationship with the battle target's battle control device.
  • In this way, the participating parties can obtain the position information of the battle target in real time.
  • the real-time relative position of the combat target can be determined according to the position information of the own UAV, and based on the real-time relative position, a virtual radar map can be drawn and output.
  • A virtual radar map can be drawn by taking the position of the own UAV as the center origin and using the azimuth difference between the own UAV and the battle target as the coordinate reference: the own UAV is displayed at the center origin of the radar, and the real-time relative position of the battle target is displayed according to the position and azimuth difference between the battle target and the own UAV. The user can thus clearly see the real-time relative position of the battle target through the UAV battle control device and control the flight direction of the own drone more purposefully, avoiding blind flying.
  • For example, the position information obtained by the onboard positioning module of the UAV includes its latitude and longitude data; the azimuth difference between the two UAVs can then be calculated from these latitude and longitude data, and the latitude and longitude position of the own UAV is converted into the origin coordinates (0, 0).
  • Figure 3 shows the display effect of a virtual radar map in one implementation. As shown in Figure 3, a virtual radar map with the position of the own drone as the center origin is presented in the lower right corner of the displayed real-time shooting picture, and the user can clearly see the azimuth difference between the own UAV and the battle target from the virtual radar map.
  • the drawn virtual radar chart may not be limited to a two-dimensional image, and may, for example, also be a three-dimensional virtual radar chart.
  • This can be implemented as follows: the position information obtained by the onboard positioning module of the UAV includes its latitude, longitude, and flight-altitude data, so the azimuth difference and altitude difference between the two UAVs can be calculated from these data. The latitude, longitude, and flight altitude of the own UAV are converted into the origin coordinates (0, 0, 0), a three-dimensional radar map is drawn based on these origin coordinates, the three-dimensional coordinates of the battle target in the radar map are determined from the azimuth and altitude differences between the two, and the battle target is then drawn at the corresponding display position in the three-dimensional radar map based on those coordinates.
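  • The azimuth and altitude differences mentioned above can, for example, be computed from the exchanged latitude, longitude, and altitude values with a simple local-tangent-plane approximation, as in the following illustrative sketch (not a formula prescribed by this disclosure):

```python
import math

EARTH_RADIUS_M = 6371000.0

def target_offset(own_lat, own_lon, own_alt, tgt_lat, tgt_lon, tgt_alt):
    """Return (east_m, north_m, up_m) of the battle target relative to the own UAV,
    treating the own UAV as the origin (0, 0, 0) of the radar map."""
    d_lat = math.radians(tgt_lat - own_lat)
    d_lon = math.radians(tgt_lon - own_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(own_lat))
    up = tgt_alt - own_alt
    return east, north, up

def azimuth_and_range(east, north):
    """Azimuth in degrees clockwise from north, plus horizontal range in metres."""
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    return azimuth, math.hypot(east, north)
```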
  • the target tracking map in step S203 can be implemented as including a lock icon and a battle target display icon
  • the first battle parameter information includes the position information of its own drone
  • the second battle parameter information includes position information of the combat target.
  • The position information of a battle drone can be obtained from the positioning module carried on the drone participating in the battle, and the positioning information is sent to the battle target through the communication relationship with the battle target's battle control device, so that the participating parties can obtain the location information of the battle target in real time.
  • the real-time relative orientation of the battle target can be determined according to the position information of the own UAV, and the lock icon and the battle target display icon are output based on the real-time relative orientation.
  • The battle target display icon is output and displayed at the real-time relative orientation with respect to the lock icon.
  • For example, the location information obtained by the onboard positioning module of the UAV includes its latitude and longitude data; based on these data, the real-time relative orientation of the battle target can be obtained, where the real-time relative orientation refers to an identification of the direction of the battle target relative to the own UAV, i.e., whether the battle target is in front of, behind, to the left of, or to the right of the own UAV.
  • The battle target display icon can then be displayed according to this relative orientation: when it is determined that the battle target is in front of the own drone, the battle target display icon is displayed in front of the lock icon; when it is determined that the battle target is to the left of the own drone, the battle target display icon is displayed to the left of the lock icon.
  • The lock icon in this embodiment refers to an icon used to frame the locked target (the box shown in FIG. 3 is an example of the lock icon): a target object within the range of the icon frame is locked, while a target object outside the range of the icon frame is not locked.
  • The battle target display icon in the embodiments of the present invention is a display icon used to identify the battle target; it can be, for example, the user name or UAV code of the battle target.
  • The battle target display icon can be displayed at the corresponding position relative to the lock icon, so that from the displayed icons the user can clearly see the azimuth difference between the own drone and the battle target, control the flight direction of the own drone more purposefully, and avoid blind flying.
  • the lock icon and the battle target display icon can also be displayed in the form of a three-dimensional image, and the referenced position information at this time also includes the flight altitude of the aircraft.
  • a tracking direction prompt message is also displayed in the lock icon according to the determined real-time relative orientation of the battle target.
  • The tracking direction prompt message may be text, an arrow, or a combination of text and arrows. Taking an arrow as an example: when it is determined that the battle target is behind the own drone, the battle target display icon is displayed behind the lock icon, and an arrow pointing backward is displayed in the lock icon to indicate to the user that the target is behind and should be pursued by flying backwards (e.g., by flipping backwards).
  • When the tracking prompt message is displayed in the lock icon, the lock icon can also be displayed in a color different from the one used for the locked state, to remind the user that the battle target is not yet locked.
  • For example, in the locked state the lock icon is displayed in red, while in the tracking state (for example, when the real-time relative orientation indicates that the battle target is behind the own drone) the battle target display icon is displayed on the lock icon and the lock icon is displayed in yellow.
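  • One possible way, offered only as an illustration, to quantize the bearing to the battle target into the front/rear/left/right orientation and to choose the lock icon colour described above is sketched below; the 45-degree quadrant boundaries and the use of the own UAV's heading are assumptions of the example:

```python
def relative_orientation(azimuth_to_target_deg, own_heading_deg):
    """Quantize the bearing to the battle target into front / right / rear / left
    relative to the own UAV's heading."""
    relative = (azimuth_to_target_deg - own_heading_deg) % 360.0
    if relative < 45 or relative >= 315:
        return "front"
    if relative < 135:
        return "right"
    if relative < 225:
        return "rear"
    return "left"

def lock_icon_color(target_locked: bool) -> str:
    # Locked state shown in red, tracking (not yet locked) state shown in yellow.
    return "red" if target_locked else "yellow"
```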
  • An operation module for triggering entry into the game mode can also be provided, so that the game mode of the drone is opened according to the user's game-mode entry input operation.
  • FIG. 4 schematically shows the flow of the UAV battle method according to an embodiment of the present invention.
  • the execution body of the battle method is the processor 30 of the UAV battle control device shown in FIG. 1 .
  • the method of this embodiment further includes:
  • Step S204: Acquire the battle target detection icon, and simulate locking of the battle target.
  • the battle target detection icon may specifically include a real-time image coordinate system, the coordinate position of the battle target in the real-time image, and a display icon.
  • Since the drone includes a camera module (such as a camera) and an image return module, a real-time camera picture can be obtained through the drone, and the real-time image can be sent back to the UAV battle control device through the image return function.
  • The processor 30 of the drone battle control device can then acquire the battle target detection icon based on the real-time image transmitted back by its own drone and simulate locking of the battle target.
  • This can be exemplarily implemented as follows:
  • The UAV is calibrated in advance and the calibration information is stored (for example, calibrated and stored in a database), wherein the UAV model calibration information includes the UAV model (which can be a model identification) and the fuselage shape (for example, a group of photos of fuselage shape views in different states together with the corresponding textual calibration information, such as whether there is a protective cover, the wing vibration shape, and whether there is a propeller).
  • The UAV model of the battle target can be learned from the second battle parameter information; that is, during the mutual data transmission, each party adds its own UAV model to its first battle parameter information and transmits it to the battle target.
  • The battle control device of the own UAV can thus learn the UAV model of the battle target from the received second battle parameter information.
  • the fuselage shape of the combat target can be learned from the pre-calibrated and stored calibration information. In this way, object detection and recognition can be performed on the real-time image returned by the UAV according to the obtained fuselage shape, so as to determine whether the image includes a combat target.
  • If the image includes the battle target, a battle target detection icon is generated, and locking of the battle target is simulated.
  • Generating the battle target detection icon and simulating locking of the battle target can be implemented, for example, as follows: a real-time image coordinate system is established from the pixel points of the real-time image, for example with the lower-left corner as the origin and the pixel as the basic coordinate measurement unit; according to the position of the battle target detected in the real-time image by object detection and recognition, i.e., the corresponding pixel matrix position, the coordinate position of the battle target in the real-time image coordinate system is determined; this coordinate position is used as the display position of the battle target display icon, and the battle target display icon is output and displayed at that position in a locked state.
  • Outputting and displaying the battle target display icon at the display position in a locked state means displaying the battle target display icon inside the lock icon at that display position.
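  • A minimal illustrative sketch of mapping a detected bounding box into the real-time image coordinate system (origin at the lower-left corner, pixels as units) and displaying the lock icon there is given below; the bounding-box format and the display methods are assumptions, not part of this disclosure:

```python
def detection_to_display_position(bbox, image_height):
    """Convert a detection bounding box given in the common top-left image convention
    (x, y, width, height) into the centre point of the box in the real-time image
    coordinate system whose origin is the lower-left corner."""
    x, y, w, h = bbox
    centre_x = x + w / 2.0
    centre_y_topdown = y + h / 2.0
    centre_y = image_height - centre_y_topdown   # flip so the origin is at the lower-left
    return centre_x, centre_y

def draw_lock(display, bbox, image_height, target_label="x"):
    # Show the battle target display icon inside the lock icon at the display position.
    pos = detection_to_display_position(bbox, image_height)
    display.draw_lock_icon(center=pos, size=max(bbox[2], bbox[3]), color="red")
    display.draw_label(center=pos, text=target_label)
```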
  • The above-described object detection and recognition of the battle target may also be performed by the processor or the camera module of the UAV.
  • In this case, the calibration information of the UAV is pre-configured and stored on the UAV, and the UAV battle control device transmits the UAV model of the battle target to its own UAV after determining it.
  • When the UAV obtains real-time images through its camera module, it performs object detection and recognition on the real-time image information according to the fuselage model of the battle target and the pre-stored UAV calibration information, obtains the battle target detection icon information, and then sends the real-time image and the battle target detection icon back to the UAV battle control device through the UAV's image transmission module.
  • The processor of the drone battle control device then receives the detection icon of the battle target and, upon receiving it, simulates the locking of the battle target.
  • the above-mentioned method of performing object detection and identification on real-time image information to determine the combat target may not be limited to the method of object detection and identification based on the drone model of the combat target and pre-stored drone calibration information.
  • The detection and recognition can also be based on the positioning information and flight information of the battle target's UAV, such as its flight speed, flight altitude, and latitude and longitude, to determine the battle target. For example, according to the flight altitude and latitude and longitude of the UAV, an object matching these characteristics can be found at the corresponding position in the real-time image, and the object matching the position characteristics is then further verified according to the flight speed before being determined as the battle target.
  • The flight speed, flight altitude, and latitude and longitude information of the UAV can be transmitted to the battle target in real time as part of the first battle parameter information, so that the UAV battle control device can learn the flight speed, altitude, and latitude and longitude of the battle target from the second battle parameter information.
  • The method of object detection and recognition based on the UAV model of the battle target and the pre-stored UAV calibration information can also be combined with the method of determining the battle target from the UAV's flight speed, flight altitude, and latitude and longitude, so as to further improve the recognition accuracy of the battle target.
  • The recognition strategy may also be selected according to the positional distance between the battle target and the own UAV: for example, for UAVs at a relatively short distance (such as within 500 meters), the method of object detection and recognition based on the UAV model of the target and the pre-stored UAV calibration information is used to determine the target, while for long-distance UAVs (such as more than 500 meters away), battle target detection is realized by determining the battle target from the UAV's flight speed, flight altitude, and latitude and longitude information.
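  • The distance-based choice of recognition strategy described above could, for example, be expressed as follows, reusing the offset helpers sketched earlier; the 500-metre threshold is the example value from the preceding paragraph, and the two detector callables are placeholders:

```python
SHORT_RANGE_M = 500.0  # example threshold from the description above

def detect_battle_target(frame, target_params, own_params, appearance_detector, telemetry_locator):
    """Pick a recognition strategy by distance: appearance-based detection (UAV model plus
    pre-stored calibration photos) at short range, telemetry-based localisation
    (speed / altitude / latitude-longitude) at long range."""
    east, north, _ = target_offset(own_params.latitude, own_params.longitude, own_params.altitude_m,
                                   target_params.latitude, target_params.longitude, target_params.altitude_m)
    _, horizontal_range = azimuth_and_range(east, north)

    if horizontal_range <= SHORT_RANGE_M:
        return appearance_detector(frame, target_params.uav_model)
    return telemetry_locator(frame, target_params)
```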
  • the simulated locking of the combat target can be specifically implemented as: outputting and displaying the combat target in a locked state according to the acquired detection icon.
  • The UAV battle control device in the embodiments of the present invention has a display module 10, and the target tracking map mentioned in the above embodiments and the battle target displayed in a locked state are both output and displayed on the display module 10 of the drone battle control device.
  • the display module 10 may be a display screen or a projected display interface, etc., according to the product design features of the drone battle control device, which is not limited in this embodiment of the present invention.
  • FIG. 5 shows a display effect of simulated locking of an implementation.
  • As shown in FIG. 5, a battle target display icon in a locked state is shown at the corresponding position of the displayed real-time shooting screen, wherein the locked state in the figure is identified by a specific color such as red, and the battle target is indicated inside the lock frame by a display icon such as "x".
  • FIG. 5 further presents, in the lower right corner of the display interface, a virtual radar map with the own UAV position as the center origin, so that from the displayed output the user can clearly see the azimuth difference between the own drone and the battle target as well as the locked state of the battle target.
  • When the drone battle control device outputs and displays the battle target in a locked state, it also outputs the locked-state information of the battle target to the battle target; further, in response to received locked information, a locked prompt message is output. For example, by adding state information indicating that the battle target has been locked into the first battle parameter information, the locked state of the battle target can be transmitted to the battle target in real time.
  • the drone battle control device can determine whether its own drone is locked according to the received second battle parameter information.
  • Outputting the locked prompt information reminds the user, so that the user can perform corresponding control in time to evade the opponent's pursuit, improving the fun of the battle experience.
  • the output locked prompt information may be a sound prompt, such as playing a prompt voice of "locked” or playing a prompt sound of "DiDi".
  • The output locked prompt may also be a screen-flashing prompt or a flight-control suggestion prompt, such as displaying "flip to avoid" on the display screen, which is not limited in this embodiment of the present invention.
  • FIG. 6 schematically shows the flow of the UAV battle method according to an embodiment of the present invention.
  • the execution body of the battle method is the processor 30 of the UAV battle control device shown in FIG. 1 .
  • the method of this embodiment further includes:
  • Step S205: Monitor the simulated locking duration of the battle target, and simulate destruction of the battle target according to the simulated locking duration.
  • the destruction process of the UAV is simulated by monitoring the simulated locking duration.
  • The simulated destruction process may be implemented in two stages: exemplarily, the first stage confirms whether the battle target is in a launch-ready state, and the second stage confirms whether it has been destroyed.
  • In the first stage, timing of the locked state is started to obtain a first locking duration, and whether the battle target is in the launch-ready state is determined according to the first locking duration, for example according to whether the first locking duration of the battle target exceeds a first preset value such as 1 s.
  • When the battle target is determined to be in the launch-ready state, a prompt message, i.e., a destroy trigger prompt, is output.
  • The output destroy trigger prompt can be a sound prompt, such as playing a prompt voice indicating that the target is locked and can be fired upon, or playing a prompt sound such as "beep beep", to indicate that destruction can be triggered.
  • The output destroy trigger prompt can also be a screen-flashing prompt, or a guiding prompt on the input module that triggers launch and destruction, such as a flashing light on the input button or a marker box displayed in a specific color; this embodiment of the present invention does not limit this.
  • The above-mentioned UAV battle control device may also include a first input module for triggering a user instruction to launch and destroy; for example, the first input module may be a virtual button or other menu option on the display interface available for user input, or a physical key communicatively connected to the processor.
  • the simulation of the destruction process can be performed according to the user's feedback to the destruction trigger prompt of the first stage.
  • When the battle target is in the launch-ready state and the user triggers launch and destruction using a shortcut button on the remote controller, a menu option in the app loaded on the mobile phone, or a menu option on the display interface of the glasses, timing of the continuous locked state of the battle target is started, so as to determine a second locking duration of the battle target.
  • Whether the battle target is destroyed may be determined according to the second locking duration of the battle target, e.g., by judging whether the second locking duration exceeds a second preset value such as 1 s.
  • If the battle target remains continuously locked for the second preset duration, the destruction state of the battle target is determined to be destroyed.
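  • The two-stage timing described above (a first locking duration for the launch-ready state, then a second locking duration for the destroyed state) could be tracked with a small state machine such as the following sketch; the 1 s thresholds are the example values given above, and the class itself is only an illustration:

```python
import time

class DestructionSimulator:
    """Monitor simulated locking durations and decide launch-ready / destroyed states."""

    def __init__(self, first_threshold_s=1.0, second_threshold_s=1.0):
        self.first_threshold_s = first_threshold_s
        self.second_threshold_s = second_threshold_s
        self.lock_started_at = None       # start of continuous lock (stage one)
        self.launch_triggered_at = None   # user pressed the launch/destroy input (stage two)

    def update(self, target_locked: bool, now: float = None):
        now = time.monotonic() if now is None else now
        if not target_locked:
            # Lock broken: both timers reset.
            self.lock_started_at = None
            self.launch_triggered_at = None
            return "tracking"

        if self.lock_started_at is None:
            self.lock_started_at = now

        if self.launch_triggered_at is not None:
            if now - self.launch_triggered_at >= self.second_threshold_s:
                return "destroyed"
            return "firing"

        if now - self.lock_started_at >= self.first_threshold_s:
            return "launch_ready"   # output the destroy trigger prompt
        return "locking"

    def trigger_launch(self, now: float = None):
        # Called when the user presses the first input module while launch_ready.
        self.launch_triggered_at = time.monotonic() if now is None else now
```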
  • When the destruction state of the battle target is determined to be destroyed, destroyed information may also be output to the battle target to prompt the opponent that its drone has been destroyed; further, in response to received destroyed information, destroyed prompt information may be output.
  • Outputting the destroyed information to the battle target may be done, for example, by adding the destroyed information to the first battle parameter information and sending it to the battle target, so that the other party can determine whether the received second battle parameter information includes the destroyed information and issue a destroyed reminder accordingly.
  • The destroyed prompt information can be displayed on the display module 10 of the battle target's drone battle control device, or a destruction prompt sound can be played on that drone battle control device to inform the user that the drone has been destroyed.
  • When it is confirmed that one party's drone has been destroyed, the winner and loser of the competitive game are decided.
  • The embodiments of the present invention realize the simulation of the destruction process in UAV competitive games by counting and monitoring the simulated locking duration, so that an existing UAV can complete the competitive game simulation process of simulated locking and simulated destruction without modifying its hardware, improving the fun of the drone, enhancing the user's sense of participation in competitive games, and expanding the application scope of the drone.
  • The above-mentioned drone battle control device can also include a second input module for triggering a smoke-interference user instruction.
  • Similarly, the second input module can be a virtual button or other menu option on the display interface available for user input, or a physical key communicatively connected to the processor.
  • the above-mentioned battle method may further include: in response to the received smoke interference user instruction, adding the hidden state of the battle target into the first battle parameter information.
  • the drone battle control device can determine whether the battle target has selected the function of the smoke jamming radar according to the received second battle parameter information.
  • the real-time orientation information of the combat target is hidden in the target tracking map for a preset duration.
  • the hidden state means that the orientation of the combat target is not displayed in the target tracking diagram.
  • the hidden state may further mean that the combat target is not displayed in the locked icon in the locked state.
  • the preset duration can be set according to requirements, for example, set to 3s.
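  • A simple illustrative way to honour the hidden state for the preset duration (3 s in the example above) when the opponent's smoke-interference flag is received is sketched below, using the smoke_hidden field from the earlier parameter sketch; the method names are placeholders, not terms from this disclosure:

```python
import time

HIDE_DURATION_S = 3.0  # preset duration from the example above

class TrackingMapView:
    def __init__(self):
        self.hidden_until = 0.0

    def on_second_parameters(self, params):
        # If the battle target reports smoke interference, hide it for the preset duration.
        if params.smoke_hidden:
            self.hidden_until = time.monotonic() + HIDE_DURATION_S

    def should_draw_target(self) -> bool:
        """The target's real-time orientation is omitted from the tracking map while hidden."""
        return time.monotonic() >= self.hidden_until
```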
  • the user's battle experience can also be improved by constructing a virtual battlefield image output.
  • The construction of the virtual battlefield image can be implemented, for example, by pre-storing virtual battlefield construction elements, such as virtual cannon images or virtual rocket images, and, when outputting the real-time image, overlaying these virtual battlefield construction elements onto the real-time image according to the real-time image coordinate system established above, so as to obtain and output a virtual battlefield image for display.
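  • As an illustration only, compositing pre-stored virtual battlefield elements onto the real-time image could be done with array slicing as sketched below (using the top-left pixel convention of typical image arrays rather than the lower-left origin described above); the element store and anchor positions are assumptions:

```python
import numpy as np

def overlay_element(frame: np.ndarray, element: np.ndarray, x: int, y: int) -> np.ndarray:
    """Paste a pre-stored virtual element (e.g. a virtual cannon sprite) onto the real-time
    frame with its top-left corner at non-negative pixel (x, y), clipping at the border."""
    out = frame.copy()
    h, w = element.shape[:2]
    h = min(h, out.shape[0] - y)
    w = min(w, out.shape[1] - x)
    if h > 0 and w > 0:
        out[y:y + h, x:x + w] = element[:h, :w]
    return out

def build_virtual_battlefield(frame, elements):
    """elements: iterable of (sprite_array, (x, y)) pairs placed onto the real-time image."""
    for sprite, (x, y) in elements:
        frame = overlay_element(frame, sprite, x, y)
    return frame
```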
  • The above-mentioned target tracking map can be output and displayed within the virtual battlefield image.
  • the target tracking diagram includes a lock icon and a battle target display icon
  • the lock icon and the battle target display icon can be displayed on the virtual battlefield image.
  • FIG. 7 schematically shows an unmanned aerial vehicle according to an embodiment of the present invention.
  • The unmanned aerial vehicle in this embodiment at least includes an operation terminal 70 with a display interface, an onboard positioning module 71, a camera module 72, a memory 74 for storing executable instructions, and a processor 75 for executing the executable instructions stored in the memory.
  • the user can perform user input through the operation terminal 70 , and view the presented battle interface through a display interface (eg, a display screen or a projection display interface) of the operation terminal 70 .
  • the UAV cooperates with its processor 75 to execute corresponding executable instructions according to the user's input, so as to implement the application of the UAV in the air combat game and complete the UAV battle game.
  • the operation terminal 70 in this embodiment may be a smart phone, a tablet computer, smart glasses, a remote control, or the like.
  • The user can input a user instruction through the operation terminal 70 to start the game mode of the drone, and input the battle target information (such as user information) through the operation terminal 70 to establish the communication relationship with the battle target.
  • the processor 75 of the UAV performs the following operations:
  • a target tracking graph is output on the display interface of the operation terminal 70 according to the first battle parameter information and the second battle parameter information, wherein the target tracking graph displays real-time azimuth information of the battle target.
  • Establishing the communication relationship with the battle target in this embodiment may mean establishing a communication relationship between the operation terminals 70 of the UAVs and using the operation terminal 70 of the UAV for data transmission; it may also mean that, after receiving the battle target information input by the user through the operation terminal 70, the processor 75 of the UAV calls its communication module according to that information to directly establish a communication relationship with the opposing UAV, so that real-time data can be transferred directly between the UAVs participating in the battle.
  • For the specific manner in which the operation terminal 70 establishes a communication connection, reference may be made to the foregoing method section.
  • The operation terminal of the UAV can also be used to establish a communication relationship between the UAVs participating in the battle through other communication methods; the purpose is simply to enable data communication between the UAVs participating in the battle so that the data mutual-transmission operations can be performed between them. Therefore, the embodiments of the present invention do not limit the method of establishing the communication relationship between the UAVs participating in the battle, as long as data can be transmitted between them.
  • the first battle parameter information includes the position information of the own drone
  • the second battle parameter information includes the position information of the battle target.
  • the processor 75 of the UAV obtains the position information of the UAV through the onboard positioning module 71, and sends the position information to the battle target through the established communication relationship.
  • the onboard positioning module 71 can be, for example, an onboard GPS.
  • The processor 75 of the UAV can mutually transmit the first battle parameter information, including the position information of the UAV, with the battle target through the established communication relationship, so that the UAVs in the battle can obtain the location information of the battle target.
  • The processor 75 of the UAV can then draw and output the target tracking map according to the first battle parameter information and the second battle parameter information, so that the user can see the real-time orientation information of the battle target through the target tracking map, which makes it convenient for the user to purposefully control the flight direction of the own drone, accurately and quickly pursue the target, and avoid blind flight.
  • the target tracking map can be implemented as a virtual radar map.
  • The processor 75 of the UAV can determine the real-time relative position of the battle target based on the position information of the own UAV and the positioning information of the battle target, and draw and output the virtual radar map based on that real-time relative position.
  • A virtual radar map can be drawn and output by taking the position of the own UAV as the center origin and using the position difference between the battle target and the own drone as the coordinate position of the battle target, so as to display on the virtual radar map the relative azimuth difference between the own UAV and the battle target. The user can thus clearly see the azimuth difference between the own drone and the battle target from the virtual radar map; a display effect of the virtual radar map is shown in FIG. 3.
  • the virtual radar chart drawn according to the requirements may not be limited to a two-dimensional image, and may, for example, also be a three-dimensional virtual radar chart.
  • For example, the position information obtained by the onboard positioning module 71 of the UAV includes its longitude and latitude data; the azimuth difference between the two UAVs can be calculated from these data, the longitude and latitude position of the own UAV is converted into the origin coordinates (0, 0), and a circular radar chart is drawn based on the origin coordinates.
  • Based on the azimuth difference between the two, the azimuth coordinates of the battle target in the radar chart can be determined, and the battle target is drawn and displayed at the corresponding position in the radar chart based on those coordinates; the display effect is shown in FIG. 3.
  • the position information obtained by the onboard positioning module 71 of the UAV may also include its own latitude and longitude data and flight altitude data; based on these data, both the azimuth difference and the altitude difference can be calculated, so that a three-dimensional radar chart can be drawn.
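As a minimal sketch of the coordinate conversion just described, assuming a flat-earth (equirectangular) approximation around the own drone; the Earth-radius constant and function name are illustrative assumptions, since the disclosure does not prescribe a particular projection.

    import math

    EARTH_RADIUS_M = 6371000.0

    def relative_position(own_lat, own_lon, tgt_lat, tgt_lon,
                          own_alt=None, tgt_alt=None):
        """Convert the battle target's latitude/longitude (and optional altitude)
        into radar-chart coordinates with the own drone at the origin (0, 0)."""
        d_lat = math.radians(tgt_lat - own_lat)
        d_lon = math.radians(tgt_lon - own_lon)
        # East (x) and north (y) offsets in metres, flat-earth approximation.
        x = d_lon * math.cos(math.radians(own_lat)) * EARTH_RADIUS_M
        y = d_lat * EARTH_RADIUS_M
        if own_alt is None or tgt_alt is None:
            return (x, y)                    # 2-D virtual radar chart
        return (x, y, tgt_alt - own_alt)     # 3-D radar chart with height difference

    # The returned offsets are what gets plotted inside the ring-shaped radar
    # chart, with (0, 0) drawn at the centre as the own drone.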
  • the target tracking map can also be implemented to include a lock icon and a battle target display icon.
  • the position information of the combat drones can be obtained from the onboard positioning module 71 loaded on the drones participating in the combat, and the positioning information is sent to the combat target through the communication relationship with the combat target, so that the participating parties can obtain the location information of the battle target in real time.
  • after learning the position information of the combat target, the processor of the UAV also determines the real-time relative bearing of the combat target according to the position information of its own drone, and outputs the lock icon and the battle target display icon on the display interface of the operation terminal based on that real-time relative bearing.
  • the position information obtained by the onboard positioning module 71 of the UAV includes its own longitude and latitude data, so the real-time relative orientation of the combat target (for example, whether the target is in front of, behind, to the left of or to the right of the own drone) can be obtained from these data.
  • when it is determined that the real-time relative orientation is that the battle target is in front of the drone, the battle target display icon is displayed in front of the lock icon; when it is determined that the real-time relative orientation is that the battle target is to the left of the drone, the display icon of the battle target is displayed to the left of the lock icon.
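A possible way to reduce the exchanged positions to such a front/behind/left/right orientation is sketched below. It assumes the own drone's heading (yaw) is available from the flight controller, which the disclosure does not state explicitly; the quadrant thresholds are likewise illustrative.

    import math

    def relative_orientation(dx_east, dy_north, own_heading_deg):
        """Classify where the battle target sits relative to the own drone.
        dx_east/dy_north: offsets from relative_position(); own_heading_deg:
        own drone heading, 0 = north, clockwise (assumed available from the FC)."""
        bearing = math.degrees(math.atan2(dx_east, dy_north)) % 360.0
        rel = (bearing - own_heading_deg) % 360.0
        if rel < 45 or rel >= 315:
            return "front"
        if 45 <= rel < 135:
            return "right"
        if 135 <= rel < 225:
            return "behind"
        return "left"

    # The battle target display icon is then drawn on the corresponding side of
    # the lock icon, e.g. to the left of it when relative_orientation() == "left".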
  • the lock icon in the embodiment of the present invention refers to an icon used to frame the locked target (the box shown in FIG. 3 is an example of the lock icon).
  • a target object within the range framed by the icon means that the target object is locked, and a target object outside the range framed by the icon means that the target object is not locked;
  • the battle target display icon in the embodiment of the present invention is a display icon used to identify the battle target.
  • the battle target display icon can be displayed at the corresponding position of the lock icon, so that from the icon's display position the user can clearly see the azimuth difference between the own drone and the target, and can therefore control the flight direction of the own drone more purposefully and avoid flying blind.
  • the lock icon and the battle target display icon can also be displayed in the form of a three-dimensional image, and the referenced position information at this time also includes the flight altitude of the aircraft.
  • the processor 75 of the drone also performs the following operations:
  • a tracking direction prompt message is also displayed in the lock icon according to the determined real-time relative orientation of the battle target.
  • the tracking direction prompt message may be text, an arrow, or a combination of text and arrows. Taking an arrow as an example: when it is judged that the real-time relative orientation is that the battle target is behind the own drone, the battle target display icon is displayed behind the lock icon, and an arrow pointing backward is displayed in the lock icon to indicate to the user that the battle target is behind and that the drone should fly backward (for example, by performing a backflip).
  • the processor 75 of the drone also performs the following operations:
  • the lock icon is also displayed in a changed color.
  • for example, the lock icon is displayed in a color different from the one used in the locked state, to remind the user that the battle target is not yet locked.
  • exemplarily, in the locked state the lock icon is displayed in red, while in the tracking state, when the real-time relative orientation is determined to be that the battle target is behind the drone, the display icon of the battle target is displayed behind the lock icon and the lock icon is displayed in yellow.
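A trivial sketch of the icon styling rule just described (red when locked, a different colour such as yellow while only tracking, plus a tracking-direction arrow); the arrow glyphs and the helper name are assumptions.

    ARROWS = {"front": "↑", "right": "→", "behind": "↓", "left": "←"}

    def lock_icon_style(target_locked: bool, orientation: str):
        """Colour of the lock icon and the tracking arrow shown inside it."""
        colour = "red" if target_locked else "yellow"
        arrow = None if target_locked else ARROWS[orientation]
        return colour, arrow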
  • the processor 75 of the drone also performs the following operations:
  • the processor 75 of the UAV can call the camera module 72 (for example, a camera set on the UAV) to capture real-time image information, and the UAV processor 75 or the camera module 72 can perform object detection and recognition on the captured real-time image information.
  • when it is confirmed that the battle target is recognized according to the recognition result, the battle target detection icon is obtained and output on the display interface of the operation terminal 70 in a locked state, so as to simulate the locking process of a drone battle game.
  • the battle target detection icon may specifically include a real-time image coordinate system, the coordinate position of the battle target in the real-time image, and a display icon.
  • a display effect of the simulated locked state can be referred to as shown in FIG. 5 .
  • the object detection and identification of the captured real-time image information by the UAV processor 75 or the camera module 72 can be implemented as follows: a UAV database is calibrated in advance and the UAV calibration information is stored in the database, wherein the calibration information includes the UAV model (which can be a model identifier) and the fuselage shape (for example, a set of photos of the fuselage in different states together with corresponding text calibration information such as whether there is a protective cover, the wing vibration shape, whether there is a propeller, and so on); when real-time image information is captured, object detection is performed on the real-time image based on the calibration information of the UAV database and/or the second battle parameter information to identify and determine the battle target, and a battle target detection icon is generated.
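The following sketch illustrates one way such a calibration lookup and appearance check could be wired together, using OpenCV template matching purely as a placeholder detector; the disclosure does not name a specific detection algorithm, so the library choice, threshold and file names are assumptions.

    import cv2  # assumption: an OpenCV-style template matcher is used for illustration

    # Pre-calibrated database: UAV model -> a set of fuselage-shape templates
    # (different states: with/without propeller guard, wing vibration states, ...).
    CALIBRATION_DB = {
        "model_x": ["model_x_front.png", "model_x_side.png"],   # hypothetical entries
    }

    def detect_battle_target(frame, target_model, threshold=0.7):
        """Return the (x, y, w, h) box of the battle target in the live frame, or None.
        target_model comes from the second battle parameter info."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for path in CALIBRATION_DB.get(target_model, []):
            template = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if template is None:
                continue
            result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
            _, score, _, top_left = cv2.minMaxLoc(result)
            if score >= threshold:
                h, w = template.shape
                return (top_left[0], top_left[1], w, h)
        return None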
  • based on the UAV model of the combat target, the fuselage shape of the combat target can be learned from the pre-calibrated and stored calibration information. In this way, object detection and recognition can be performed on the real-time image according to the obtained fuselage shape, so as to determine whether the image includes the combat target.
  • the calibration information of the UAV database can be stored on the UAV (for example, stored in the memory), or it can be stored in the cloud server, and the calibration information in the database can be obtained by connecting the cloud server through the Internet for object detection and identification.
  • the process of generating the combat target detection icon on the drone and outputting and displaying it in the locked state on the display interface of the operation terminal 70 may be: a real-time image coordinate system is established from the pixels of the real-time image.
  • for example, assuming the image includes 3*3 pixels, the coordinate system is established from the pixel matrix with the lower left corner as the origin and one pixel as the basic coordinate measurement unit; according to the location of the battle target detected in the real-time image by object detection, that is, the corresponding pixel matrix position, the coordinate position of the battle target in the real-time image coordinate system is determined, that coordinate position is used as the display position of the battle target display icon, and the real-time image together with the detection icon information is returned to the operation terminal through image transmission.
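Continuing the pixel-coordinate example, the sketch below converts a detection box given in the usual top-left-origin pixel coordinates into the lower-left-origin coordinate system described here and packages the battle target detection icon payload; all names are illustrative.

    def to_display_coords(box, frame_height):
        """Convert a (x, y, w, h) detection box with a top-left pixel origin into
        the lower-left-origin coordinate system built from the frame's pixels."""
        x, y, w, h = box
        cx = x + w / 2.0
        cy_from_top = y + h / 2.0
        cy = frame_height - 1 - cy_from_top   # flip the vertical axis
        return cx, cy

    def build_detection_icon(box, frame_width, frame_height, icon="x"):
        """Payload returned with the live image so the operation terminal can draw
        the battle target display icon inside the lock icon at this position."""
        cx, cy = to_display_coords(box, frame_height)
        return {
            "coord_system": {"origin": "lower-left", "width": frame_width,
                             "height": frame_height},
            "target_position": (cx, cy),
            "display_icon": icon,
            "locked": True,
        }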
  • outputting and displaying the display icon of the combat target at the display position in a locked state refers to displaying the display icon of the combat target in the locked icon at the display position.
  • the implementation by which the processor 75 of the UAV performs object detection and recognition on the real-time image to determine the combat target is not limited to the above-mentioned method based on the combat target's UAV model and the pre-stored UAV calibration information;
  • instead, the position of the battle target can be determined from the UAV's positioning information and flight information, such as its flight speed, flight altitude and latitude and longitude, and
  • the drone that matches that position and flight information is determined as the battle target.
  • for example, the processor 75 of the drone can find a candidate object at the corresponding position of the real-time image according to the flight altitude and latitude and longitude information of the battle target, and then use the flight speed to further determine whether the object that matches the position is the battle target. The flight speed, flight altitude and latitude and longitude information of a UAV can be transmitted to the battle target in real time as part of the first battle parameter information, so the processor 75 of the UAV can learn the battle target's flight speed, flight altitude and latitude and longitude from the second battle parameter information.
  • the processor 75 of the UAV can also combine the method of object detection and identification based on the combat target's UAV model and the pre-stored UAV calibration information with the method of determining the battle target from its flight speed, flight altitude and longitude and latitude information, so as to further improve the recognition accuracy of the battle target.
  • the processor 75 of the UAV can also execute different identification strategies according to the distance between the detected target and the own UAV. For example, for a UAV at a relatively short distance (such as within 500 meters), the processor 75 adopts the method of object detection and identification based on the combat target's UAV model and the pre-stored UAV calibration information to determine the battle target; for a UAV at a longer distance (such as beyond 500 meters), the processor 75 determines the battle target based on the UAV's flight speed, flight altitude, and longitude and latitude information.
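A sketch of that distance-gated strategy selection is shown below; the 500 m threshold is the example given in the text, while the function signatures are assumptions.

    import math

    NEAR_RANGE_M = 500.0  # example threshold used in the text

    def identify_battle_target(frame, own_params, peer_params,
                               calibration_detect, flight_data_match):
        """Choose the recognition strategy by range; the two strategy callables are
        the appearance-based and flight-data-based detectors sketched earlier."""
        # Rough horizontal distance from the exchanged lat/long (flat-earth approx.).
        d_lat = math.radians(peer_params.latitude - own_params.latitude)
        d_lon = math.radians(peer_params.longitude - own_params.longitude)
        dx = d_lon * math.cos(math.radians(own_params.latitude)) * 6371000.0
        dy = d_lat * 6371000.0
        if math.hypot(dx, dy) <= NEAR_RANGE_M:
            # Within ~500 m: detect by UAV model + pre-stored calibration info.
            return calibration_detect(frame, peer_params.model)
        # Beyond ~500 m: fall back to matching reported speed/altitude/position.
        return flight_data_match(frame, peer_params)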
  • the processor 75 of the drone also performs the following operations:
  • the locked information is also outputted to the combat target.
  • the locked state information of the combat target may be added to the first combat parameter information, so as to transmit the locked information of the combat target to the combat target in real time.
  • the drone can determine whether its own drone has been locked according to the received second battle parameter information.
  • the processor 75 of the drone also performs the following operations:
  • a locked prompt message is output.
  • the output locked prompt information may be a sound prompt, such as playing a prompt voice of "locked" or playing a "beep beep" prompt sound.
  • the output locked prompt may also be a screen flashing prompt on the operating terminal, or a flight control suggestion prompt, such as displaying a prompt message like "dodge with a backflip?" on the display interface of the operating terminal; the embodiments of the invention do not limit this.
  • the processor 75 of the drone also performs the following operations:
  • the processor 75 of the drone may implement the simulated destruction process in two stages.
  • the first stage is implemented to confirm whether it is a launchable destroy state
  • the second stage is implemented to confirm whether it is a destroyed state.
  • after the drone processor 75 outputs and displays the combat target in the locked state according to the acquired detection icon, it starts to time the locked state to obtain a first locking duration,
  • and determines according to the first locking duration of the combat target whether the combat target is in a launchable-and-destroyable state. Exemplarily, this is determined according to whether the first locking duration exceeds a first preset value, such as 1 s; when the first locking duration exceeds the first preset value, the launchable-and-destroyable state is confirmed.
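A minimal sketch of this first-stage timing, assuming a 1 s first preset value as in the example; the class and method names are not from the disclosure.

    import time

    FIRST_PRESET_S = 1.0   # example value from the text

    class LockTimer:
        """Times how long the battle target has stayed inside the lock icon."""
        def __init__(self):
            self.lock_start = None

        def update(self, target_locked: bool) -> bool:
            """Call once per frame; returns True when the launchable-and-destroyable
            state is reached (i.e. the destruction trigger prompt should be output)."""
            if not target_locked:
                self.lock_start = None
                return False
            if self.lock_start is None:
                self.lock_start = time.monotonic()
            first_lock_duration = time.monotonic() - self.lock_start
            return first_lock_duration >= FIRST_PRESET_S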
  • when the combat target is determined to be in a launchable-and-destroyable state, the user can be reminded by outputting a prompt message, that is, a destruction trigger prompt.
  • exemplarily, the operation terminal 70 includes an audio module and a speaker,
  • and the output destruction trigger prompt can be a sound prompt, such as playing a prompt voice of "target locked, attack" or playing a "beep beep" prompt sound.
  • the destruction trigger prompt may also be a screen flashing prompt on the display interface of the operation terminal 70.
  • the operation terminal 70 includes a first input terminal for receiving user input instructions
  • the output destruction trigger prompt can also be a guiding prompt on the first input terminal that can trigger the launch-and-destroy, such as a flashing light on the input button or a marker frame displayed in a specific color, which is not limited in this embodiment of the present invention.
  • the processor 75 of the UAV may also receive a user instruction input by the user to trigger destruction through the first input terminal, for example,
  • the first input terminal may be a virtual button on the display interface of the operation terminal 70 or other menu options available for user input, or may be a key on the operation terminal 70 .
  • the first input terminal may not be limited to being provided on the operation terminal, but may also be a separate input terminal.
  • for example, the operation terminal 70 is a pair of glasses communicatively connected to the drone processor 75, and the first input terminal is a remote control button communicatively connected to the drone processor 75.
  • the simulation of the destruction process can be performed according to the user's feedback to the destruction trigger prompt of the first stage.
  • after determining that the combat target is in a launchable-and-destroyable state, the processor 75 of the UAV responds to a received user instruction for launching destruction (for example, the user triggers launch-and-destroy through a shortcut button on the remote control, a menu option in the mobile phone app, or a menu option on the display interface of the glasses) and starts to time the continuous locked state of the battle target, so as to determine a second locking duration of the battle target.
  • whether the combat target is destroyed is then determined according to the second locking duration, e.g. by judging whether the second locking duration exceeds a second preset value such as 1 s.
  • if, after the destroy command is launched, the combat target is still locked for the duration of the second preset value, the destruction state of the combat target is determined to be the destroyed state.
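And a corresponding sketch of the second stage, again assuming the 1 s preset value from the example: after the user's launch-and-destroy command, the continued lock is timed, and on success the destroyed flag is set in the outgoing battle parameter information. This is an illustrative flow, not the only possible implementation.

    import time

    SECOND_PRESET_S = 1.0   # example value from the text

    class DestroySimulator:
        """Second stage of the simulated destruction process."""
        def __init__(self):
            self.fire_time = None

        def on_fire_command(self):
            """User triggered launch-and-destroy via the first input terminal."""
            self.fire_time = time.monotonic()

        def update(self, target_locked: bool, outgoing_params) -> bool:
            """Returns True once the battle target counts as destroyed."""
            if self.fire_time is None:
                return False
            if not target_locked:
                self.fire_time = None          # lock was lost, the "shot" missed
                return False
            if time.monotonic() - self.fire_time >= SECOND_PRESET_S:
                outgoing_params.destroyed = True   # tell the peer it was destroyed
                return True
            return False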
  • when it is determined that the destruction state of the combat target is the destroyed state, the processor 75 of the UAV can also output destroyed information to the combat target, to inform the opposing party that its drone has been destroyed. Further, when a processor 75 receives such destroyed information, it can output a destroyed prompt, for example a prompt message on the operation terminal 70.
  • outputting the destroyed information to the battle target may be implemented, for example, by adding the destroyed information to the first battle parameter information and sending it to the battle target, so that whether the received second battle parameter information includes destroyed information can be judged and a reminder that the own drone has been destroyed can be given.
  • the embodiment of the present invention realizes the simulation of the locking process in the drone battle game by displaying and outputting the battle target detection icon on the operation terminal in the locked state.
  • the simulation of the destruction process in the UAV competitive game is realized, so that the existing UAV can realize the simulated locking and simulated destruction without modifying the hardware equipment.
  • the processor 75 of the drone also performs the following operations:
  • the concealment state of the combat target is added to the first combat parameter information.
  • the UAV can judge whether the battle target has selected the function of the smoke jamming radar according to the received second battle parameter information.
  • the processor 75 of the drone also performs the following operations:
  • when it is judged that the received second battle parameter information includes the hidden state, the real-time orientation information of the battle target is hidden in the target tracking map for a preset duration.
  • the hidden state means that the orientation of the battle target is not displayed in the target tracking graph or the battle target is not displayed in the lock icon in the locked state, and the preset time period can be set according to requirements, for example, 3s.
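A small sketch of how the receiving side could honour the hidden state, assuming the 3 s preset duration from the example; names are illustrative.

    import time

    HIDE_DURATION_S = 3.0   # example preset duration from the text

    class SmokeHide:
        def __init__(self):
            self.hide_until = 0.0

        def on_params(self, peer_params):
            """Call whenever second battle parameter info is received."""
            if getattr(peer_params, "hidden", False):
                self.hide_until = time.monotonic() + HIDE_DURATION_S

        def target_visible(self) -> bool:
            """False while the target's bearing must be hidden on the tracking map."""
            return time.monotonic() >= self.hide_until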
  • by providing the smoke interference function, users can be given special combat skills, such as a hiding skill, for air combat, making the combat experience more interesting and challenging.
  • the drone can also be implemented to include a second input terminal for triggering the smoke-interference user instruction, where the second input terminal is, for example, an input button or a menu option provided on the display interface of the operation terminal.
  • the second input terminal may be the same as the above-mentioned first input terminal, or may be a different input terminal.
  • the processor 75 of the drone also performs the following operations:
  • a virtual battlefield image is constructed and output on the operation terminal, and the target tracking image is output and displayed in the virtual battlefield image or at a preset position of the virtual battlefield image.
  • the user's battle experience can be improved by constructing a virtual battlefield image output.
  • the construction of the virtual battlefield image can be realized, for example, by pre-storing virtual battlefield construction elements, such as a virtual cannon image or a virtual rocket image, and, when outputting the real-time image, adding and displaying these elements in the real-time image according to the real-time image coordinate system established above, so as to obtain a virtual presentation image for output and display.
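One way the compositing could be done is sketched below, using OpenCV/NumPy alpha blending as an assumed mechanism; the disclosure only says that pre-stored elements are added into the real-time image according to the image coordinate system, and the element is assumed to fit entirely inside the frame.

    import cv2
    import numpy as np

    def overlay_element(frame, element_bgra, top_left):
        """Alpha-blend one pre-stored virtual battlefield element (BGRA image)
        onto the live BGR frame at the given pixel position."""
        x, y = top_left
        h, w = element_bgra.shape[:2]
        roi = frame[y:y + h, x:x + w]
        alpha = element_bgra[:, :, 3:4].astype(np.float32) / 255.0
        blended = alpha * element_bgra[:, :, :3] + (1.0 - alpha) * roi
        frame[y:y + h, x:x + w] = blended.astype(np.uint8)
        return frame

    def build_virtual_battlefield(frame, elements):
        """elements: list of (BGRA image, (x, y)) pairs, e.g. a virtual cannon in
        one corner and a virtual rocket near the detected target position."""
        for img, pos in elements:
            frame = overlay_element(frame, img, pos)
        return frame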
  • the above-mentioned target tracking diagram can be output and displayed in the virtual battlefield image.
  • for example, in an implementation in which the target tracking diagram includes a lock icon and a battle target display icon,
  • the lock icon and the battle target display icon can be displayed on the virtual battlefield image
  • or on the real-time image; alternatively, the target tracking map can be output and displayed at a preset position of the virtual battlefield image, for example in the lower left corner of the virtual battlefield image or real-time image, or in the lower right corner as shown in FIG. 3 (taking the virtual radar map as an example).
  • embodiments of the present invention provide a non-volatile computer-readable storage medium, in which one or more programs including execution instructions are stored; the execution instructions can be read and executed by an electronic device (including but not limited to a computer, a server, or a network device, etc.) so as to execute the UAV combat method of any one of the above embodiments of the present invention.
  • embodiments of the present invention further provide a computer program product, the computer program product comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to execute the UAV combat method of any one of the above embodiments.
  • embodiments of the present invention further provide an electronic device, which includes: at least one processor, and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to execute the UAV combat method of any of the foregoing embodiments.
  • embodiments of the present invention further provide a storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the UAV combat method of any one of the foregoing embodiments is implemented.
  • FIG. 8 is a schematic diagram of the hardware structure of an electronic device for executing a drone battle method provided by another embodiment of the present application. As shown in FIG. 8, the device includes:
  • one or more processors 610 and a memory 620 (one processor 610 is taken as an example in FIG. 8). The apparatus for performing the drone combat method may further include: an input device 630 and an output device 640.
  • the processor 610, the memory 620, the input device 630 and the output device 640 may be connected by a bus or in other ways, and the connection by a bus is taken as an example in FIG. 8 .
  • the memory 620, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs and modules, such as the program instructions/modules corresponding to the drone combat method in the embodiments of the present application.
  • the processor 610 executes various functional applications and data processing of the server by running the non-volatile software programs, instructions and modules stored in the memory 620, that is, to implement the UAV combat method of the above method embodiments.
  • the memory 620 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the UAV combat method, and the like. Additionally, memory 620 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, memory 620 may optionally include memory located remotely relative to processor 610, and these remote memories may be connected to the drone combat control device via a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the input device 630 may receive input numerical or character information, and generate signals related to user settings and function control of the drone combat control device.
  • the output device 640 may include a display device such as a display screen.
  • the one or more modules are stored in the memory 620, and when executed by the one or more processors 610, execute the drone combat method in any of the above method embodiments.
  • the above product can execute the method provided by the embodiments of the present application, and has the functional modules and beneficial effects corresponding to the executed method. For technical details not described in detail in this embodiment, reference may be made to the method provided by the embodiments of the present application.
  • the electronic devices of the embodiments of the present application exist in various forms, including but not limited to:
  • (1) Mobile communication equipment: this type of equipment is characterized by having mobile communication functions, and its main goal is to provide voice and data communication.
  • Such terminals include: smart phones (eg iPhone), multimedia phones, feature phones, and low-end phones.
  • (2) Ultra-mobile personal computer equipment: this type of equipment belongs to the category of personal computers, has computing and processing functions, and generally also has mobile Internet access.
  • Such terminals include: PDAs, MIDs, and UMPC devices, such as iPads.
  • (3) Portable entertainment equipment: this type of equipment can display and play multimedia content.
  • Such devices include: audio and video players (eg iPod), handheld game consoles, e-books, as well as smart toys and portable car navigation devices.
  • (4) Server: a device that provides computing services; the composition of the server includes a processor, a hard disk, memory, a system bus, etc.
  • the server is similar to a general-purpose computer architecture, but because it needs to provide highly reliable services, it has higher requirements in terms of processing power, stability, reliability, security, scalability and manageability.
  • (5) Other electronic devices with data interaction functions.
  • the device embodiments described above are only illustrative, wherein the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each embodiment can be implemented by means of software plus a general hardware platform, and certainly can also be implemented by hardware.
  • the above technical solutions, in essence or in the part contributing to the related art, can be embodied in the form of a software product; the computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the methods described in the various embodiments or in some parts of the embodiments.

Abstract

A drone battle method, a drone battle control device and a drone, suitable for scenarios in which drones are used for air-combat competitive games. The method comprises: acquiring first battle parameter information in real time, and transmitting the first battle parameter information to a battle target in real time (S201); receiving second battle parameter information from the battle target in real time (S202); and generating and displaying a target tracking map according to the first battle parameter information and the second battle parameter information, wherein the target tracking map displays real-time bearing information of the battle target (S203). The device and method make it possible to use a drone for competitive games without any hardware modification of the existing drone, and provide flight guidance to the user during the game, so that the user does not need to judge and control the flight direction by direct visual observation, avoiding blind flight; at the same time, they can increase the user's sense of participation when using drones for competitive games and make the games more interesting.

Description

无人机对战方法、无人机对战控制装置、无人机及存储介质 技术领域
本发明涉及无人机竞技游戏技术领域,尤其涉及一种无人机对战方法、无人机对战控制装置、以及无人机和存储介质,其中,无人机对战控制装置可以为对战控制用终端设备,如智能眼镜、智能手机、平板或遥控器等。
背景技术
目前的模拟空战游戏大多是通过电脑游戏来实现,这种模拟空战游戏虽然能够比较真实的模拟战斗机的特性及武器发射等过程,但缺少了真实环境体验的乐趣。
为了提高游戏的真实性和对抗性,一种以自制的航模无人机作为空战游戏参与对象的航空模型运动得以快速发展。这种航空模型运动允许操作者通过遥控器控制航模无人机飞行来体验真实的“飞行”乐趣,并且发明者还通过在航模无人机中设置类似可发射弹丸的装置来提高“战斗感”。但,由于航模无人机只是为了游戏娱乐而制造,其功能相对简单,在游戏过程中还需要地面操作人员通过目视来判断地方战机方位和追击敌方战机,很容易产生盲追现象,用户体验感差。
此外,由于可发射弹丸这种真实的对抗性,使得航模无人机在模拟空战游戏中也容易损坏,造成极大的资源浪费。
发明内容
本发明实施例提供一种无人机对战方案,采用利用已有无人机,在不需要加装硬件的情况下,将无人机应用在游戏竞技领域的技术构思,拓宽无人机的应用领域和使用范围,提升无人机的趣味性和受众数量。
第一方面,本发明实施例提供一种无人机对战方法,适用于用无人机进行空战竞技游戏的场景,该方法包括:
实时获取第一对战参数信息,并向对战目标实时传输第一对战参数信息;
实时接收来自对战目标的第二对战参数信息;
根据第一对战参数信息和第二对战参数信息生成目标追踪图输出展示,其中,目标追踪图中显示有对战目标的实时方位信息。
第二方面,本发明实施例提供一种无人机对战控制装置,用于控制无人机进行空战竞技游戏,其包括:
显示模块;
存储器,用于存储可执行指令;以及
处理器,用于执行存储器中存储的可执行指令,所述可执行指令在由所述处理器执行时使得所述处理器:
实时获取第一对战参数信息,并向对战目标实时传输第一对战参数信息;
实时接收来自对战目标的第二对战参数信息;
根据第一对战参数信息和第二对战参数信息生成目标追踪图在所述显示模块输出显示,其中,目标追踪图中显示有对战目标的实时方位信息。
第三方面,本发明实施例提供了一种无人机,能够用于空战竞技游戏,其包括:
具有显示界面的操作终端;
机载定位模块;
存储器,用于存储可执行指令;以及
处理器,用于执行存储器中存储的可执行指令,所述可执行指令在由所述处理器执行时使得所述处理器:
实时获取第一对战参数信息,并向对战目标实时传输第一对战参数信息;
实时接收来自对战目标的第二对战参数信息;
根据第一对战参数信息和第二对战参数信息在操作终端输出目标追踪图,其中,目标追踪图中显示有对战目标的实时方位信息。
第四方面,本发明实施例提供了一种电子设备,其包括:至少一个处理器,以及与至少一个处理器通信连接的存储器,其中,存储器存储有可 被至少一个处理器执行的指令,指令被至少一个处理器执行,以使至少一个处理器能够执行上述方法的步骤。
第五方面,本发明提供了一种存储介质,其上存储有计算机程序,该程序被处理器执行时实现上述方法的步骤。
本发明实施例的有益效果在于:根据本发明提供的方案能够通过与对战目标互传对战参数信息将现有的无人机设备带入到竞技游戏模式,以在互传对战参数的对战参与无人机之间展开竞技游戏功能,无需对现有无人机进行任何硬件的改装;并且本发明的方案还能够基于对方和自身的对战参数信息来生成无人机虚拟目标追踪图输出,通过在虚拟目标追踪图上展示自身无人机与对战目标之间的方位差,能够为用户提供飞行导向,使得用户不需要通过直接目视来进行判断和控制飞行方向,避免盲飞;进一步地,通过输出展示虚拟目标追踪图的方式来实时向用户展示对战目标的方位,也能够提高用户利用无人机进行竞技游戏的参与感,提升游戏的趣味性。
附图说明
为了更清楚地说明本发明实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本发明一实施方式的无人机对战控制装置的原理框图;
图2为本发明一实施方式的无人机对战方法的流程图;
图3为本发明一实施方式的虚拟雷达展示效果图;
图4为本发明另一实施方式的无人机对战方法的流程图;
图5为本发明一实施方式的无人机对战模拟锁定状态展示效果图;
图6为本发明又一实施方式的无人机对战方法的流程图;
图7为本发明一实施方式的无人机的原理框图;
图8为本发明的电子设备的一实施例的结构示意图。
具体实施方式
为使本发明实施例的目的、技术方案和优点更加清楚,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
需要说明的是,在不冲突的情况下,本申请中的实施例及实施例中的特征可以相互组合。
本发明可以在由计算机执行的计算机可执行指令的一般上下文中描述,例如程序模块。一般地,程序模块包括执行特定任务或实现特定抽象数据类型的例程、程序、对象、元件、数据结构等等。也可以在分布式计算环境中实践本发明,在这些分布式计算环境中,由通过通信网络而被连接的远程处理设备来执行任务。在分布式计算环境中,程序模块可以位于包括存储设备在内的本地和远程计算机存储介质中。
在本发明中,“模块”、“装置”、“系统”等指应用于计算机的相关实体,如硬件、硬件和软件的组合、软件或执行中的软件等。详细地说,例如,元件可以、但不限于是运行于处理器的过程、处理器、对象、可执行元件、执行线程、程序和/或计算机。还有,运行于服务器上的应用程序或脚本程序、服务器都可以是元件。一个或多个元件可在执行的过程和/或线程中,并且元件可以在一台计算机上本地化和/或分布在两台或多台计算机之间,并可以由各种计算机可读介质运行。元件还可以根据具有一个或多个数据包的信号,例如,来自一个与本地系统、分布式系统中另一元件交互的,和/或在因特网的网络通过信号与其它系统交互的数据的信号通过本地和/或远程过程来进行通信。
最后,还需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”,不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
本发明实施例中的无人机对战方法及装置可以用于控制无人机进行竞技游戏的场合,其可以配置于终端设备上,该终端设备上配置有显示屏或者该终端设备能够投影出显示界面用于展示相应的画面和与用户进行交互操作,从而实现利用终端设备控制无人机进行竞技游戏,这些终端设备例如可以是,智能手机、平板电脑、PC、智能眼镜、远程遥控器等任何智能硬件;当然,本发明实施例中的无人机对战方法及装置也可以直接配置于无人机上,通过无人机上的处理器和无人机配套的操作终端来进行无人机竞技游戏的对战控制操作,本发明对此不作限定。
图1示意性地显示了根据本发明一实施方式的无人机对战控制装置,如图1所示,本实施例中的控制装置至少包括显示模块10,用于存储可执行指令的存储器20,和分别与显示模块10、存储器20通信连接、用于执行存储器中存储的可执行指令的处理器30。在具体应用中,该无人机对战控制装置可以是智能手机、平板电脑、智能眼镜或遥控器等,以通过处理器30来执行相应的可执行指令实现对无人机在空战竞技游戏中的对战控制。其中,可执行指令具体可以是下文的无人机对战方法对应的程序指令。
图2示意性地显示了根据本发明一实施方式的无人机对战方法流程,本实施例中,对战方法的执行主体为图1所示的无人机对战控制装置的处理器30。如图2所示,本实施例的方法包括如下步骤:
步骤S201:实时获取第一对战参数信息,并向对战目标实时传输第一对战参数信息。
步骤S202:实时接收来自对战目标的第二对战参数信息。
步骤S203:根据第一对战参数信息和第二对战参数信息生成目标追踪图输出展示,其中,目标追踪图中显示有对战目标的实时方位信息。
在实际应用中,无人机一般通过与配对的操作终端建立通信关系,以使得用户能够通过操作终端来实现对相应无人机的控制,例如对飞行方向的操作控制。本发明实施例的无人机对战控制装置可以为采用该现有技术与相应的无人机建立通信连接关系,以实现对其的飞行控制和获取无人机通过图传功能回传的数据,例如实时图像。除此之外,在本发明实施例中,还通过用于控制无人机的无人机对战控制装置之间的通信连接关系,实现 对战目标的确定。即本发明实施例预先在参与对战的无人机对战控制装置之间建立有通信关系,并基于该通信关系在其之间进行数据互传,包括在步骤S201中,从自身无人机获取第一对战参数信息发送给对战目标;以及在步骤S202中,接收对战目标发送来的第二对战参数信息。其中,本发明实施例中的第一对战参数信息和第二对战参数信息是为了区分对战参数信息是自身无人机的还是对战目标的,即本发明实施例的第一对战参数信息是指自身无人机的对战参数信息,而第二对战参数信息是指对战目标的对战参数信息,这一命名只是为了区分对应的无人机实体的差异,在实际应用中,第一对战参数信息和第二对战参数信息包含的参数项的数量可以是相同的,也可以是不同的,本发明实施例不对此进行限制。由于本发明实施例中参与对战的无人机对战控制装置之间会进行实时地数据互传,因而每个参与对战的无人机对战控制装置在向对方发送对战参数信息(即第一对战参数信息)时,也会接收到对方发送来的实时对战参数信息(即第二对战参数信息)。其中,本发明实施例中的实时对战参数信息是指用于指示对战内容的参数项对应的数据信息,包括但不限于无人机的位置信息(如经纬度)、无人机的机型、无人机的飞行信息(如飞行高度、飞行速度)、无人机的特殊对战技能使用信息(如是否发射烟雾干扰指令进行隐藏)等。在具体实现中,也可以将无人机的对战状况信息(如是否锁定、是否摧毁等)通过实时对战参数信息的内容发送给对方,当然也可以根据需要将无人机的对战状况信息实时地直接发送给对方。
具体地,预先在参与对战的无人机对战控制装置之间建立通信关系可以实现为:在需要利用无人机进行竞技游戏时,在进行步骤S201和步骤S202之前,用户可以通过手动操作来建立需要参与竞技的无人机之间的目标匹配关系,以通过目标匹配确定需要对战的目标无人机(本发明实施例以下统称为对战目标)。其中,进行对战目标匹配的方式可以是通过联网匹配的方式,也可以是通过相近频道匹配的方式。
以无人机对战控制装置为智能手机、本实施例的对战方法的执行主体为智能手机的处理器为例,智能手机一般包括有通信模块,因而可以通过智能手机的通信模块来实现联网匹配,以确定对战目标。具体地,利用智能手机的通信模块来实现联网匹配的方式可以是:在相应的用户界面输入 需要进行对战的对战目标的用户信息,例如用户ID、对应的网络信息等,以通过智能手机的网络与该指定用户的智能手机建立通信关系。
以无人机对战控制装置为智能眼镜、本实施例的对战方法的执行主体为智能眼镜的处理器为例,本发明实施例的智能眼镜还包括有支持SIM卡上网的网络模块,因而可以通过智能眼镜的网络模块来实现联网匹配,以确定对战目标。具体地,利用智能眼镜的支持SIM卡功能的网络模块来实现联网匹配的处理过程可以为:在智能眼镜相应的输入操作界面输入需要进行对战的对战目标的用户信息,例如用户ID、对应的网络信息等,以通过SIM卡提供的网络与该指定用户的智能眼镜端建立通信关系。
以无人机对战控制装置为遥控器、本实施例的对战方法的执行主体为遥控器的处理器为例,本发明实施例的遥控器还可以与智能手机进行通信,以利用智能手机的网络信号来实现联网匹配,确定对战目标。具体地,利用遥控器来实现联网匹配的处理过程为:首先将遥控器与智能手机通过蓝牙或WiFi等功能建立连接关系,之后通过遥控器输入需要进行对战的对战目标的用户信息,例如用户ID、对应的网络信息等,以通过手机将该信息发送给指定用户的无人机对战控制装置,从而建立与该指定用户的无人机对战控制装置的通信关系。
以无人机对战控制装置为智能眼镜、本实施例的对战方法的执行主体为智能眼镜的处理器为例,本发明实施例的智能眼镜还可以通过智能眼镜的蓝牙模块来实现相近频道匹配,以确定对战目标。具体地,利用智能眼镜的蓝牙模块来实现相近频道匹配的处理过程为:通过蓝牙建立与指定用户的无人机对战控制装置的通信关系。
继续以无人机对战控制装置为智能眼镜、本实施例的对战方法的执行主体为智能眼镜的处理器为例,本发明实施例的智能眼镜还可以通过智能眼镜之间的物理连接来实现相近频道匹配,以确定对战目标。具体地,利用智能眼镜的物理连接来实现相近频道匹配的处理过程为:将要参与对战的无人机对应的智能眼镜控制端通过物理连线相互连接起来,以通过物理连线建立与相应无人机对战控制装置的通信关系。
在其他实现例中,根据无人机对战控制装置的类型和其提供的可能的通信方式,还可以通过其他的通信方式建立参与对战的无人机对战控制装 置之间的通信关系,目的是使得参与对战的无人机对战控制装置之间能够进行数据通信,以在参与对战的无人机对战控制装置之间进行步骤S201和步骤S202的数据互传操作,因而,本发明实施例对参与对战的无人机对战控制装置之间建立通信关系的方式不加以限制,只要能够使得参与对战的无人机对战控制装置之间能够进行数据互传即可。其中,需要说明的是,由于无人机对战控制装置与相应的无人机之间也需要进行数据传输,而基于本发明实施例的构思,也需要在无人机对战控制装置之间建立通信关系以进行步骤S201和步骤S202的数据互传,因而,为了减少数据传输干扰,参与对战的无人机对战控制装置之间优选采用不同于无人机对战控制装置与无人机之间的通信协议进行数据通信,即如果无人机与无人机对战控制装置之间采用第一通信协议进行通信,那么参与对战的无人机对战控制装置之间优选采用不同于第一通信协议的第二通信协议进行通信。当然,这不视为绝对的限制,参与对战的无人机对战控制装置之间采用第一通信协议进行通信也可以实现本发明实施例的发明目的。
示例性地,在具体实现中,步骤S203中的目标追踪图可以实现为虚拟雷达图,第一对战参数信息中包括有自身无人机的位置信息,第二对战参数信息中包括有对战目标的位置信息。由此,就可以根据第一对战参数信息和第二对战参数信息绘制输出以自身无人机位置为中心显示有对战目标所处的实时相对位置的虚拟雷达图,以使得用户能够实通过虚拟雷达图实时看到对战目标的相对位置,方便其有目的地控制自身无人机的飞行方向,以准确、快速地追击目标,避免盲飞。其中,对战无人机的位置信息可以是根据参与对战的无人机上装载的定位模块来获取,示例性地,对于装载了机载GPS的无人机,该无人机的对战控制装置可以通过其与自身无人机的通信关系从自身无人机的机载GPS获取到该无人机的实时GPS定位信息,并通过与对战目标的对战控制装置的通信关系将该定位信息发送给对战目标。由此,可以使得参战方能够实时获取到对战目标的位置信息。
在获知到对战目标的定位信息之后,就可以根据自身无人机的位置信息来确定对战目标所处的实时相对位置,并基于该实时相对位置来绘制输出虚拟雷达图。具体地,可以通过以自身无人机位置为中心原点,以自身 无人机位置和对战目标位置的方位差为坐标标识,来绘制生成一个虚拟雷达图,并将自身无人机位置显示在虚拟雷达的中心原点位置,而根据对战目标与自身无人机的位置方位差来显示对战目标的实时相对位置,使得用户能够清楚地通过无人机对战控制装置看到对战目标的实时相对位置,从而更有目的地控制自身无人机的飞行方向,避免盲飞。示例性地,通过无人机机载定位模块获取到的位置信息包括自身经纬度数据,那么基于这些经纬度数据就可以计算出二者的方位差,将自身无人机的经纬度位置信息转换为原点坐标(0,0),基于该原点坐标绘制由圆环组成的雷达图,基于二者的方位差就可以确定对战目标在该雷达图中的坐标位置,基于该位置坐标就可以绘制出对战目标在雷达图中的相应显示位置。示例性地,图3展示了一种实现方式的虚拟雷达图的展示效果,如图3所示,在展示的实时拍摄画面的右下角呈现了以自身无人机位置为中心原点的虚拟雷达图,用户根据该虚拟雷达图就可以清楚看到自身无人机与对战目标的方位差。
在其他实现例中,绘制的虚拟雷达图可以不局限为二维图像,示例性地,还可以为三维虚拟雷达图。其实现方式可以为:通过无人机机载定位模块获取到的位置信息包括自身经纬度数据和飞行高度数据,那么基于这些经纬度数据和飞行高度数据就可以计算出二者的方位差和高度差,将自身无人机的经纬度和飞行高度位置信息转换为原点坐标(0,0,0),基于该原点坐标绘制三维雷达图,基于二者的方位差和高度差就可以确定对战目标在该雷达图中的三维坐标,基于该位置坐标就可以绘制出对战目标在三维雷达图中的相应显示位置。
在另一优选实现例中,示例性地,步骤S203中的目标追踪图可以实现为包括锁定图标和对战目标显示图标,第一对战参数信息中包括有自身无人机的位置信息,第二对战参数信息中包括有对战目标的位置信息。其中,同样地,对战无人机的位置信息可以是根据参与对战的无人机上装载的定位模块来获取,并通过与对战目标的对战控制装置的通信关系将该定位信息发送给对战目标,以使得参战方能够实时获取到对战目标的位置信息。
在获知到对战目标的位置信息之后,就可以根据自身无人机的位置信息来确定对战目标所处的实时相对方位,并基于该实时相对方位来输出锁 定图标和对战目标显示图标,其中,对战目标显示图标是在锁定图标的实时相对方位处进行输出展示。示例性地,通过无人机机载定位模块获取到的位置信息包括自身经纬度数据,那么基于这些经纬度数据就可以对战目标所处的实时相对方位,其中,实时相对方位是指用于标识对战目标所处的相对方向的方位标识,例如实时相对方位为对战目标处于自身无人机的前方、后方、左面、右面等方位标识,由此就可以根据该相对方位来显示对战目标显示图标,如当判断实时相对方位为对战目标处于自身无人机的前方时,就将对战目标显示图标显示在锁定图标的前方;当判断实时相对方位为对战目标处于自身无人机的左方时,就将对战目标显示图标显示在锁定图标的左方。其中,需要说明的是,本实施例中的锁定图标是指用于框定锁定目标的图标(如图3中所示的方框即为锁定图标的一种示例),处于图标框定范围内的目标物即代表锁定了该目标物,而处于图标框定范围之外的目标物即代表并未锁定该目标物;本发明实施例中的对战目标显示图标是用于标识对战目标的显示图标,其具体是可以是对战目标的用户名、无人机代号、无人机代码等用于标识对战目标的内容。在根据第一对战参数和第二对战参数确定出对战目标的实时相对方位后,在未锁定状态下,就可以通过将对战目标显示图标显示在锁定图标的相应方位,以使得用户根据该图标显示位置就可以清楚看到自身无人机与对战目标的方位差,从而更有目的地控制自身无人机的飞行方向,避免盲飞。在优选实现例中,锁定图标和对战目标显示图标也可以以三维图像的方式进行显示,此时参考的位置信息还包括飞机的飞行高度。
在另一优选实现例中,在将对战目标显示图标在锁定图标的相对方位进行输出展示时,还根据确定的对战目标所处的实时相对方位,在锁定图标中显示追踪方向提示消息。其中,追踪方向提示消息可以为文字或箭头或文字与箭头的结合。以箭头为例,如当判断实时相对方位为对战目标处于自身无人机的后方时,就将对战目标显示图标显示在锁定图标的后方,同时在锁定图标中显示向后指示的箭头,以指示用户对战目标在其后面,要向后方飞行(例如通过向后空翻)。
在其他实现例中,在锁定图标中显示追踪提示消息的同时,还将锁定图标进行变色显示,例如,将锁定图标以不同于锁定状态的颜色进行显示, 以提醒用户并未锁定对战目标。示例性地,在锁定状态时,将锁定图标以红色显示,而在追踪状态时,如当判断实时相对方位为对战目标处于自身无人机的后方时,就将对战目标显示图标显示在锁定图标的后方,并将锁定图标显示为黄色。
进一步地,在优选实现例中,为了方便对无人机的功能进行区分,以减少对其原有功能的使用干扰,还可以设置用于触发进入游戏模式的操作模块,以使得根据用户进入游戏模式的输入操作,来开启无人机的游戏模式。由此,就可以响应于用于开启游戏模式的用户指令,来启动上述步骤S201至步骤S203的方法过程。
图4示意性地显示了根据本发明一实施方式的无人机对战方法流程,本实施例中,对战方法的执行主体为图1所示的无人机对战控制装置的处理器30。如图4所示,本实施例的方法在图2所示的步骤S203之后还包括
步骤S204:获取对战目标检测图标,将对战目标进行模拟锁定。
其中,对战目标检测图标具体可以为包括实时图像坐标系、对战目标在实时图像中所处的坐标位置和显示图标。在具体实现中,由于无人机包括有摄像模块(如摄像头)和图像回传模块,可以通过无人机来获取实时的摄像画面,并通过图传回传功能将实时图像传回给无人机对战控制装置。
在一种具体实现例中,无人机对战控制装置的处理器30可以基于自身无人机通过图传回传的实时图像来获取对战目标检测图标,将对战目标进行模拟锁定。示例性地可以实现为:
预先进行无人机标定并存储标定信息(例如通过数据库进行标定和存储),其中,无人机机型标定信息包括无人机机型(可以为机型标识)和机身形态(例如为一组不同状态下的机身形态视图照片以及对应的文字标定信息如是否有保护罩、机翼振动形态、是否有螺旋桨等)。在接收到自身无人机传来的拍摄画面时,根据存储的标定信息对自身无人机捕获的实时图像信息(即拍摄画面)进行物体检测识别,根据识别结果确定当前的实时图像中是否包括有对战目标。其中,对战目标的无人机机型可以通过第二对战参数信息获知,即在进行数据互传时,会将自身的无人机机型添 加到第一对战参数信息中传输给对战目标,由此自身无人机的对战控制装置就可以基于接收到的第二对战参数信息获知到对战目标的无人机机型。基于该无人机机型就可以从预先标定存储的标定信息中获知到对战目标的机身形态。由此,就可以根据获取的机身形态对无人机回传的实时图像进行物体检测识别,以确定图像中是否包括对战目标。在检测到对战目标时,生成对战目标检测图标,并将对战目标进行模拟锁定。其中,生成对战检测图标,并将对战目标进行模拟锁定例如可以实现为:根据实时图像的像素点建立实时图像坐标系,如假设实时图像包括有3*3个像素点,就根据其像素点矩阵以左下角作为原点、以像素点为基本坐标度量单位建立坐标系;根据通过物体检测识别检测到的对战目标在实时图像中所处的位置,即对应的像素点矩阵位置,来确定对战目标在实时图像坐标系中的坐标位置,将该坐标位置作为对战目标显示图标的显示位置,并将对战目标的显示图标以锁定状态在该显示位置输出展示。其中,将对战目标的显示图标以锁定状态在该显示位置输出展示是指在该显示位置,将对战目标的显示图标显示在锁定图标中。
在其他实现例中,上述进行对战目标的物体检测识别的具体处理过程也可以是由无人机的处理器或摄像模块或相机模块执行,这种情况下无人机的标定信息是预先配置并存储在无人机上,无人机对战控制装置在确定出对战目标的无人机机型后会将无人机机型传输给自身无人机。自身无人机在通过其摄像模块或相机模块获取到实时图像后,会根据对战目标的机身机型和预先存储的无人机标定信息对实时图像信息进行物体检测识别,在识别到对战目标时,进行实时图像坐标系的构建和确定检测到的对战目标的坐标位置,得到对战目标检测图标信息,之后通过无人机的图传模块向无人机对战控制装置回传实时图像和对战目标检测图标,而无人机对战控制装置的处理器则执行接收对战目标检测图标,并在接收到检测图标时,将对战目标进行模拟锁定。
在优选实现例中,上述的对实时图像信息进行物体检测识别以确定出对战目标的实现方式,还可以不局限于基于对战目标的无人机机型和预先存储的无人机标定信息进行物体检测识别的方式,而是可以根据无人机的定位信息和飞行信息,例如根据无人机的飞行速度、飞行高度和经纬度信 息来确定对战目标的位置,将符合该位置和飞行信息的无人机确定为对战目标,如根据无人机的飞行高度和经纬度信息从实时图像的相应位置来找到符合特点的物体,之后根据飞行速度来进一步判断符合位置特点的物体是否为对战目标。其中,无人机的飞行速度、飞行高度和经纬度信息可以作为第一对战参数信息实时传输给对战目标,由此,无人机控制装置就可以从第二对战参数信息中获知到对战目标的飞行速度、飞行高度和经纬度信息。
在其他实现例中,还可以通过将上述的基于对战目标的无人机机型和预先存储的无人机标定信息进行物体检测识别的方式与根据无人机的飞行速度、飞行高度和经纬度信息来确定对战目标的方式相结合来检测对战目标,以进一步提高对对战目标的识别准确率。
在另外的实现例中,还可以根据对战目标与自身无人机的位置距离来进行识别策略设定。如对较近距离(如500米以内的)的无人机,采用基于对战目标的无人机机型和预先存储的无人机标定信息进行物体检测识别的方式来确定对战目标;而对于较远距离(如500米以外的)的无人机,则采用基于无人机的飞行速度、飞行高度和经纬度信息来确定对战目标的方式实现对战目标检测。
作为一个具体实施方式示例,在接收到检测图标时,将对战目标进行模拟锁定具体可以实现为:根据获取到的检测图标将对战目标以锁定状态输出显示。
其中,如图1所示,本发明实施例中的无人机对战控制装置均具有显示模块10,本发明上述实施例提及的目标追踪图和将对战目标以锁定状态输出显示,均是在无人机对战控制装置的显示模块10上进行输出展示。显示模块10根据无人机对战控制装置的产品设计特点,可以是显示屏,也可以是投影出的显示界面等,本发明实施例不对此进行限制。
示例性地,图5展示了一种实现方式的模拟锁定的展示效果,如图5所示,在展示的实时拍摄画面的相应位置,展示有以锁定状态显示的对战目标的显示图标,其中,图中的锁定状态以特定的颜色如红色来标识,而对战目标在锁定框内以显示图标如“x”来表示。此外,图5中进一步还在展示界面的右下角呈现了以自身无人机位置为中心原点的虚拟雷达图。 由此,用户就可以根据输出展示的虚拟战场图像清楚看到自身无人机与对战目标的方位差、以及对战目标的锁定状态。
在其他实现例中,在将对战目标以锁定状态输出显示时,还包括将对战目标的已被锁定的状态信息输出给对战目标。进一步地,响应于接收到的已被锁定的信息,还输出已被锁定的提示消息。例如通过将对战目标已被锁定的状态信息添加入第一对战参数信息中,以将对战目标的锁定状态实时传送给对战目标。由此,无人机对战控制装置就可以根据接收到的第二对战参数信息判断自身无人机是否被锁定。优选地,在判断接收到的第二对战参数信息中包括锁定状态时,输出已被锁定的提示信息,以对用户进行提醒,这样用户就能够及时进行相应的控制,以及时躲避对方的追击,提升用户对战体验的趣味性。示例性地,输出的已被锁定的提示信息可以为声音提示,如播放“已被锁定”的提示语音或播放“滴滴”的提示音等。在其他实现例中,输出的已被锁定的提示也可以为屏幕闪烁提示,或者飞行控制建议提示,如在显示屏幕上输入“是否空翻躲避”等,本发明实施例不对此进行限制。
本发明实施例通过模拟对战中的锁定过程来实现利用无人机进行竞技游戏,既不需要更改无人机的硬件装备,又能带来真实的对战体验,大幅提高了用户对无人机的应用场景和用户粘性。特别是,本发明实施例基于目标检测来进行模拟锁定,能够精准地识别出对战目标,并将其进行锁定,使得用户的作战体验更真实有趣。图6示意性地显示了根据本发明一实施方式的无人机对战方法流程,本实施例中,对战方法的执行主体为图1所示的无人机对战控制装置的处理器30。如图6所示,本实施例的方法在图4所示的步骤S204之后还包括
步骤S205:监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁。
本发明实施例在进行模拟锁定之后,通过监测模拟锁定时长来模拟无人机的摧毁过程。具体地,可以分两个阶段来实施模拟摧毁过程,示例性地,第一个阶段实现为确认是否为可发射摧毁状态,第二个阶段实现为确认是否已摧毁的状态。
在第一个阶段,在根据获取到的检测图标将对战目标以锁定状态输出 显示之后,就开始对该锁定状态进行计时,以统计得到第一锁定时长,并根据对战目标的第一锁定时长,确定对战目标是否为可发射摧毁状态,示例性地,根据对战目标的第一锁定时长是否超过第一预设值如1s来确定对战目标是否为可发射摧毁状态,当判断第一锁定时长超过第一预设值则认定为可发射摧毁状态。
在确定对战目标为可发射摧毁状态时,可以通过输出提示消息即摧毁触发提示来提醒用户可发射摧毁。示例性地,输出的摧毁触发提示可以为声音提示,如播放“锁定目标,出击”的提示语音或播放“哔哔”的提示音等进行可发射摧毁的触发提示。在其他实现例中,输出的摧毁触发提示也可以为屏幕闪烁提示,或者在可以触发发射摧毁的输入模块上进行指引性提示,如在输入按钮上闪烁灯光或显示有特定颜色的标识框等,本发明实施例不对此进行限制。
在一些实现例中,上述无人机对战控制装置还可以实现为包括用于触发所述用于发射摧毁的用户指令的第一输入模块,示例性地,该第一输入模块可以是显示界面上的虚拟按钮或其他可供用户输入的菜单选项或可以是一个与处理器通信连接的按键。这样,在第二个阶段,就可以根据用户对第一阶段的摧毁触发提示的反馈来进行摧毁过程的模拟。优选地可以为,在确定对战目标为可发射摧毁状态之后,响应于接收到的用于发射摧毁的用户指令,如用户通过遥控器的快捷按钮或手机上装载的app端的菜单选项或眼镜端的显示界面上的菜单选项触发发射摧毁之后,就开始对对战目标的持续锁定状态进行计时,以统计确定出对战目标的第二锁定时长。之后,可以根据对战目标的第二锁定时长,如判断第二锁定时长是否超过第二预设值如1s来确定对战目标是否被摧毁。当发射摧毁之后,对战目标仍然被锁定第二预设值的时长,则确定对战目标的摧毁状态为处于已摧毁状态。
在优选实现例中,当确定对战目标的摧毁状态为处于已摧毁状态时,还可以向对战目标输出已摧毁信息,以提示敌对方其无人机已被摧毁。进一步地,在接收到已摧毁信息的提示时,可以进一步执行响应于接收到的已摧毁信息,输出已被摧毁的提示信息。其中,向对战目标输出已摧毁信息例如可以是通过将已摧毁信息添加到第一对战参数信息中发送给对战 目标,这样就可以通过判断接收到的第二对战参数信息中是否包括已被摧毁的信息来进行已被摧毁的提醒。由此,就可以在将对战目标摧毁时,通过在对战目标的无人机对战控制装置的显示模块10上显示已被摧毁的提示信息,或在无人机对战控制装置上播放已被摧毁的提示音,来告知对方无人机的摧毁状态,当确认其中一方对战无人机被摧毁时,则实现竞技游戏胜负的区分。
本发明实施例通过对模拟锁定的时长进行统计和监测,来实现对无人机竞技游戏中的摧毁过程的模拟,使得现有的无人机在无需改装硬件设备的情况下,就可以实现模拟锁定和模拟摧毁,完成竞技游戏模拟过程,提升无人机的趣味性,且能够提高用户竞技游戏的参与感,拓展无人机的应用范围。
在优选实现例中,上述无人机对战控制装置还可以实现为包括用于触发发射烟雾干扰用户指令的第二输入模块,示例性地,该第二输入模块可以是显示界面上的虚拟按钮或其他可供用户输入的菜单选项或可以是一个与处理器通信连接的按键。上述对战方法还可以包括:响应于接收到的烟雾干扰用户指令,将对战目标的隐藏状态添加入第一对战参数信息中。由此,无人机对战控制装置就可以根据接收到的第二对战参数信息判断对战目标是否选择了烟雾干扰雷达的功能。优选地,在判断接收到的第二对战参数信息中包括隐藏状态时,在目标追踪图中将对战目标的实时方位信息隐藏预设时长。其中,隐藏状态是指不在目标追踪图中显示对战目标的方位。当然,隐藏状态还可以进一步指在锁定状态下在锁定图标中不显示对战目标。示例性地,预设时长可以根据需求设定例如设定为3s。通过提供烟雾干扰功能可以为用户进行空战对战提供特殊对战技能,以通过隐藏自身无人机,来提升对战的趣味性和挑战性。
在其他优选实现例中,还可以通过构造虚拟战场图像输出来提升用户对战体验。其中,构造虚拟战场图像例如可以实现为预先存储虚拟战场构造元素,如虚拟大炮图、虚拟火箭图等,并在输出实时图像时,根据上述建立的实时图像坐标系,将虚拟战场构造元素添加显示到实时图像中,以得到虚拟展现图像输出显示。其中,上述的目标追踪图可以实现为在虚拟战场图像中进行输出展示,如目标追踪图为包括锁定图标和对战目标显示 图标的实现例中,可以将锁定图标和对战目标显示图标在虚拟战场图像或实时图像上进行显示;或在虚拟战场图像的预设位置输出显示,例如将目标追踪图显示在虚拟战场图像或实时图像的左下角或图3所示的右下角的位置(以虚拟雷达图为例)。图7示意性地显示了根据本发明一实施方式的无人机,如图7所示,本实施例中的无人机至少包括具有显示界面的操作终端70,机载定位模块71,摄像模块72,用于存储可执行指令的存储器74,和用于执行存储器中存储的可执行指令的处理器75。在具体应用中,用户可以通过操作终端70进行用户输入,通过操作终端70的显示界面(如为显示屏或投影显示界面)来查看呈现出的对战界面。而无人机根据用户的输入,配合其处理器75来执行相应的可执行指令,以实现将无人机应用在空战竞技游戏中,完成无人机对战游戏。
示例性地,本实施例的操作终端70可以为智能手机、平板电脑、智能眼镜或遥控器等。具体地,当需要利用无人机进行对战游戏时,用户可以通过操作终端70输入用户指令以启动无人机的游戏模式,并通过操作终端70输入对战目标信息(如用户信息)来建立与对战目标的通信关系。在建立了与对战目标的通信关系之后,无人机的处理器75即执行如下操作:
实时获取第一对战参数信息,并向对战目标实时传输第一对战参数信息;
实时接收来自对战目标的第二对战参数信息;
根据第一对战参数信息和第二对战参数信息在操作终端70的显示界面输出目标追踪图,其中,目标追踪图中显示有对战目标的实时方位信息。
其中,本实施例建立与对战目标的通信关系可以是在无人机的操作终端70之间建立通信关系,利用无人机的操作终端70进行数据传输。也可以是在接收到用户通过操作终端70输入的对战目标信息之后,根据用户输入的对战目标信息,由无人机的处理器75调用其通信模块直接建立与对战无人机的通信关系,以在参与对战的无人机之间直接进行实时数据传输。其中,通过操作终端70建立通信连接的具体方式可以参照前文方法部分叙述。
在其他实现例中,根据无人机的操作终端类型和其提供的可能的通信 方式,还可以通过其他的通信方式利用无人机的操作终端建立参与对战的无人机之间的通信关系,目的是使得参与对战的无人机之间能够进行数据通信,以在参与对战的无人机之间进行数据互传操作,因而,本发明实施例对参与对战的无人机之间建立通信关系的方式不加以限制,只要能够使得参与对战的无人机之间能够进行数据互传即可。
示例性,第一对战参数信息中包括有自身无人机的位置信息,第二对战参数信息中包括有对战目标的位置信息。其中,无人机的处理器75通过机载定位模块71来获取无人机的自身位置信息,并将该自身位置信息通过建立的通信关系发送给对战目标。该机载定位模块71示例性地可以为机载GPS。
在建立通信关系之后,无人机的处理器75可以通过建立的通信关系在对战目标之间互传包括自身无人机位置信息的第一对战参数信息,以使得对战的无人机获取到对战目标的定位信息。由此,无人机的处理器75就可以根据第一对战参数信息和第二对战参数信息绘制输出追踪目标图,以使得用户能够实通过追踪目标图实时看到对战目标的实时方位信息,方便其有目的地控制自身无人机的飞行方向,以准确、快速地追击目标,避免盲飞。
示例性地,在具体实现中,目标追踪图可以实现为虚拟雷达图。在获取第一对战参数信息和第二对战参数信息后,无人机的处理器75就可以基于自身无人机位置信息和对战目标的定位信息来确定对战目标所处的实时相对位置,并基于该实时相对位置来绘制输出虚拟雷达图。具体地,可以通过以自身无人机位置为中心原点,以对战目标和自身无人机的相差方位作为对战目标的坐标位置,来绘制输出虚拟雷达图,以在虚拟雷达图上展示自身为人机和对战目标的相对方位差。而用户根据该虚拟雷达图就可以清楚看到自身无人机与对战目标的方位差,其中,该虚拟雷达图的一种显示效果可以参照图3所示。
在具体实现例中,根据需求绘制的虚拟雷达图可以不局限为二维图像,示例性地,还可以为三维虚拟雷达图。以绘制二维图像为例,通过无人机的机载定位模块71获取到的位置信息包括自身经纬度数据,那么基于这些经纬度数据就可以计算出二者的方位差,将自身无人机的经纬度位 置信息转换为原点坐标(0,0),基于该原点坐标绘制圆环形雷达图,基于二者的方位差就可以确定对战目标在该雷达图中的方位坐标,基于该位置坐标就可以绘制出对战目标在雷达图中的相应位置,其展示效果如图3所示。以绘制三维图像为例,通过无人机的机载定位模块71获取到的位置信息包括自身经纬度数据和飞行高度数据,那么基于这些经纬度数据和飞行高度数据就可以计算出二者的方位差和高度差,将自身无人机的经纬度和飞行高度位置信息转换为原点坐标(0,0,0),基于该原点坐标绘制三维雷达图,基于二者的方位差和高度差就可以确定对战目标在该雷达图中的三维坐标,基于该位置坐标就可以绘制出对战目标在三维雷达图中的相应位置。
在另一优选实现例中,示例性地,目标追踪图还可以实现为包括锁定图标和对战目标显示图标。其中,同样地,对战无人机的位置信息可以是根据参与对战的无人机上装载的机载定位模块71来获取,并通过与对战目标的通信关系将该定位信息发送给对战目标,以使得参战方能够实时获取到对战目标的位置信息。
在获知到对战目标的位置信息之后,无人机的处理器还执行根据自身无人机的位置信息来确定对战目标所处的实时相对方位,并基于该实时相对方位来在操作终端的显示界面上输出锁定图标和对战目标显示图标的操作,其中,对战目标显示图标是在锁定图标的实时相对方位处进行输出展示。示例性地,通过无人机的机载定位模块71获取到的位置信息包括自身经纬度数据,那么基于这些经纬度数据就可以对战目标所处的实时相对方位,例如实时相对方位为对战目标处于自身无人机的前方、后方、左面、右面等方位,由此就可以根据该相对方位来显示对战目标显示图标,如当判断实时相对方位为对战目标处于自身无人机的前方时,就将对战目标显示图标显示在锁定图标的前方;当判断实时相对方位为对战目标处于自身无人机的左方时,就将对战目标显示图标显示在锁定图标的左方。其中,需要说明的是,本发明实施例中的锁定图标是指用于框定锁定目标的图标(如图3中所示的方框即为锁定图标的一种示例),处于图标框定范围内的目标物即代表锁定了该目标物,而处于图标框定范围之外的目标物即代表并未锁定该目标物;本发明实施例中的对战目标显示图标是用于标 识对战目标的显示图标,其可以是对战目标的用户名、无人机代号、无人机代码等实现方式。在根据第一对战参数和第二对战参数确定出对战目标的实时相对方位后,在未锁定状态下,就可以通过将对战目标显示图标显示在锁定图标的相应方位,以使得用户根据该图标显示位置就可以清楚看到自身无人机与对战目标的方位差,从而更有目的地控制自身无人机的飞行方向,避免盲飞。在优选实现例中,锁定图标和对战目标显示图标也可以以三维图像的方式进行显示,此时参考的位置信息还包括飞机的飞行高度。
在另一优选实现例中,无人机的处理器75还执行如下操作:
在将对战目标显示图标在锁定图标的相对方位进行输出展示时,还根据确定的对战目标所处的实时相对方位,在锁定图标中显示追踪方向提示消息。
其中,追踪方向提示消息可以为文字或箭头或文字与箭头的结合。以箭头为例,如当判断实时相对方位为对战目标处于自身无人机的后方时,就将对战目标显示图标显示在锁定图标的后方,同时在锁定图标中显示向后指示的箭头,以指示用户对战目标在后面,要向后方飞行(例如通过向后空翻)。
在其他实现例中,无人机的处理器75还执行如下操作:
在锁定图标中显示追踪提示消息的同时,还将锁定图标进行变色显示。
例如,将锁定图标以不同于锁定状态的颜色进行显示,以提醒用户并未锁定对战目标。示例性地,在锁定状态时,将锁定图标以红色显示,而在追踪状态时,当判断实时相对方位为对战目标处于自身无人机的后方时,就将对战目标显示图标显示在锁定图标的后方,并将锁定图标显示为黄色。
在另一实现例中,无人机的处理器75还执行如下操作:
获取对战目标检测图标,将对战目标进行模拟锁定。
具体地,无人机的处理器75可以调用摄像模块72(示例性地为无人机上设置的相机或摄像头)来捕获实时图像信息,并由无人机处理器75或摄像模块72对捕获的实时图像信息进行物体检测识别,根据识别结果 在确认识别到对战目标时,获取对战目标检测图标,并将对战目标检测图标在操作终端70的显示界面上以锁定状态输出展示,从而实现对无人机对战游戏中锁定过程的模拟。其中,对战目标检测图标具体可以为包括实时图像坐标系、对战目标在实时图像中所处的坐标位置和显示图标。该模拟锁定状态的一种展示效果可以参照图5所示。
作为一种优选实现例,由无人机处理器75或摄像模块72对捕获的实时图像信息进行物体检测识别可以实现为:预先进行无人机数据库标定,并在数据库中存储无人机标定信息,其中,标定信息包括无人机机型(可以为机型标识)和机身形态(例如为一组不同状态下的机身形态视图照片以及对应的文字标定信息如是否有保护罩、机翼震动形态、是否有螺旋桨等);在捕获到实时的图像信息时,基于无人机数据库的标定信息和/第二对战参数信息对实时图像进行物体检测,以识别确定出对战目标,并生成对战目标检测图标。具体地,在进行数据互传时,会将自身的无人机机型添加到第一对战参数信息中传输给对战目标,由此自身无人机就可以基于接收到的第二对战参数信息获知到对战目标的无人机机型。基于该无人机机型就可以从预先标定存储的标定信息中获知到对战目标的机身形态。由此,就可以根据获取的机身形态对实时图像进行物体检测识别,以确定图像中是否包括对战目标。
其中,无人机数据库标定信息可以存储在无人机上(如存储在存储器中),也可以存储在云端服务器,通过互联网络连接云端服务器来获取数据库中的标定信息,以进行物体检测识别。
其中,在无人机上生成对战目标检测图标并将对战目标检测图标在操作终端70的显示界面上以锁定状态输出展示的过程可以为:根据实时图像的像素点建立实时图像坐标系,如假设实时图像包括有3*3个像素点,就根据其像素点矩阵以左下角作为原点、以像素点为基本坐标度量单位建立坐标系;根据通过物体检测识别检测到的对战目标在实时图像中所处的位置,即对应的像素点矩阵位置,来确定对战目标在实时图像坐标系中的坐标位置,将该坐标位置作为对战目标显示图标的显示位置,并通过图像回传向操作终端回传实时图像坐标系、对战目标在实时图像坐标系中的坐标位置和对战目标显示图标,以将对战目标的显示图标以锁定状态在该显 示位置输出展示在无人机的操作终端上。其中,将对战目标的显示图标以锁定状态在该显示位置输出展示是指在该显示位置,将对战目标的显示图标显示在锁定图标中。
在优选实现例中,无人机的处理器75对实时图像信息进行物体检测识别以确定出对战目标的实现方式,还可以不局限于上述基于对战目标的无人机机型和预先存储的无人机标定信息进行物体检测识别的方式,而是可以根据无人机的定位信息和飞行信息,例如根据无人机的飞行速度、飞行高度和经纬度信息来确定对战目标的位置,将符合该位置和飞行信息的无人机确定为对战目标,如无人机的处理器75还可以根据无人机的飞行高度和经纬度信息从实时图像的相应位置来找到符合特点的目标物体,之后根据飞行速度来进一步判断符合位置特点的目标物体是否为对战目标。其中,无人机的飞行速度、飞行高度和经纬度信息可以作为第一对战参数信息实时传输给对战目标,由此,无人机的处理器75就可以从第二对战参数信息中获知到对战目标的飞行速度、飞行高度和经纬度信息。
在其他实现例中,无人机的处理器75还可以通过将上述的基于对战目标的无人机机型和预先存储的无人机标定信息进行物体检测识别的方式与根据无人机的飞行速度、飞行高度和经纬度信息来确定对战目标的方式相结合来检测对战目标,以进一步提高对对战目标的识别准确率。
在另外的实现例中,无人机的处理器75还可以根据检测目标与自身无人机的距离来执行不同的识别策略。如对较近距离(如500米以内的)的无人机,无人机的处理器75采用基于对战目标的无人机机型和预先存储的无人机标定信息进行物体检测识别的方式来确定对战目标;而对于较远距离(如500米以外的)的无人机,无人机的处理器75则采用基于无人机的飞行速度、飞行高度和经纬度信息来确定对战目标的方式实现对战目标检测。
在其他实现例中,无人机的处理器75还执行如下操作:
在将对战目标以锁定状态输出显示时,还向对战目标输出已被锁定的信息。
示例性地,可以通过将对战目标的已被锁定的状态信息添加入第一对战参数信息中,以将对战目标已被锁定的信息实时传送给对战目标。由此, 无人机就可以根据接收到的第二对战参数信息判断自身无人机是否已被锁定。
优选地,无人机的处理器75还执行如下操作:
响应于接收到的已被锁定的消息,例如在判断接收到的第二对战参数信息中包括锁定状态时,输出已被锁定的提示信息。
这样用户就能够及时收到被锁定的提醒和基于提醒进行相应的控制,从而躲避对方的追击,提升用户对战体验的趣味性。示例性地,输出的已被锁定的提示信息可以为声音提示,如播放“已被锁定”的提示语音或播放“滴滴”的提示音等。在其他实现例中,输出的已被锁定的提示也可以为在操作终端上进行屏幕闪烁提示,或者飞行控制建议提示,如在操作终端的显示界面上显示“是否空翻躲避”等提示消息,本发明实施例不对此进行限制。
在其他实现例中,无人机的处理器75还执行如下操作:
监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁。
具体地,无人机的处理器75可以分两个阶段来实施模拟摧毁过程。示例性地,第一个阶段实现为确认是否为可发射摧毁状态,第二个阶段实现为确认是否已摧毁的状态。
在第一个阶段,无人机处理器75在根据获取到的检测图标将对战目标以锁定状态输出显示之后,就开始对该锁定状态进行计时,以统计得到第一锁定时长,并根据对战目标的第一锁定时长,确定对战目标是否为可发射摧毁状态,示例性地,根据对战目标的第一锁定时长是否超过第一预设值如1s来确定对战目标是否为可发射摧毁状态,当判断第一锁定时长超过第一预设值则认定为可发射摧毁状态。
在确定对战目标为可发射摧毁状态时,可以通过在操作终端70上输出提示消息即摧毁触发提示来提醒用户可发射摧毁。示例性地,操作终端70包括有音频模块和喇叭,输出的摧毁触发提示可以为声音提示,如播放“锁定目标,出击”的提示语音或播放“哔哔”的提示音等进行可发射摧毁的触发提示。在其他实现例中,输出的摧毁触发提示也可以为在操作终端70的显示界面上进行屏幕闪烁提示。在另一可选实现例中,操作终端 70包括有用于接收用户输入指令的第一输入终端,输出的摧毁触发提示也可以为在可以触发发射摧毁的第一输入终端上进行指引性提示,如在输入按钮上闪烁灯光或显示有特定颜色的标识框等,本发明实施例不对此进行限制。
在操作终端70包括有用于接收用户输入指令的第一输入终端的实现例中,无人机的处理器75还可以通过该第一输入终端接收用户输入的触发摧毁的用户指令,示例性地,该第一输入终端可以是操作终端70的显示界面上的虚拟按钮或其他可供用户输入的菜单选项或可以是操作终端70上的按键。在其他实现例中,第一输入终端也可以不局限于是设于操作终端上的,还可以是一个单独的输入终端,如操作终端70为与无人机处理器75通信连接的眼镜,而第一输入终端为一个与无人机处理器75通信连接的遥控器按钮。这样,在第二个阶段,就可以根据用户对第一阶段的摧毁触发提示的反馈来进行摧毁过程的模拟。优选地可以为,无人机的处理器75在确定对战目标为可发射摧毁状态之后,响应于接收到的用于发射摧毁的用户指令,如用户通过遥控器的快捷按钮或手机app端的菜单选项或眼镜端的显示界面上的菜单选项触发发射摧毁之后,就开始对对战目标的持续锁定状态进行计时,以统计确定出对战目标的第二锁定时长。之后,可以根据对战目标的第二锁定时长,如判断第二锁定时长是否超过第二预设值如1s来确定对战目标是否被摧毁。当发射摧毁之后,对战目标仍然被锁定第二预设值的时长,则确定对战目标的摧毁状态为处于已摧毁状态。
在优选实现例中,当确定对战目标的摧毁状态为处于已摧毁状态时,无人机的处理器75还可以向对战目标输出已摧毁信息,以提示敌对方其无人机已被摧毁。进一步地,无人机的处理器75在接收到已摧毁信息的提示时,可以继续执行输出接收到的已被摧毁的提示信息,如在操作终端70上输出提示消息。其中,向对战目标输出已摧毁信息例如可以是通过将已摧毁信息添加到第一对战参数信息中发送给对战目标,这样就可以通过判断接收到的第二对战参数信息中是否包括已被摧毁的信息来进行已被摧毁的提醒。由此,就可以在将对战目标摧毁时,通过在对战目标的操作终端70上显示已被摧毁的提示信息,或在无人机或者操作终端70上播放 已被摧毁的提示音,来告知对方无人机的摧毁状态,当确认其中一方对战无人机被摧毁时,则实现竞技游戏胜负的区分。
本发明实施例通过以锁定状态在操作终端显示输出对战目标检测图标,实现对无人机对战游戏中的锁定过程的模拟。通过对模拟锁定的时长进行统计和监测,来实现对无人机竞技游戏中的摧毁过程的模拟,使得现有的无人机在无需改装硬件设备的情况下,就可以实现模拟锁定和模拟摧毁,完成竞技游戏模拟过程,提升无人机的趣味性,且能够提高用户竞技游戏的参与感,拓展无人机的应用范围。
在优选实现例中,无人机的处理器75还执行如下操作:
响应于接收到的烟雾干扰用户指令,将对战目标的隐藏状态添加入第一对战参数信息中。
由此,无人机就可以根据接收到的第二对战参数信息判断对战目标是否选择了烟雾干扰雷达的功能。
优选地,无人机的处理器75还执行如下操作:
在判断接收到的第二对战参数信息中包括隐藏状态时,在目标追踪图中将对战目标的实时方位信息隐藏预设时长。
其中,隐藏状态是指不在目标追踪图中显示对战目标的方位或在锁定状态下在锁定图标中不显示对战目标,预设时长可以根据需求设定例如3s。通过提供烟雾干扰功能可以为用户进行空战对战提供特殊对战技能,如隐藏技能,使得对战体验更有趣和富有挑战性。在具体实现中,无人机还可以实现为包括用于触发发射烟雾干扰用户指令的第二输入终端,第二输入终端例如为输入按键或在操作终端的显示界面上提供的菜单选项。其中,该第二输入终端可以为与上述第一输入终端相同,也可以为不同的输入终端。
在其他优选实现例中,无人机的处理器75还执行如下操作:
构造虚拟战场图像在操作终端上输出,并将目标追踪图在虚拟战场图像中或在虚拟战场图像的预设位置输出显示。
由此可以通过构造虚拟战场图像输出来提升用户对战体验。其中,构造虚拟战场图像例如可以实现为预先存储虚拟战场构造元素,如虚拟大炮图、虚拟火箭图等,并在输出实时图像时,根据上述建立的实时图像坐标 系,将虚拟战场构造元素添加显示到实时图像中,以得到虚拟展现图像输出显示。其中,上述的目标追踪图可以实现为在虚拟战场图像中进行输出展示,如目标追踪图为包括锁定图标和对战目标显示图标的实现例中,可以将锁定图标和对战目标显示图标在虚拟战场图像或实时图像上进行显示;或在虚拟战场图像的预设位置输出显示,例如将目标追踪图显示在虚拟战场图像或实时图像的左下角或图3所示的右下角的位置(以虚拟雷达图为例)。
在一些实施例中,本发明实施例提供一种非易失性计算机可读存储介质,所述存储介质中存储有一个或多个包括执行指令的程序,所述执行指令能够被电子设备(包括但不限于计算机,服务器,或者网络设备等)读取并执行,以用于执行本发明上述任一项实施例的无人机对战方法。
在一些实施例中,本发明实施例还提供一种计算机程序产品,所述计算机程序产品包括存储在非易失性计算机可读存储介质上的计算机程序,所述计算机程序包括程序指令,当所述程序指令被计算机执行时,使所述计算机执行上述任一项实施例的无人机对战方法。
在一些实施例中,本发明实施例还提供一种电子设备,其包括:至少一个处理器,以及与所述至少一个处理器通信连接的存储器,其中,所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器能够执行上述任一实施例的无人机对战方法。
在一些实施例中,本发明实施例还提供一种存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现上述任一项实施例的无人机对战方法。
图8是本申请另一实施例提供的执行无人机对战方法的电子设备的硬件结构示意图,如图8所示,该设备包括:
一个或多个处理器610以及存储器620,图8中以一个处理器610为例。
执行无人机对战方法的设备还可以包括:输入装置630和输出装置640。
处理器610、存储器620、输入装置630和输出装置640可以通过总 线或者其他方式连接,图8中以通过总线连接为例。
存储器620作为一种非易失性计算机可读存储介质,可用于存储非易失性软件程序、非易失性计算机可执行程序以及模块,如本申请实施例中的无人机对战方法对应的程序指令/模块。处理器610通过运行存储在存储器620中的非易失性软件程序、指令以及模块,从而执行服务器的各种功能应用以及数据处理,即实现上述方法实施例的无人机对战方法。
存储器620可以包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需要的应用程序;存储数据区可存储根据无人机对战方法的使用所创建的数据等。此外,存储器620可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。在一些实施例中,存储器620可选包括相对于处理器610远程设置的存储器,这些远程存储器可以通过网络连接至无人机对战控制装置。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
输入装置630可接收输入的数字或字符信息,以及产生与无人机对战控制装置的用户设置以及功能控制有关的信号。输出装置640可包括显示屏等显示设备。
所述一个或者多个模块存储在所述存储器620中,当被所述一个或者多个处理器610执行时,执行上述任意方法实施例中的无人机对战方法。
上述产品可执行本申请实施例所提供的方法,具备执行方法相应的功能模块和有益效果。未在本实施例中详尽描述的技术细节,可参见本申请实施例所提供的方法。
本申请实施例的电子设备以多种形式存在,包括但不限于:
(1)移动通信设备:这类设备的特点是具备移动通信功能,并且以提供话音、数据通信为主要目标。这类终端包括:智能手机(例如iPhone)、多媒体手机、功能性手机,以及低端手机等。
(2)超移动个人计算机设备:这类设备属于个人计算机的范畴,有计算和处理功能,一般也具备移动上网特性。这类终端包括:PDA、MID和UMPC设备等,例如iPad。
(3)便携式娱乐设备:这类设备可以显示和播放多媒体内容。该类设备 包括:音频、视频播放器(例如iPod),掌上游戏机,电子书,以及智能玩具和便携式车载导航设备。
(4)服务器:提供计算服务的设备,服务器的构成包括处理器、硬盘、内存、系统总线等,服务器和通用的计算机架构类似,但是由于需要提供高可靠的服务,因此在处理能力、稳定性、可靠性、安全性、可扩展性、可管理性等方面要求较高。
(5)其他具有数据交互功能的电子装置。
以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到各实施方式可借助软件加通用硬件平台的方式来实现,当然也可以通过硬件。基于这样的理解,上述技术方案本质上或者说对相关技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在计算机可读存储介质中,如ROM/RAM、磁碟、光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行各个实施例或者实施例的某些部分所述的方法。
最后应说明的是:以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围。

Claims (57)

  1. 无人机对战方法,适用于用无人机进行空战竞技游戏的场景,该方法包括:
    实时获取第一对战参数信息,并向对战目标实时传输所述第一对战参数信息;
    实时接收来自对战目标的第二对战参数信息;
    根据所述第一对战参数信息和第二对战参数信息生成目标追踪图输出展示,其中,所述目标追踪图中显示有对战目标的实时方位信息。
  2. 根据权利要求1所述的无人机对战方法,其中,所述目标追踪图实现为包括虚拟雷达图,根据所述第一对战参数信息和第二对战参数信息生成目标追踪图输出展示包括
    根据第一对战参数信息和第二对战参数信息确定对战目标所处的实时相对位置;
    以自身无人机位置为中心绘制虚拟雷达图输出,并在虚拟雷达图中以自身无人机位置为中心显示对战目标所处的实时相对位置。
  3. 根据权利要求1或2所述的无人机对战方法,其中,所述目标追踪图实现为包括锁定图标和对战目标显示图标,根据所述第一对战参数信息和第二对战参数信息生成目标追踪图输出展示包括
    根据第一对战参数信息和第二对战参数信息确定对战目标所处的实时相对方位;
    将所述对战目标显示图标在所述锁定图标的实时相对方位处进行输出展示。
  4. 根据权利要求3所述的无人机对战方法,其中,在将所述对战目标显示图标在所述锁定图标的相对方位进行输出展示时,还根据确定的对战目标所处的实时相对方位,在所述锁定图标中显示追踪方向提示消息。
  5. 根据权利要求4所述的无人机对战方法,其中,在所述锁定图标中显示追踪提示消息的同时,还将所述锁定图标进行变色显示。
  6. 根据权利要求3所述的无人机对战方法,其特征在于,还包括:
    获取对战目标检测图标,将对战目标进行模拟锁定。
  7. 根据权利要求6所述的无人机对战方法,其特征在于,还包括:
    监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁。
  8. 根据权利要求7所述的无人机对战方法,其特征在于,所述获取对战目标检测图标,将对战目标进行模拟锁定包括:
    接收自身无人机通过图传回传的对战目标检测图标;
    根据接收到的检测图标将对战目标以锁定状态输出显示。
  9. 根据权利要求7所述的无人机对战方法,其特征在于,所述获取对战目标检测图标,将对战目标进行模拟锁定包括:
    接收自身无人机通过图传回传的实时图像;
    根据预先存储的无人机标定信息和/或第二对战参数对实时图像进行对战目标物体检测,在确认当前图像中包括有对战目标时,生成对战目标检测图标,将对战目标以锁定状态输出显示。
  10. 根据权利要求8或9所述的无人机对战方法,其特征在于,监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁包括:
    根据对战目标的第一锁定时长,确定对战目标是否为可发射摧毁状态;
    在确定对战目标为可发射摧毁状态时,输出摧毁触发提示。
  11. 根据权利要求10所述的无人机对战方法,其特征在于,监测对 战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁还包括
    在确定对战目标为可发射摧毁状态时,响应于接收到的用于发射摧毁的用户指令,确定对战目标的第二锁定时长;
    根据第二锁定时长确定对战目标的摧毁状态;
    在确定对战目标的摧毁状态为处于已摧毁状态时,向对战目标输出已摧毁信息。
  12. 根据权利要求11所述的无人机对战方法,其特征在于,还包括
    响应于接收到的已摧毁信息,输出已被摧毁的提示信息。
  13. 根据权利要求8或9所述的无人机对战方法,其特征在于,还包括:
    在将对战目标以锁定状态输出显示时,向对战目标输出已被锁定的信息。
  14. 根据权利要求13所述的无人机对战方法,其特征在于,还包括
    响应于接收到的已被锁定的信息,输出已被锁定的提示信息。
  15. 根据权利要求1所述的无人机对战方法,其特征在于,还包括:
    响应于接收到的烟雾干扰用户指令,将对战目标的隐藏状态添加入第一对战参数信息中。
  16. 根据权利要求15所述的无人机对战方法,其特征在于,还包括根据接收到的第二对战参数信息中包括的隐藏状态,在目标追踪图中将对战目标的实时方位信息隐藏预设时长。
  17. 根据权利要求1所述的无人机对战方法,其特征在于,还包括:
    构造虚拟战场图像输出,并将所述目标追踪图在所述虚拟战场图像中或在所述虚拟战场图像的预设位置输出显示。
  18. 根据权利要求9所述的无人机对战方法,其特征在于,所述无人机标定信息包括无人机机型和机身形态信息。
  19. 无人机对战控制装置,用于控制无人机进行空战竞技游戏,其特征在于,包括:
    显示模块;
    存储器,用于存储可执行指令;以及
    处理器,用于执行存储器中存储的可执行指令,所述可执行指令在由所述处理器执行时使得所述处理器:
    实时获取第一对战参数信息,并向对战目标实时传输所述第一对战参数信息;
    实时接收来自对战目标的第二对战参数信息;
    根据所述第一对战参数信息和第二对战参数信息生成目标追踪图在所述显示模块输出显示,其中,所述目标追踪图中显示有对战目标的实时方位信息。
  20. 根据权利要求19所述的无人机对战控制装置,其特征在于,所述处理器执行根据所述第一对战参数信息和第二对战参数信息生成目标追踪图在所述显示模块输出显示时,具体用于:
    根据第一对战参数信息和第二对战参数信息确定对战目标所处的实时相对位置;
    以自身无人机位置为中心绘制虚拟雷达图在所述显示模块输出,并在虚拟雷达图中以自身无人机位置为中心显示对战目标所处的实时相对位置。
  21. 根据权利要求19或20所述的无人机对战控制装置,其特征在于,所述处理器执行根据所述第一对战参数信息和第二对战参数信息生成目标追踪图在所述显示模块输出显示时,具体用于:
    根据第一对战参数信息和第二对战参数信息确定对战目标所处的实 时相对方位;
    在所述显示模块中,将所述对战目标显示图标在所述锁定图标的实时相对方位处进行输出展示。
  22. 根据权利要求21所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    根据确定的对战目标所处的实时相对方位,在所述锁定图标中显示追踪方向提示消息。
  23. 根据权利要求22所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    在所述锁定图标中显示追踪提示消息的同时,还将所述锁定图标进行变色显示。
  24. 根据权利要求21所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    获取对战目标检测图标,将对战目标进行模拟锁定。
  25. 根据权利要求24所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁。
  26. 根据权利要求25所述的无人机对战控制装置,其特征在于,所述处理器执行获取对战目标检测图标,将对战目标进行模拟锁定时,具体用于:
    接收自身无人机通过图传回传的对战目标检测图标;
    根据接收到的检测图标将对战目标以锁定状态在所述显示模块输出显示。
  27. 根据权利要求25所述的无人机对战控制装置,其特征在于,所述处理器执行获取对战目标检测图标,将对战目标进行模拟锁定时,具体用于:
    接收自身无人机通过图传回传的实时图像;
    根据预先存储的无人机标定信息和/或第二对战参数对实时图像进行对战目标物体检测,在确认当前图像中包括有对战目标时,生成对战目标检测图标,将对战目标以锁定状态输出显示。
  28. 根据权利要求26或27所述的无人机对战控制装置,其特征在于,所述处理器在执行监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁时,具体用于:
    根据对战目标的第一锁定时长,确定对战目标是否为可发射摧毁状态;
    在确定对战目标为可发射摧毁状态时,输出摧毁触发提示。
  29. 根据权利要求28所述的无人机对战控制装置,其特征在于,所述处理器在执行监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁时,具体还用于::
    在确定对战目标为可发射摧毁状态时,响应于接收到的用于发射摧毁的用户指令,确定对战目标的第二锁定时长;
    根据第二锁定时长确定对战目标的摧毁状态;
    在确定对战目标的摧毁状态为处于已摧毁状态时,向对战目标输出已摧毁信息。
  30. 根据权利要求29所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    响应于接收到的已摧毁信息,输出已被摧毁的提示信息。
  31. 根据权利要求29所述的无人机对战控制装置,其特征在于,还包括
    用于触发所述用于发射摧毁的用户指令的第一输入模块。
  32. 根据权利要求26或27所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    在将对战目标以锁定状态输出显示时,还向对战目标输出已被锁定的信息。
  33. 根据权利要求32所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    响应于接收到的已被锁定的信息,输出已被锁定的提示信息。
  34. 根据权利要求19所述的无人机对战控制装置,其特征在于,还包括
    用于触发所述用于发射烟雾干扰用户指令的第二输入模块;
    所述可执行指令在由所述处理器执行时还使得所述处理器:
    响应于接收到的烟雾干扰用户指令,将对战目标的隐藏状态添加入第一对战参数信息中。
  35. 根据权利要求34所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    根据接收到的第二对战参数信息中包括的隐藏状态,在目标追踪图中将对战目标的实时方位信息隐藏预设时长。
  36. 根据权利要求1所述的无人机对战控制装置,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    构造虚拟战场图像输出,并将所述目标追踪图在所述虚拟战场图像中或在所述虚拟战场图像的预设位置输出显示。
  37. 根据权利要求27所述的无人机对战控制装置,其特征在于,所述无人机标定信息包括无人机机型和机身形态信息。
  38. 无人机,能够用于空战竞技游戏,其特征在于,包括:
    具有显示界面的操作终端;
    机载定位模块;
    存储器,用于存储可执行指令;以及
    处理器,用于执行存储器中存储的可执行指令,所述可执行指令在由所述处理器执行时使得所述处理器:
    实时获取第一对战参数信息,并向对战目标实时传输所述第一对战参数信息;
    实时接收来自对战目标的第二对战参数信息;
    根据所述第一对战参数信息和第二对战参数信息在所述操作终端输出目标追踪图,其中,所述目标追踪图中显示有对战目标的实时方位信息。
  39. 根据权利要求38所述的无人机,其特征在于,所述处理器执行根据所述第一对战参数信息和第二对战参数信息输出目标追踪图时,具体用于:
    根据第一对战参数信息和第二对战参数信息确定对战目标所处的实时相对位置;
    以自身无人机位置为中心绘制虚拟雷达图在所述操作终端输出,并在虚拟雷达图中以自身无人机位置为中心显示对战目标所处的实时相对位置。
  40. 根据权利要求38或39所述的无人机,其特征在于,所述处理器执行根据所述第一对战参数信息和第二对战参数信息在所述操作终端输出目标追踪图时,具体用于:
    根据第一对战参数信息和第二对战参数信息确定对战目标所处的实时相对方位;
    在所述操作终端上,将所述对战目标显示图标在所述锁定图标的实时相对方位处进行输出展示。
  41. 根据权利要求40所述的无人机,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    根据确定的对战目标所处的实时相对方位,在所述锁定图标中显示追踪方向提示消息。
  42. 根据权利要求41所述的无人机,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    在所述锁定图标中显示追踪提示消息的同时,还将所述锁定图标进行变色显示。
  43. 根据权利要求40所述的无人机,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    获取对战目标检测图标,将对战目标进行模拟锁定。
  44. 根据权利要求43所述的无人机,其特征在于,所述可执行指令在由所述处理器执行时还使得所述处理器:
    监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁。
  45. 根据权利要求44所述的无人机,其特征在于,所述处理器执行获取对战目标检测图标,将对战目标进行模拟锁定时,具体用于:
    根据预先配置存储的无人机标定和/或第二对战参数对实时图像进行对战目标物体检测,在确认当前图像中包括有对战目标时,生成对战目标检测图标,将对战目标以锁定状态在所述操作终端输出显示。
  46. 根据权利要求45所述的无人机,其特征在于,所述处理器在执行监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁时,具体用于:
    根据对战目标的第一锁定时长,确定对战目标是否为可发射摧毁状态;
    在确定对战目标为可发射摧毁状态时,输出摧毁触发提示。
  47. 根据权利要求46所述的无人机,其特征在于,所述处理器在执行监测对战目标的模拟锁定时长,根据模拟锁定时长,对对战目标进行模拟摧毁时,具体还用于:
    在确定对战目标为可发射摧毁状态时,响应于接收到的用于发射摧毁的用户指令,确定对战目标的第二锁定时长;
    根据第二锁定时长确定对战目标的摧毁状态;
    在确定对战目标的摧毁状态为处于已摧毁状态时,向对战目标输出已摧毁信息。
  48. The UAV according to claim 47, wherein the executable instructions, when executed by the processor, further cause the processor to:
    output, in response to received destroyed information, prompt information indicating that it has been destroyed.
  49. The UAV according to claim 48, further comprising:
    a first input terminal configured to trigger the user instruction for launching a destruction.
  50. The UAV according to claim 45, wherein the executable instructions, when executed by the processor, further cause the processor to:
    output locked information to the battle target when the battle target is displayed in the locked state.
  51. The UAV according to claim 50, wherein the executable instructions, when executed by the processor, further cause the processor to:
    output, in response to received locked information, prompt information indicating that it has been locked.
  52. The UAV according to claim 38, further comprising:
    a second input terminal configured to trigger the user instruction for launching smoke interference;
    wherein the executable instructions, when executed by the processor, further cause the processor to:
    add, in response to a received smoke interference user instruction, a hidden state of the battle target into the first battle parameter information.
  53. The UAV according to claim 52, wherein the executable instructions, when executed by the processor, further cause the processor to:
    hide, for a preset duration, the real-time bearing information of the battle target in the target tracking map according to a hidden state included in the received second battle parameter information.
  54. The UAV according to claim 38, wherein the executable instructions, when executed by the processor, further cause the processor to:
    construct a virtual battlefield image, output it on the operation terminal, and display the target tracking map in the virtual battlefield image or at a preset position of the virtual battlefield image.
  55. The UAV according to claim 45, wherein the UAV calibration information includes UAV model and airframe form information.
  56. An electronic device, comprising: at least one processor, and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the steps of the method according to any one of claims 1-18.
  57. A storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the steps of the method according to any one of claims 1-18.
PCT/CN2020/117741 2020-09-25 2020-09-25 无人机对战方法、无人机对战控制装置、无人机及存储介质 WO2022061712A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/117741 WO2022061712A1 (zh) 2020-09-25 2020-09-25 无人机对战方法、无人机对战控制装置、无人机及存储介质
CN202080008657.6A CN113395999A (zh) 2020-09-25 2020-09-25 无人机对战方法、无人机对战控制装置、无人机及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/117741 WO2022061712A1 (zh) 2020-09-25 2020-09-25 无人机对战方法、无人机对战控制装置、无人机及存储介质

Publications (1)

Publication Number Publication Date
WO2022061712A1 (zh)

Family

ID=77616649

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/117741 WO2022061712A1 (zh) 2020-09-25 2020-09-25 无人机对战方法、无人机对战控制装置、无人机及存储介质

Country Status (2)

Country Link
CN (1) CN113395999A (zh)
WO (1) WO2022061712A1 (zh)


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009000286A (ja) * 2007-06-21 2009-01-08 Taito Corp ゲームシステム及び遠隔操作可能なゲームロボット
CN102205180A (zh) * 2010-03-29 2011-10-05 田瑜 航模空战系统及其使用方法
US20180280780A1 (en) * 2015-09-30 2018-10-04 Nikon Corporation Flying device, moving device, server and program
CN105597308B (zh) * 2015-10-29 2019-01-22 上海圣尧智能科技有限公司 一种无人机、模拟空战游戏设备和模拟空战游戏系统
US9816783B1 (en) * 2016-01-07 2017-11-14 DuckDrone, LLC Drone-target hunting/shooting system
CN108646589B (zh) * 2018-07-11 2021-03-19 北京晶品镜像科技有限公司 一种攻击无人机编队的作战模拟训练系统及方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070172194A1 (en) * 2006-01-25 2007-07-26 Nobuhiro Suzuki Program for controlling display of simulation video digest
US20070173304A1 (en) * 2006-01-25 2007-07-26 Mcilvain Scott H Electronic game device with hand and foot controls
CN106377901A (zh) * 2016-11-07 2017-02-08 王天尊 一种互联网智能终端红外对战系统及其交互方法
CN110588964A (zh) * 2019-10-08 2019-12-20 台州万际航空器科技有限公司 一种竞技无人机
CN110732132A (zh) * 2019-11-11 2020-01-31 淮安鱼鹰航空科技有限公司 一种机车空地对抗娱乐系统
CN110755842A (zh) * 2019-11-11 2020-02-07 淮安鱼鹰航空科技有限公司 一种双机对抗娱乐系统

Also Published As

Publication number Publication date
CN113395999A (zh) 2021-09-14

Similar Documents

Publication Publication Date Title
US10302397B1 (en) Drone-target hunting/shooting system
CN110917619B (zh) 互动道具控制方法、装置、终端及存储介质
US9884254B2 (en) Augmented reality gaming systems and methods
CN112090069B (zh) 虚拟场景中的信息提示方法、装置、电子设备及存储介质
CN110507994B (zh) 控制虚拟飞行器飞行的方法、装置、设备及存储介质
WO2019228038A1 (zh) 定位信息提示方法和装置、存储介质及电子装置
US11944904B2 (en) Data synchronization method and apparatus, terminal, server, and storage medium
CN110507990B (zh) 基于虚拟飞行器的互动方法、装置、终端及存储介质
US9833695B2 (en) System and method for presenting a virtual counterpart of an action figure based on action figure state information
US20150170540A1 (en) Weapons training system and methods for operating same
CN106075915B (zh) 一种可以接收多个方向射击激光束的无人机空中对战装置
CN111589150A (zh) 虚拟道具的控制方法、装置、电子设备及存储介质
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
US20070243914A1 (en) Toy combat gaming system
WO2021143290A1 (zh) 虚拟道具的显示方法和装置、存储介质及电子装置
CN110876849A (zh) 虚拟载具的控制方法、装置、设备及存储介质
CN111659116A (zh) 虚拟载具的控制方法、装置、设备及介质
CN111111218A (zh) 虚拟无人机的控制方法和装置、存储介质及电子装置
US20210031109A1 (en) Augmented reality gaming system
CN113022884A (zh) 无人机载荷试验仿真方法及系统
WO2022061712A1 (zh) 无人机对战方法、无人机对战控制装置、无人机及存储介质
CN113457151A (zh) 虚拟道具的控制方法、装置、设备及计算机可读存储介质
CN112704875A (zh) 虚拟道具控制方法、装置、设备及存储介质
CN110960849B (zh) 互动道具控制方法、装置、终端及存储介质
CN116558360A (zh) 基于运动载具的射击模拟训练方法及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20954564

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20954564

Country of ref document: EP

Kind code of ref document: A1