WO2008116982A2 - Method for recognizing objects in a shooting game for remote-controlled toys - Google Patents
- Publication number
- WO2008116982A2 (PCT/FR2008/000180)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- recognition
- video
- video image
- virtual
- Prior art date
Classifications
- A—HUMAN NECESSITIES; A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/10
- A63F13/12
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements characterised by their sensors, purposes or types
- A63F13/211—Input arrangements using inertial sensors, e.g. accelerometers or gyroscopes
- A63F13/213—Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/30—Interconnection arrangements between game servers and game devices; between game devices; between game servers
- A63F13/45—Controlling the progress of the video game
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Additional visual information using indicators, e.g. showing the condition of a game character on screen
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/573—Simulating using trajectories of game objects, e.g. of a golf ball according to the point of impact
- A63F13/60—Generating or modifying game content before or while executing the game program
- A63F13/65—Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
- A63F13/837—Shooting of targets
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions
- A63F2300/30—Output arrangements for receiving control signals generated by the game device
- A63F2300/303—Output arrangements for displaying additional data, e.g. simulating a Head Up Display
- A63F2300/306—Output arrangements for displaying a marker associated to an object or location in the game field
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
- A63F2300/80—Features specially adapted for executing a specific type of game
- A63F2300/8017—Driving on land or water; Flying
- A63F2300/8076—Shooting
Definitions
- the invention relates to an object recognition method for a video game system.
- Such a system is known from WO 01/95988 A1.
- This document describes a hunting video game involving two remotely piloted vehicles with on-board video cameras.
- One of the two remote-controlled vehicles is the hunter and the other is the prey.
- the video images of the video camera of the hunter vehicle are transmitted to a control unit and are displayed there.
- the video images delivered by the hunter vehicle are scanned to detect the image of the opposing vehicle. If the opposing vehicle is detected in the video image, the opposing vehicle in the image is replaced by a virtual game character.
- the player driving the hunter vehicle using the control unit thus sees on his video image not the image of the enemy vehicle but that of a virtual game character which he must pursue with his own vehicle.
- the object recognition method known from WO 01/95988 A1 is however not applicable to a shooting game.
- the object of the present invention is therefore to provide an object recognition method for a shooting video game.
- This object is achieved according to the invention by an object recognition method for a video game system, the system comprising:
- a first remote-controlled vehicle comprising an on-board video camera
- an electronic video display entity for remotely controlling the first vehicle, the method comprising the following steps:
- the electronic entity of the system is preferably a portable console with a video screen.
- With the method of the invention it is possible to create a video shooting game with a plurality of real remote-controlled toys. The participants, each piloting a remotely operated vehicle, may use a real environment or scenery in which the remotely operated vehicles operate. In particular, players can use real obstacles to try to protect their remote-controlled vehicle from other players' shots. According to the invention, the shots fired by the vehicles are only fictitious and are simulated by the game system.
- the invention therefore allows an innovative combination between a fully virtual classic video game and an entirely real classic remote-controlled vehicle game. Thanks to the invention, the players controlling the remote-controlled vehicles can use elements of the real scenery as elements of the game.
- the recognition of the second vehicle is carried out by recognition of distinctive elements arranged on the second vehicle, namely light-emitting diodes.
- the light-emitting diodes are arranged in the form of a specific geometry.
- the light-emitting diodes may also flash at a predetermined frequency, have specific colors, and / or have variable colors in time, to facilitate the recognition of the second vehicle.
- the recognition of the light-emitting diodes may also include the measurement of their total luminosity in order to estimate the visible portion of the second vehicle from the value of the measured brightness.
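The brightness-based visibility estimate described above can be sketched as follows. This is a minimal illustration, not taken from the patent: the function name and the assumption that measured brightness scales roughly linearly with the unoccluded LED area are ours.

```python
# Hypothetical sketch: estimate the visible fraction of the target vehicle
# from the measured total brightness of its light-emitting diodes,
# assuming (illustratively) that brightness scales linearly with the
# unoccluded LED area.

def visible_fraction(measured_brightness, full_brightness):
    """Return an estimate in [0, 1] of how much of the target is visible."""
    if full_brightness <= 0:
        raise ValueError("full_brightness must be positive")
    return max(0.0, min(1.0, measured_brightness / full_brightness))
```

A reading of half the nominal brightness would then suggest the vehicle is about half hidden behind an obstacle.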
- the method of the invention further uses speed and / or acceleration and / or position sensors on the target vehicle and / or the shooter vehicle. The sensors allow the two machines to know in real time their coordinates in space. These can then be transmitted by radio means.
- the method according to the invention further comprises a step of predicting the displacement of the second vehicle in the video image on the basis of: a measurement of the displacement of the first vehicle, and / or the prior displacement of the second vehicle in the video image.
- the two preceding features namely the use of sensors embedded on the vehicles and the prediction of the displacement of the second vehicle in the video image, can be coupled.
- software merges the position information to obtain relevant information free of the drifts that may affect the inertial units on board each of the two vehicles.
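One way to picture this merging is a complementary filter: a sketch under the simplifying assumption that the inertial estimate is accurate short-term but drifts, while the video-based estimate is drift-free but noisier. The function name and weighting are illustrative, not from the patent.

```python
# Illustrative complementary filter: trust the on-board inertial estimate
# in the short term, and correct its slow drift with the drift-free
# video/recognition-based position estimate.

def fuse(inertial_estimate, video_estimate, alpha=0.98):
    """Blend two position tuples; alpha weights the inertial estimate."""
    return tuple(alpha * i + (1.0 - alpha) * v
                 for i, v in zip(inertial_estimate, video_estimate))
```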
- the vehicles estimate their position precisely. If an enemy vehicle suddenly disappears from its estimated position, it is most likely hidden behind a real obstacle.
- FIG. 1a shows the operation of a first fictitious event simulation system according to the invention
- Figure 1b shows a second system for adding imaginary objects, for example obstacles
- Figure 1c shows imaginary elements added by the game on the display of the game console
- FIG. 1d shows a third system according to the invention with on-board automatic piloting
- FIG. 2a shows the operation of a second simulation system according to the invention completed by an autopilot
- Figure 2b shows the complete system: fictitious events, imaginary objects and autopilot interact in a very complete game.
- Figure 3 shows two remotely operated vehicles and their associated remote controls;
- Figures 4a and 4b show the video display of a first shooting game according to the invention;
- FIG. 5 shows the real environment of a second shooting game according to the invention
- Figures 6, 6a, 6b and 6c show different video images from the second shooting game according to the invention
- Figures 7a and 7b show the video image of the second shooting game when a player executes a fictitious shot
- FIG. 8 is a flowchart of the object recognition method for validation or invalidation of the imaginary shots made during the shooting games.
- FIG. 9 is a flowchart for validating or invalidating a parabolic shot when the target is hidden by a real obstacle.
- FIG. 1a shows the concept of the invention, namely that of a video game coupled to a real radio-controlled toy
- a simulator 1 supervises the operation of a radio controlled toy 3.
- the simulator 1 of the video game modifies the player 7's instructions 5, which are intended to control the toy 3.
- the toy 3 receives steering instructions 9 from the simulator 1. These instructions 9 are generated by the simulator 1 taking the control instructions 5 into account.
- the toy 3 depends not only on the instructions 9 received but also on physical events external to the game.
- Sensors 13 arranged on the toy 3 transmit information on the environment of the toy 3 to viewing means 15 of the video game. The information from the sensors 13 allows the video game system to estimate the state changes of the toy 3 in its real environment.
- the display means 15 uses the information of the sensors 13 to generate a display on a screen 21 of a control unit 23 manipulated by the player 7.
- the sensors 13 comprise in particular a video camera 25 which is embedded on the remote controlled toy 3.
- This video camera 25 delivers video images which are displayed by the display means 15 on the screen 21 of the player 7.
- the video camera 25 therefore gives the player 7 the perspective as "perceived" by the remote-controlled vehicle 3.
- the toy can also be provided with other additional sensors. These can be very simple sensors such as accelerometers or extremely sophisticated sensors such as an inertial unit.
- the video game also handles visualization. For example, with a gyroscope and/or accelerometers and visualization software, the video game can reconstruct an artificial horizon if the remotely operated vehicle is an unmanned aircraft. The role and operation of the simulator 1 will now be described in detail.
- the simulator 1 is located between the player 7 and the radio-controlled toy 3. It receives the piloting instructions 5 issued by the player 7. These actions or control commands 5 represent the modifications that the player 7 wishes to impart to the propulsion elements (such as the motor of the toy 3) and/or guidance elements (such as the control surfaces of the toy 3) in order to, for example, direct the toy 3 in a certain direction.
- the pilot's actions 5 are not directly transmitted as such to the remote-controlled toy.
- the toy 3 is decoupled from the control of the player 7 through the simulator 1.
- it is the simulator 1 that directly controls the toy 3 by issuing the instructions 9.
- These instructions 9 are created by the simulator 1 taking the instructions 5 into account.
- the simulator 1 generates the instructions 9 not only with the instructions 5, but especially also through "piloting characteristics" which are auto-generated by the simulator 1.
- These driving characteristics are created according to the video game chosen by the player 7.
- the simulator 1 simulates new events not present in the physical world that have an impact on the toy 3.
- These fictional events are "translated" into driving characteristics that change the instructions 9 sent to the toy 3, so that the behavior of the toy 3 is changed.
- the simulator 1 can simulate a failure of the motor of the toy 3. If the toy 3 is an airplane, the simulator 1 can artificially make the aircraft "heavier" by creating driving characteristics that give the player 7 the impression that the aircraft 3 reacts more slowly to his commands 5 than usual.
- the simulator 1 can also create complete fictional scenarios such as the passage of a thunderstorm by the aircraft 3.
- the simulator 1 generates driving characteristics which lead to instructions 9 that have the effect of shaking the remotely controlled aircraft 3 as if it were flying through gusts of wind.
- the simulator 1 intervenes in a complex and real-time manner in the piloting of the toy 3 to provide the player 7 with a very rich gaming experience. It is not simply a matter of monitoring the player's control in order to intervene in case of mistakes or danger.
- the simulator 1 exerts an active and playful influence on the driving of the toy 3 to give it more varied and more entertaining behavior.
- the main difference from a typical video game running entirely on a computer, without a real remote-controlled vehicle, is that the change of state is not driven solely by the simulation: it results from commands applied in open loop, and this change of state is measured by the sensors.
- Figure 1b is a more complete version of the system of the invention.
- the simulator adds imaginary elements to the instructions in addition to the simulated events. These imaginary elements may for example be obstacles or, more interestingly, virtual objects endowed with a behavior, for example virtual enemies.
- the imaginary elements can also be virtual elements of the radio-controlled toy itself, such as a weapon system with a viewfinder and a virtual weapon that sends virtual projectiles.
- two additional feedback loops are added to the system.
- the information from the toy's sensors is used by the simulator to estimate the position of the radio-controlled toy, for example, to determine whether a virtual enemy's shot hit the toy.
- the second feedback loop is defined between the simulator and the display software.
- the display software is informed of the movement of the virtual objects so as to be able to perform a composite display. For example, adding on the video image virtual elements: obstacles, virtual enemies or elements of the firing system as the element 43 of Figure 4a.
- Figure 1c is an example of augmented reality.
- the image is composed of reality augmented by imaginary objects.
- Figure 1d shows a loop when there is an automatic pilot.
- a feedback loop is performed on the drone itself. The same sensors as used for the display loop are used.
- Figure 2a is an even more complete version of the system of the invention.
- An autopilot 27 is added to the system. It makes it possible to servo-control the operation of the aircraft 3. With the autopilot 27, the aircraft 3 can be made more stable and predictable in its behavior. Thus, the sources of interaction between the video game 29 and the toy 3 are more numerous. In the case of a system with an autopilot 27, the toy 3 can be described as a drone since it has the ability to move autonomously in the real scenery without the need for piloting on the part of the player 7.
- the autopilot 27 has a command envelope. If the vehicle 3 is a tank, the envelope can for example define its maximum speed, its maximum acceleration, its turning speed, etc. If the machine 3 is a quadricopter, the control envelope of the autopilot 27 can define its maximum climb speed, its maximum angular velocities as well as the transition between hovering and forward flight. The control envelope of the autopilot 27 is therefore a set of data defining the motion constraints of which the vehicle 3 is capable; it thus limits the capabilities of the vehicle 3. By manipulating the envelope of the autopilot, the video game can simulate several physical quantities.
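As a rough sketch, the command envelope can be modeled as a set of limits that the autopilot clamps every command against; tightening the envelope then simulates, say, a heavier vehicle. The class and method names below are our own illustration, not from the patent.

```python
# Illustrative command envelope: a set of motion limits the autopilot
# enforces on every command. Shrinking the limits simulates a heavier,
# more sluggish vehicle without touching the real hardware.

class CommandEnvelope:
    def __init__(self, max_speed, max_acceleration):
        self.max_speed = max_speed
        self.max_acceleration = max_acceleration

    def clamp_speed(self, requested):
        """Bound a requested speed to what the envelope allows."""
        return max(-self.max_speed, min(self.max_speed, requested))

    def scaled(self, factor):
        """Return a tightened envelope, e.g. to simulate extra weight."""
        return CommandEnvelope(self.max_speed * factor,
                               self.max_acceleration * factor)
```

Halving the envelope at the start of a mission and restoring it gradually would reproduce the fuel-burn scenario described below it in the text.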
- the simulator 1 can create very varied fictional scenarios. For example, at the beginning of the game sequence, the simulator 1 can simulate a heavier vehicle 3 because it is filled with fuel. As the game develops, the simulator 1 will simulate a lightening of the vehicle 3. Alternatively, the simulator 1 may simulate a mission of depositing fictitious equipment carried by the vehicle 3. Again, the simulator 1 will generate driving characteristics leading to instructions 9 that give the player 7 the impression that the weight of the vehicle 3 changes during the game as the fictitious equipment is deposited.
- in our example, the instruction from the simulator to the autopilot is to virtually modify the weight of the drone.
- the driving characteristics can impose a low maximum speed and a low acceleration on the toy 3.
- the player 7 will thus not be able to bring his toy 3 to a high speed, even if his control commands 5 request it.
- as the driving characteristics change, the speed and acceleration limits of which the toy 3 is capable gradually increase.
- the player will then be able to reach higher speeds with his toy 3. In this way, the player really has the feeling of piloting a toy that lightens over time, even though this is only simulated.
- the autopilot controls can be carried out directly by the simulator in a "remote autopilot" mode.
- the simulator sends higher-level instructions to the autopilot, such as fictitious damage, a modification of the weight of the vehicle, or an instruction to make an emergency landing following virtual events in the video game.
- the simulator 1 can modify the instructions of the player 7 in a certain way, in particular by superimposing on them new events of the same class as the instructions. This superposition can be an addition, a subtraction, a division, a multiplication, the setting of bounds, etc. Indeed, the superposition can be carried out by any combination of arithmetic and/or logical operations.
- Such a superposition, for example by addition or subtraction of signals generated by the simulator 1 to the control signals 5 given by the player 7, is very useful for simulating events intended to bias the steering of the vehicle 3.
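A minimal sketch of such a superposition follows; the normalized command range of [-1, 1] and the function name are our assumptions for illustration.

```python
# Illustrative superposition: add a simulator-generated bias (e.g. a
# damage-induced drift or simulated wind) to the player's steering
# command, then bound the result to the valid command range.

def superimpose(player_command, bias, lo=-1.0, hi=1.0):
    """Combine player command and simulator bias, then apply bounds."""
    return max(lo, min(hi, player_command + bias))
```

A small constant bias reproduces a steady drift the player must steer against; a time-varying bias reproduces wind or gusts.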
- if the vehicle 3 is a tank, it is possible, thanks to the superposition, to simulate that the tank has been hit by an opponent. The hit is simulated as non-fatal but as damaging the tank.
- the simulator 1 will thus superimpose on the player's actions signals that simulate a tendency of the tank to turn slowly on itself. Player 7 must then adapt his control to compensate for this superimposed bias. In this case, the player 7 will have to steer the tank in the opposite direction to compensate for the bias created by the simulator 1.
- the steering can be made more complex by simulating wind, adding a drift component.
- Even more complex behavior can be simulated, such as gusts of wind.
- the player must then compensate for these events that follow one another.
- the video game can also take full control of the toy 3, either temporarily or permanently. For example, if the vehicle 3 is a tank, it can be simulated that it has taken a hit.
- the simulator 1 then takes exclusive control, spins the vehicle on itself and shakes it. The commands are then returned to the player.
- if the vehicle receives a fatal hit, this is simulated by the vehicle making a final lurch before stopping permanently. The game is then over.
- Figure 2b shows a complete simulation system where feedback loops combine at three different levels to create a complete augmented reality game.
- the first device is the one that transforms the player's commands into instructions, for example to add imaginary events.
- the first feedback loop is where virtual objects are added to the system by the simulator.
- the display software then combines the sensor measurements and the information from the simulator to produce a composite display of the actual image augmented with the virtual objects.
- the second feedback loop is where the sensor measurements of the radio-controlled toy are used by the simulator. They allow it to simulate the virtual objects, for example to check whether the toy hits a virtual obstacle, or to inform a virtual enemy that seeks to pursue the radio-controlled toy.
- FIG. 3 shows an implementation of the system of FIG. 1 or 2 in the form of a first remote-controlled vehicle 31 and a second remote-controlled vehicle 33 with their associated remotes 35 and 37.
- the remote-controlled vehicles 31 and 33 are in the form of tanks.
- the remote controls 35 and 37 are in the form of portable consoles with a video display 39 and control buttons 41.
- Each of the tanks 31 and 33 has an on-board video camera. The video image delivered by the onboard camera is sent to the corresponding portable console 35, 37 and is displayed on the screen 39.
- a user controls his tank 31, 33 with the controls 41 of the game console 35, 37.
- Communication between the game consoles and the tanks, as well as between the tanks themselves, is preferably via the Bluetooth or WiFi protocols (registered trademarks).
- the tanks 31 and 33 as well as the consoles 35 and 37 can be used by two players to engage in a shooting game, representations of which are given in Figures 4a and 4b.
- FIG. 4a is an example of a video image delivered by one of the two video cameras of one of the tanks.
- the video image 47 is retransmitted from the tank and presented to the tank driver on the screen 39 of his game console.
- Various virtual elements are embedded in the video image, such as the cross-hairs of the virtual reticle 43 and a virtual scale 49 giving an indication of the elevation of the tank's barrel.
- the player moves the virtual reticle 43 on the video image 47 until it is superimposed with the image 45 of the opponent as shown in Figure 4b.
- Once the player has targeted his opponent, he need only trigger a shot.
- the video game then simulates the path of a fictitious shell, in particular its speed, its parabolic trajectory and its angle of impact, parameters that are all estimated by the video game.
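The parabolic trajectory of the fictitious shell can be sketched with elementary drag-free ballistics. This is only an illustration of the kind of estimate involved; the patent does not specify the ballistic model.

```python
import math

# Illustrative drag-free ballistics for the fictitious shell: position
# (x, y) after t seconds, fired at speed v0 (m/s) and elevation angle
# angle_deg (degrees), under gravity g.

def shell_position(t, v0, angle_deg, g=9.81):
    a = math.radians(angle_deg)
    x = v0 * math.cos(a) * t                      # horizontal travel
    y = v0 * math.sin(a) * t - 0.5 * g * t * t    # height above muzzle
    return x, y
```

Sampling this function over t yields the parabolic path, and the velocity direction at the target range gives the angle of impact.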
- in the second shooting game, the game is played not between two remote-controlled tanks but between a toy in the form of an anti-aircraft vehicle 51 with an on-board video camera 25 and a toy in the form of a quadrocopter 53.
- Figure 5 shows an overview of a real environment 55 with, for example, a real tree 57 as an obstacle.
- the vehicle 51 and the quadrocopter 53 move in the real scenery 55 according to the piloting instructions.
- One player controls the anti-aircraft vehicle 51 and the other player the quadrocopter 53.
- the pilot of the anti-aircraft vehicle 51 will try to virtually shoot down the quadricopter 53 piloted by the other player.
- FIGS. 6a to 6c show the point of view delivered to the driver of the vehicle 51 by the video camera 25 on board this vehicle.
- Figures 6a to 6c show a real scenery different from that of Figure 5.
- the real environment includes two houses and several trees.
- the quadricopter 53 is perfectly visible.
- the quadricopter 53 has moved and is partially masked by vegetation.
- the quadrocopter has turned around and is hiding behind one of the houses.
- the game system according to the invention is capable of judging whether the quadricopter 53 is visible, partially visible or hidden in the video images as shown in FIGS. 6a to 6c.
- This recognition procedure is detailed in the flowchart of FIG. 8. Thanks to this recognition of objects in the image, the game system is able to validate a fictitious shot towards the quadrocopter 53 if it is recognized as visible in the video image (and correctly targeted). If the quadcopter 53 is recognized as hidden, a shot in its direction is invalidated.
- FIGS. 7a and 7b show two video images as delivered by the camera 25 on the anti-aircraft vehicle 51 of FIG.
- FIGS. 7a and 7b detail the recognition procedure of the quadricopter 53 in the video image and the positioning of a virtual reticle 43 for fictitious shooting.
- the driver of the vehicle 51 moves a virtual reticle 43 representing his weapon on the video image to aim his opponent, the quadrocopter 53.
- the player moves his virtual reticle 43 until it is superimposed with the image of the quadrocopter 53 as shown in Figure 7b.
- the player then triggers his shot.
- FIG. 8 details the object recognition and shot validation procedure used by the video game system.
- the procedure starts, step 100, by displaying the video image delivered by the video camera 25 on the screen of the shooter's game console.
- the video game system inserts a virtual reticle 43 into the displayed video image.
- the shooter can now move this virtual reticle 43 to aim and then trigger the shot.
- the triggering of the shot is detected in step 102.
- the system acquires the instantaneous position A of the virtual reticle 43 on the video image, step 103.
- an image recognition software scans the video image to recognize the presence of the opposing vehicle 53 in the video image, step 104.
- The recognition of the opponent in the video image can be carried out in different ways.
- the recognition software can simply scan the video image trying to find the known form of the opposing vehicle 53.
- Alternatively, the opposing vehicle 53 has recognition elements arranged on its surface. These may for example be reflective elements.
- the opposing vehicle 53 may also carry several flashing light-emitting diodes (LEDs) arranged around it. These light-emitting diodes have a known geometry, flash at a known frequency and have known colors.
- the recognition software can thus detect the opposing vehicle 53 more easily in the video image. It is also conceivable to use multi-color light-emitting diodes that change between green, red and orange to facilitate recognition.
- the diodes can also emit in the infrared. In this way, there are no more elements visible to the human eye in the game system.
- the image recognition software can be embedded either in the remote-controlled vehicle 51 or in the control unit, i.e., the game console.
- the recognition software can include an algorithm for tracking the recognition elements of the opposing vehicle 53. Image after image, this algorithm measures the displacement of the recognition elements. To do this, it can rely on the movement of its own vehicle 51, which allows it to predict where in the image the opposing vehicle 53 should be; adding the previously measured movement of the opposing vehicle 53 lets the software refine the predicted position.
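The frame-to-frame prediction just described can be reduced to a short sketch: the next expected image position is the last measured one, shifted by the opponent's measured motion and corrected for the apparent shift caused by the firing vehicle's own movement. Function and argument names are assumptions for illustration only.

```python
def predict_position(prev_pos, opponent_velocity, own_camera_shift):
    """Predict the opponent's next image position.

    prev_pos:          last measured (x, y) of the opponent in the image
    opponent_velocity: its measured image-to-image displacement (dx, dy)
    own_camera_shift:  apparent image shift caused by our own motion (dx, dy)
    """
    px, py = prev_pos
    vx, vy = opponent_velocity
    cx, cy = own_camera_shift
    # Opponent motion moves it forward; our own camera motion shifts the
    # whole scene the other way, so it is subtracted.
    return (px + vx - cx, py + vy - cy)
```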
- the sensors 13 on each of the radio-controlled toys are used. Their information is sent not only to the game console controlling that machine but also to all the other game consoles.
- the simulator knows the estimated position of the enemy vehicle.
- if the video recognition means cannot find the enemy vehicle in the image even though its estimated position lies in the field of view, the system can check whether a real object stands between the two vehicles. This principle is very interesting in the context of video games: it makes it easy to take real obstacles into account. In this way players can, as shown in Figure 7b, hide behind a tree.
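The obstacle inference above amounts to flagging frames where the opponent should be visible but is not detected. A minimal sketch, with an assumed debounce over consecutive frames to avoid false positives from a single missed detection:

```python
def obstacle_suspected(predicted_in_view, detected, min_misses=3):
    """Flag a real obstacle after `min_misses` consecutive frames in which
    the simulator predicts the opponent inside the field of view
    (predicted_in_view[i] is True) but video recognition fails
    (detected[i] is False). min_misses is an illustrative assumption."""
    misses = 0
    for in_view, seen in zip(predicted_in_view, detected):
        if in_view and not seen:
            misses += 1
            if misses >= min_misses:
                return True  # opponent likely hidden behind a real object
        else:
            misses = 0  # any successful detection resets the count
    return False
```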
- the recognition step 104 having been performed as described, the procedure continues according to the result of the recognition.
- if the recognition is negative, the shot is invalidated, step 105.
- if the recognition is positive, the next step 106 is to acquire the instantaneous position B of the opposing vehicle 53 in the video image. Position B is then compared with position A. If the positions are identical, the shot is validated, step 108, that is to say the fictitious shot has reached its target. If the positions are different, the shot is invalidated, step 109. To make the object recognition even more effective, the opposing vehicle 53 can further transmit its instantaneous position to the firing vehicle 51.
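The validation flow of steps 104–109 can be condensed into one function. This is a sketch under assumptions: the patent compares positions for identity, while here an explicit pixel tolerance stands in for the practical meaning of "identical" on a video image.

```python
def validate_shot(recognized, pos_a, pos_b, tolerance=10.0):
    """Return True when the fictitious shot hits.

    recognized: result of the image recognition (step 104)
    pos_a:      reticle position A at trigger time (step 103)
    pos_b:      opposing vehicle position B in the image (step 106)
    tolerance:  assumed pixel radius within which A and B count as identical
    """
    if not recognized:
        return False  # step 105: opponent not found, shot invalidated
    ax, ay = pos_a
    bx, by = pos_b
    distance = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
    return distance <= tolerance  # step 108 if True, step 109 otherwise
```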
- the vehicles have sensors and in particular an inertial unit.
- the vehicles transmit their position to each other by radio.
- when the opposing vehicle enters the field of view of the video camera, its radio-transmitted position is already known. It is used as the initial position by the detection algorithms, which makes them converge more easily and faster. The sensor-based method and the video displacement prediction can be coupled.
- software merges the position information to obtain relevant data free of the drift of the inertial unit.
- the vehicles thus estimate their position finely. If the detection algorithm indicates that the vehicle has suddenly disappeared from its estimated position, there is very likely a real obstacle between the two vehicles.
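One common way to merge a drifting inertial estimate with an absolute fix (radio or video) is a complementary filter; the patent does not name a specific method, so this is only an illustrative sketch, shown in one dimension with an assumed blending weight.

```python
def fuse(inertial_estimate, absolute_fix, alpha=0.9):
    """Blend a smooth but drifting inertial position estimate with an
    absolute (drift-free but noisier/slower) radio or video fix.

    alpha close to 1 trusts the inertial unit over short horizons while
    still pulling the estimate toward the absolute fix, which cancels
    long-term drift. alpha is an illustrative assumption."""
    return alpha * inertial_estimate + (1.0 - alpha) * absolute_fix
```

Applied every frame, the drift term decays geometrically: even if the inertial unit has wandered, repeated blending with the absolute fix drags the fused estimate back.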
- the game system knows the position of the opposing vehicle 53 in the image.
- the motion recognition and prediction software tracks the movement of the opponent to the point of virtual impact.
- the simulation software can simulate complex shooting parameters: for example, the initial velocity of a shell can be very high at firing and then decrease rapidly.
- the software can also simulate complex ammunition behaviors, such as a projectile bursting near the target, or guided and self-guided missiles. This type of projectile simulation makes the game even more interesting: an opponent can see an enemy projectile being fired and, in a very short time, maneuver his craft to avoid it. It also makes it possible to simulate the extremely complex situations of a shooting game between fighter planes, in which the trajectory of the projectiles depends on that of the firing plane, and the position of the point of impact depends on the position of the opposing aircraft several seconds after firing.
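A virtual projectile of the kind described — fast at firing, then rapidly slowing — can be sketched as a fixed-step integration with a simple drag term. Units, the drag coefficient, and the time cap are illustrative assumptions, not values from the patent.

```python
def simulate_projectile(pos, velocity, drag=0.5, dt=0.02, max_time=3.0):
    """Yield successive (t, x, y) states of a virtual projectile.

    The velocity decays each step (speed decreases rapidly after firing),
    and the simulation duration is capped, as in the described system."""
    x, y = pos
    vx, vy = velocity
    t = 0.0
    while t < max_time:
        x += vx * dt
        y += vy * dt
        vx -= drag * vx * dt  # simple linear drag
        vy -= drag * vy * dt
        t += dt
        yield (t, x, y)
```

Each yielded state is a candidate position C to compare against the opponent's position B in the indirect-fire check of FIG. 9; guided munitions would simply steer `vx, vy` toward the target each step instead of letting them decay freely.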
- Figure 9 shows the flow chart of indirect fire.
- Position C is the virtual position calculated by the projectile simulator.
- the object of the algorithm is to verify that the position of the virtual projectile is identical to that of the opposing apparatus at position B.
- the position A is that of the reticle, which serves only to define the initial firing conditions of the projectile.
- the position C of the projectile evolves over time; the simulation duration is limited by the software.
- the object recognition procedure can also perform a partial recognition, that is, estimate whether the object is partially hidden.
- This partial recognition can be done by measuring the variation of the brightness of the light-emitting diodes from one image to another.
- the system may assume that when the brightness drops by half, half of the light-emitting diodes are masked, which means that the opposing vehicle is half-hidden.
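The brightness-ratio estimate above reduces to one line: the visible fraction of the opponent is the current total LED brightness divided by a reference taken when it was fully visible. A minimal sketch, with the function name and clamping behavior as assumptions:

```python
def visible_fraction(current_brightness, reference_brightness):
    """Estimate what fraction of the opposing vehicle is visible, from the
    ratio of current LED brightness to the fully-visible reference.

    A drop to half the reference brightness is read as half the diodes
    being masked, i.e. a half-hidden vehicle. Result is clamped to [0, 1]."""
    if reference_brightness <= 0:
        return 0.0  # no valid reference: treat the vehicle as not visible
    return max(0.0, min(1.0, current_brightness / reference_brightness))
```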
- the invention is not limited to a one- or two-player game. Indeed, the games described can be played with more than two players and more than two remote-controlled toys.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP08761879A EP2125127A2 (fr) | 2007-02-13 | 2008-02-13 | Procédé de reconnaissance d'objets dans un jeu de tir pour jouets télécommandés |
US12/526,933 US20100178966A1 (en) | 2007-02-13 | 2008-02-13 | A method of recognizing objects in a shooter game for remote-controlled toys |
JP2009549841A JP2010518354A (ja) | 2007-02-13 | 2008-02-13 | 遠隔操作玩具用のシューティング・ゲームにおいて対象を認識するための方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0700998A FR2912318B1 (fr) | 2007-02-13 | 2007-02-13 | Reconnaissance d'objets dans un jeu de tir pour jouets telecommandes |
FR0700998 | 2007-02-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2008116982A2 true WO2008116982A2 (fr) | 2008-10-02 |
WO2008116982A3 WO2008116982A3 (fr) | 2008-12-24 |
Family
ID=38038896
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2008/000180 WO2008116982A2 (fr) | 2007-02-13 | 2008-02-13 | Procédé de reconnaissance d'objets dans un jeu de tir pour jouets télécommandés |
Country Status (5)
Country | Link |
---|---|
US (1) | US20100178966A1 (ja) |
EP (1) | EP2125127A2 (ja) |
JP (1) | JP2010518354A (ja) |
FR (1) | FR2912318B1 (ja) |
WO (1) | WO2008116982A2 (ja) |
Also Published As
Publication number | Publication date |
---|---|
JP2010518354A (ja) | 2010-05-27 |
WO2008116982A3 (fr) | 2008-12-24 |
FR2912318A1 (fr) | 2008-08-15 |
FR2912318B1 (fr) | 2016-12-30 |
EP2125127A2 (fr) | 2009-12-02 |
US20100178966A1 (en) | 2010-07-15 |