WO2018092947A1 - Smart robot game system - Google Patents

Smart robot game system

Info

Publication number
WO2018092947A1
Authority
WO
WIPO (PCT)
Prior art keywords
smart
smart robot
battlefield
user
module
Prior art date
Application number
PCT/KR2016/013322
Other languages
English (en)
Korean (ko)
Inventor
유환수
Original Assignee
주식회사 트라이캐치미디어
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 트라이캐치미디어 filed Critical 주식회사 트라이캐치미디어
Priority to PCT/KR2016/013322 priority Critical patent/WO2018092947A1/fr
Publication of WO2018092947A1 publication Critical patent/WO2018092947A1/fr

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor, automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F9/00 Games not otherwise provided for
    • A63F9/14 Racing games, traffic games, or obstacle games characterised by figures moved by action of the players
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F9/00 Games not otherwise provided for
    • A63F9/24 Electric games; Games using electronic circuits not otherwise provided for
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to a smart robot game system, and more particularly to a system that uses a smart robot and an OID (Optical Identification Device) to create a virtual-reality environment synchronized with a real battlefield, and that can run a simulation game in that environment.
  • Drone bombers used by the US military are wirelessly controlled and carry out missions, detecting or bombing enemy lines, either by executing pre-programmed commands or by acting on information received in real time.
  • Virtual simulation programming is also used to acquire precise control skills for drones and devices such as wireless mini helicopters.
  • Advances in computer systems and computer graphics for realizing virtual worlds, together with BCI and HMD technologies that can analyze a user's emotions and stimulate cognitive vision, now provide diverse virtual-reality environments and more realistic access to virtual worlds.
  • The present invention has been proposed to solve the above-described problems, and its object is to provide a smart robot game system that presents a virtual-reality battlefield in which a user operates a toy tank directly, as if operating a real tank.
  • To this end, a smart robot game system according to the present invention includes: a smart robot platform, a tank-shaped smart robot that moves under the user's control and transmits information about its actual position using an optical code;
  • a user device based on a control interface for manipulating the smart robot and a view interface showing the virtual-reality battlefield; and
  • a battlefield composed of an OID-based mat-type terrain on which the smart robot moves, synchronized with a virtual-reality battlefield overlaid with 3D objects, and displayed to the user according to the smart robot's movement.
  • The smart robot platform includes a communication unit providing message control, location-information management, and image-encoding functions; a control unit handling vehicle-body control, turret control, and movement; and a sensor unit including an infrared module, a position-recognition module, and a camera module.
  • The infrared module comprises an IR laser transmitter that transmits an IR laser and uses an optical code to send information including the type of shell, and an IR laser receiver that receives an IR laser.
  • The controller may analyze the optical code received through the IR laser receiver to determine whether the robot has been hit and to set the state of the smart robot.
  • The position-recognition module scans optical codes on a mat printed with them. Each optical code consists of a 5×5 braille-like dot pattern whose three square vertices carry a template for correcting the orientation of the code, and the control unit can measure the position of the smart robot by applying a Kalman filter to the position values measured from the codes.
  • The user device may include a control unit providing message control, location-information management, and video-decoding functions;
  • a user interface comprising a user-interface object module placed in the virtual reality, containing movable objects, and a user-interface object module placed in a fixed area, containing fixed objects; and
  • a game system that handles game rules, event processing, and effects.
  • The battlefield may include: a real battlefield environment with indoor GPS, terrain tiles, and feature objects for locating the smart robot and reflecting its position in the game; and a virtual battlefield environment with a map-object template module holding templates for various terrains and features, a virtual-battlefield object module containing the objects that represent each offline map-object template in the virtual environment, a 3D map editor for editing virtual features, and an effect library containing a library of effects.
  • Accordingly, a virtual-reality battle game system that increases the user's immersion can be provided.
  • In addition, by using the OID for position synchronization between the virtual and real battlefields, and by combining a template-based optical-code recognition technique with Kalman-filter position correction, the position of the smart robot can be estimated accurately.
  • FIG. 1 is an overall configuration diagram of a smart robot game system according to an embodiment of the present invention.
  • FIG. 2 is an example of implementing a smart toy employed in a smart robot game system according to an embodiment of the present invention in the form of a tank.
  • FIG. 3 is a view showing the configuration of the IR laser transmitter and the IR laser receiver shown in FIG. 2.
  • FIG. 4 is an internal configuration diagram of a smart toy robot system of a tank type employed in the smart robot game system according to an embodiment of the present invention.
  • FIGS. 5 to 7 are examples of user devices usable in the smart robot game system according to the embodiment of the present invention.
  • FIG. 8 illustrates an implementation of a software type controller.
  • FIG. 9 is a diagram showing the structure (architecture) of a user interface.
  • FIG. 10 is a diagram illustrating a configuration of a game system applied to a user device.
  • FIG. 11 is a diagram illustrating the form of an actual battlefield.
  • FIG. 12 is a diagram illustrating the design of a virtual-reality battlefield and a 3D template of a map object.
  • FIG. 13 is a diagram employed for explaining synchronization between a virtual battlefield and a real battlefield.
  • FIG. 14 is a diagram for explaining the configuration of a server.
  • FIG. 15 is a diagram for describing position correction on a rotation transformation based on an optical code and a template.
  • FIG. 16 is a diagram illustrating an optical mat.
  • The smart robot game system includes a smart robot platform, a tank-type smart robot that moves under the user's control and transmits information about its actual position using an optical code;
  • a user device based on a control interface for manipulating the smart robot and a view interface showing the virtual-reality battlefield; and
  • a battlefield composed of an OID-based mat-type terrain on which the smart robot moves, synchronized with a virtual-reality battlefield overlaid with 3D objects, and displayed to the user according to the smart robot's movement.
  • Virtual reality refers to an environment or technology that resembles reality but is created artificially rather than being the real world.
  • Virtual reality must connect humans and computers so that the user feels present in the virtual space itself, beyond merely sensing a manipulated reality.
  • To this end, a space composed of 3D graphics, a display capable of rendering it, and sensors for measuring the body's movement are required.
  • Virtual reality perceives the user's actions, including the human senses, generates a virtual world from them in real time by computer, and renders that world as if it were reality.
  • The virtual world is not static: objects in it can move, interact with each other, and be affected by external actions.
  • Immersion is one of the main characteristics of virtual reality: it stimulates all the sensory organs of the human body so that the user is absorbed into the artificially created world and feels present there.
  • Because the objects perceived in the virtual world are created from the user's actions and sensations, the user is both a perceiver and a creator.
  • The computer's real-time response to user behavior is one of the important elements of simulation and other virtual-reality systems.
  • Most importantly, in virtual reality the user is the center of the simulation and interacts with the virtual world created by the computer.
  • Such a virtual world can add immersion and interaction effects by using peripheral devices such as a head-mounted display (HMD).
  • One of the important elements in building a virtual-reality system is an interface that enables effective communication between humans and computers.
  • Conventional communication between a user and a computer is based on information the user inputs, but a virtual-reality system also requires a cognitive system that can recognize or detect the user's behavior and analyze and interpret it.
  • A smart toy refers to a toy that has artificial intelligence or that connects to a network to interact with the user; it may be connected to a computer, or it may use computer technology by incorporating sophisticated sensors and electronic circuits.
  • Intelligent toys include robot pets such as Sony's AIBO, developed to learn behaviors like a real pet through artificial intelligence and to act according to the user's reactions or training. With the development and popularization of drones and smartphones, various toys linked with smartphones have also been proposed. Nico Li et al. proposed a system that produces real terrain with a 3D printer, sets a drone's movement path using a handheld screen and a see-through headset, and receives real-time images from the actual drone.
  • The HMD is an image-output device worn on the user's head that displays an image directly in front of the user's eyes.
  • It was first developed by Marvin Minsky in 1963. HMDs are used for augmented reality by displaying information over part of the field of view, as in Google Glass, and for virtual reality by displaying an image over the user's entire field of view, as in Oculus, while detecting the user's movement through head tracking.
  • The HMD for augmented reality consists of a pair of goggles that preserve the user's existing field of view and display images or text over it; by capturing the current field of view with a camera, it can provide the user with information about the surrounding space or show updates such as SMS and SNS notifications.
  • The HMD for virtual reality is composed of a helmet or goggles; it blocks the user's existing view and provides a 3D image based on binocular parallax. In special cases, cameras can be installed to preserve the user's existing field of view.
  • FIG. 1 is an overall configuration diagram of a smart robot game system according to an embodiment of the present invention. FIG. 1 illustrates a system configured through the interworking of real-environment modules, such as the smart robot and the user's physical environment, with software modules, such as the smart terminal and the virtual-reality environment.
  • the smart robot game system includes a smart robot platform 10, a user device 20, a battlefield environment 30, and a VR battlefield environment 40.
  • The smart robot platform 10 is a tank-type robot that moves under the user's control and transmits information about its actual position using an optical code.
  • The smart robot platform 10 includes a communication unit 10a providing message control, location-information management, and an image encoder; a control unit 10b handling vehicle-body control, turret control, and movement; and a sensor unit 10c including an infrared module, a position-recognition module, and a camera module.
  • The smart robot platform 10 basically uses a tank-type smart robot.
  • The user device 20 is based on a control interface for manipulating the smart robot and a view interface showing the virtual-reality battlefield. That is, the user device 20 may include a control unit 20a providing message control, location-information management, and a video decoder; a user interface 20b comprising the control interface and the view interface; and a game system 20c that handles game rules, event processing, and effects.
  • The smart robot moves on the OID-based mat-type terrain, which is synchronized with the virtual-reality battlefield overlaid with 3D objects, and all of these components can be linked through the server 50.
  • the battlefield environment 30 may include an indoor GPS, a terrain tile, and a feature object for locating a smart toy and reflecting it in a game.
  • The VR battlefield environment 40 may include a map-object template module 40a holding templates for various terrains and features, a VR battlefield object module 40b containing the objects represented in the virtual environment for each offline map-object template, a 3D map editor 40c for editing virtual features, and an effect library 40d containing a library of effects.
  • the server 50 shown in FIG. 1 may match a real battlefield and a virtual battlefield and transmit and correct location information according to a user's manipulation.
  • FIG. 2 is an example of implementing a smart toy employed in a smart robot game system according to an embodiment of the present invention in the form of a tank.
  • The tank-type smart toy illustrated in FIG. 2 (hereinafter called a smart tank) is configured to allow a user to control the rotation of the turret and the movement of the robot using a controller.
  • the two cameras can be used to check the front position of the tank and the direction of the barrel, and the robot's position can be identified through an OID scanner.
  • The tank-type smart toy (smart tank) illustrated in FIG. 2 includes a driving motor 12, a barrel rotation motor 11, cameras 13 and 14, an IR laser transmitter 15, an IR laser receiver 16, a battery pack 17, a communication board 18, a control board 19, and an OID scanner 21.
  • FIG. 3 is a diagram showing the configuration of the IR laser transmitter 15 and the IR laser receiver 16 shown in FIG. 2.
  • the smart toy can check the firing and shooting down of the shell by using an IR laser.
  • The smart toy can determine its damage state and whether it can still be driven, depending on the type of shell and the position of the hit. That is, the main controller 72 of the smart toy robot system 60, described later, may check the firing and hitting of shells using the IR lasers of the IR laser transmitter 15 and the IR laser receiver 16; accordingly, the main controller 72 may determine the damage state of the smart tank and whether it can be driven, depending on the type of shell and the position of the hit.
  • The IR laser transmitter 15 transmits an IR laser and, at the same time, transmits information such as the type of shell using an optical code.
  • the IR laser receiver 16 receives the IR laser.
  • The main controller 72 analyzes the optical code received through the IR laser receiver 16 to determine whether the tank was hit and to set the state of the smart tank; the corresponding information may be provided to the main controller 72 of the smart toy 60 for this purpose.
  • the IR laser transmitter 15 and the IR laser receiver 16 may be regarded as being included in the IR module 70 of the smart toy 60 to be described later.
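As a purely hypothetical sketch of this exchange (the bit layout, field names, and damage table below are my own illustration, not taken from the patent), the shell type carried in the IR optical code and the receiver-side state update could look like:

```python
# Hypothetical sketch: the transmitter packs a shell type into the IR optical
# code; the receiving controller decodes it and updates the tank's state.
# SHELL_DAMAGE values and the 2-bit field are assumptions for illustration.

SHELL_DAMAGE = {0: 10, 1: 25, 2: 40}    # illustrative shell types -> damage

def pack_shot(shell_type: int) -> int:
    """Pack the shell type into an (assumed) 2-bit field of the IR code."""
    return shell_type & 0x3

def apply_hit(state: dict, code: int) -> dict:
    """Decode a received IR code and return the updated tank state."""
    shell = code & 0x3
    state = dict(state, hp=max(0, state["hp"] - SHELL_DAMAGE[shell]))
    state["drivable"] = state["hp"] > 0  # hit-position effects simplified away
    return state

tank = {"hp": 30, "drivable": True}
tank = apply_hit(tank, pack_shot(2))    # hit by shell type 2 (40 damage)
assert tank == {"hp": 0, "drivable": False}
```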
  • FIG. 4 is an internal configuration diagram of a smart toy robot system of a tank type employed in the smart robot game system according to an embodiment of the present invention.
  • the smart toy robot system 60 collects operation settings and information of each module through the main controller 72.
  • the main controller 72 interoperates with the user device 20, the battlefields 30 and 40, and the server 50 to control the operation and the location of the smart toy robot system of the corresponding vehicle type.
  • The smart toy robot system 60 includes a camera module 62 capable of relaying the view of the field, a motor module 64 that rotates the turret and moves the body, a network module 66 for communicating with the server 50, an OID module 68 for location recognition, and an IR module 70 for infrared transmission and reception.
  • the camera module 62 includes the cameras 13 and 14 illustrated in FIG. 2.
  • the motor module 64 includes a drive motor 12 and a barrel rotation motor 11 illustrated in FIG. 2.
  • Network module 66 includes a communication board 18 illustrated in FIG. 2.
  • OID module 68 includes an OID scanner 21 illustrated in FIG. 2.
  • the main controller 72 includes a control board 19 illustrated in FIG. 2.
  • The smart toy robot system 60 described above moves according to the user's control, checks its position on the actual battlefield using the OID scanner 21, transmits that position to the server 50, and moves in accordance with the terrain of the virtual battlefield received from the server 50.
  • The OID module 68 may use the OID scanner 21 to scan optical codes on a mat printed with optical codes (an optical mat).
  • the above-described network module 66 of FIG. 4 may correspond to the communication unit 10a of the smart robot platform 10 of FIG. 1.
  • main controller 72 of FIG. 4 may correspond to the control unit 10b of the smart robot platform 10 of FIG. 1.
  • the IR module 70, the OID module 68, and the camera module 62 of FIG. 4 may correspond to the sensor unit 10c of the smart robot platform 10 of FIG. 1.
  • smart toy robot system may be abbreviated as smart toy.
  • FIGS. 5 to 7 are examples of user devices usable in the smart robot game system according to an embodiment of the present invention, and show various user device types.
  • the user device usable in the smart robot game system may include a controller for controlling the smart tank and a display device showing the battlefield environment.
  • As the display device, an HMD with a gyroscope sensor is recommended for experiencing virtual reality, but a smart device such as a smartphone or smart pad can also be used.
  • The controller can be a software controller running on a smart device, or a dedicated controller for VR HMD users.
  • FIG. 8 is a diagram illustrating an implementation of a software controller, in which the body of the corresponding smart toy (smart tank) is steered through the Body Direction control,
  • and the barrel is moved using the Function Direction control.
  • Depending on the user's environment settings, the user can observe the battlefield through head tracking in the case of an HMD, or through handling a smart pad.
  • FIG. 9 is a diagram showing the structure (architecture) of a user interface.
  • the user interface may be divided into a user interface object module 74 in a virtual reality, and a user interface object module 76 disposed in a fixed area.
  • the user interface object module 74 in the virtual reality includes information on the cockpit environment of the smart toy (smart tank), and movable objects such as a mini map and a system button.
  • the user interface object module 76 disposed in the fixed area includes fixed objects such as coordinates and launch points.
  • FIG. 10 is a diagram illustrating a configuration of a game system applied to a user device.
  • the game system applied to the user device includes a game content module 78 and an effect module 80.
  • Game content module 78 includes game rules, scores, achievements, and the like.
  • Effect module 80 includes shell fire, shell hits, and object destruction.
  • FIG. 11 is a diagram illustrating the shape of an actual battlefield,
  • FIG. 12 is a diagram illustrating the design of a virtual-reality battlefield and a 3D template of a map object,
  • and FIG. 13 is a diagram employed to explain synchronization between a virtual battlefield and the real battlefield.
  • The battlefield consists of a real battlefield on which the smart tank moves and a virtual battlefield made of 3D graphic objects.
  • The battlefield is provided so that the virtual battlefield is mapped onto the real battlefield and displayed to the user according to the movement of the smart toy (smart tank).
  • OID-mounted map objects are installed on a mat printed with optical codes to decorate the battlefield environment.
  • In addition, a virtual-reality battlefield made of 3D graphics may be constructed; FIG. 12 shows the design of such a virtual-reality battlefield and a 3D template of a map object.
  • The actual battlefield and the virtual-reality battlefield are synchronized using OID-based optical codes as shown in FIG. 13, and each map object is matched with 3D graphic objects to form the virtual battlefield.
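As a hedged sketch of this synchronization step (the world extents and function name are my own assumptions, not the patent's), the mat-cell coordinates read from the optical code can be mapped onto virtual-battlefield world coordinates with a simple scale-and-offset transform:

```python
# Sketch of real-to-virtual position mapping: the OID scan yields mat-cell
# coordinates in 0..1023 per axis (the 10-bit values from the optical code);
# the server maps them onto the virtual battlefield's world coordinates.
# The 1024x1024 world extent here is an illustrative assumption.

def mat_to_world(mx, my, world_w=1024.0, world_h=1024.0, mat_cells=1024):
    """Map a mat cell (mx, my) to virtual-world coordinates (cell centres)."""
    sx, sy = world_w / mat_cells, world_h / mat_cells
    return ((mx + 0.5) * sx, (my + 0.5) * sy)

# A tank scanned at mat cell (512, 512) appears near the centre of the map.
assert mat_to_world(512, 512) == (512.5, 512.5)
```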
  • The actual battlefield takes the form of a mat printed with 2D optical codes.
  • FIG. 14 is a diagram for explaining the configuration of a server.
  • In order to match the actual battlefield with the virtual battlefield and to transmit and correct location information according to the user's manipulation, the server 50 is required.
  • the server 50 includes a cloud server 52 and an application server 54.
  • the cloud server 52 may provide network services, device connections, and services for games.
  • The application server 54 provides the application required for the game and may be built on SPDY 2.0 and WebSocket.
  • An embodiment of the present invention requires a technique for accurately identifying and correcting the position of the smart toy (smart tank) in order to connect the real space with the space of the virtual reality.
  • To this end, an optical code for recognizing the position of the smart toy (smart tank) is adopted, together with a technique for recognizing the optical code and a Kalman-filter-based correction of the errors that occur when the position is derived from the recognized code.
  • FIG. 15 is a diagram for explaining position correction regarding a rotation transformation based on an optical code and a template
  • FIG. 16 is a diagram illustrating an optical mat.
  • The optical code is basically expressed as a 5×5 braille-like dot pattern, as shown in FIG. 15, and three of its square vertices carry a template for correcting the orientation of the code. The maximum number of available data bits is then 21. To represent two-dimensional coordinates, 10 bits are allocated to the X coordinate, 10 bits to the Y coordinate, and one bit to a flag value. This allows 1024 values on each of the X and Y axes, from which the battlefield can be configured.
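As a worked illustration of this bit layout (my own sketch, not code from the patent), the 21-bit code word can be packed and unpacked as follows:

```python
# Sketch of the 21-bit payload described above: 10 bits for the X coordinate,
# 10 bits for the Y coordinate, and 1 flag bit, giving 1024 x 1024 mat cells.

def encode_cell(x: int, y: int, flag: bool = False) -> int:
    """Pack an (x, y) mat coordinate and a flag into a 21-bit code word."""
    if not (0 <= x < 1024 and 0 <= y < 1024):
        raise ValueError("coordinates must fit in 10 bits each")
    return (int(flag) << 20) | (x << 10) | y

def decode_cell(word: int) -> tuple:
    """Unpack a 21-bit code word back into (x, y, flag)."""
    return ((word >> 10) & 0x3FF, word & 0x3FF, bool(word >> 20))

# Round trip: scanning cell (512, 7) recovers the same coordinate.
assert decode_cell(encode_cell(512, 7, True)) == (512, 7, True)
```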
  • As the smart toy (smart tank) moves, the scanned code appears rotated.
  • The rotation angle of the input code can be calculated using the template, from which the position and heading angle of the smart tank can be determined.
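The patent does not spell out the geometry, but one plausible sketch of recovering the rotation angle from the three template marks (assuming they sit on three corners of the code square, with the right-angled mark anchoring the orientation) is:

```python
import math

# Assumed geometry, not the patent's exact method: the template mark whose
# vectors to the other two marks are perpendicular is the "corner" mark, and
# the direction of one edge from it gives the code's rotation angle.

def rotation_from_templates(p1, p2, p3):
    """Return the code's rotation in degrees from the three template marks."""
    pts = [p1, p2, p3]
    for i in range(3):
        a, b, c = pts[i], pts[(i + 1) % 3], pts[(i + 2) % 3]
        ab = (b[0] - a[0], b[1] - a[1])
        ac = (c[0] - a[0], c[1] - a[1])
        if abs(ab[0] * ac[0] + ab[1] * ac[1]) < 1e-6:   # right angle at a
            if ab[0] * ac[1] - ab[1] * ac[0] < 0:       # keep CCW edge order
                ab = ac
            return math.degrees(math.atan2(ab[1], ab[0])) % 360.0
    raise ValueError("no right-angled template corner found")

# An unrotated code (corner marks at (0,0), (1,0), (0,1)) reads 0 degrees;
# the same code rotated a quarter turn reads 90 degrees.
assert abs(rotation_from_templates((0, 0), (1, 0), (0, 1)) - 0.0) < 1e-9
assert abs(rotation_from_templates((0, 0), (0, 1), (-1, 0)) - 90.0) < 1e-9
```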
  • To raise the recognition rate, the same coordinate code is repeatedly displayed within a predetermined area.
  • If the target recognition rate is P and the recognition rate of a single code (the unit recognition rate) is i, then the number of repeated codes M must satisfy 1 - (1 - i)^M >= P, that is, M = ceil(log(1 - P) / log(1 - i)).
  • When the target recognition rate is set to 99% and the unit recognition rate to 70%, the number of codes M to be repeated per 1 cm² of error distance can be obtained from this formula.
  • Accordingly, the optical code was inserted as four identical codes per 1 cm², which makes it possible to construct an optical mat with a size of 1024 cm × 1024 cm, as shown in FIG. 16.
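As a quick check of the repetition count (my own sketch of the arithmetic), requiring that at least one of M independent code copies, each recognized with probability i, is read gives 1 - (1 - i)^M >= P:

```python
import math

# All M copies are missed with probability (1 - i)^M, so the target rate P
# requires 1 - (1 - i)^M >= P, i.e. M = ceil(log(1 - P) / log(1 - i)).

def codes_needed(P: float, i: float) -> int:
    return math.ceil(math.log(1.0 - P) / math.log(1.0 - i))

# P = 99% target rate, i = 70% per-code rate -> 4 copies per unit area,
# matching the four identical codes per square centimetre used for the mat.
assert codes_needed(0.99, 0.70) == 4
```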
  • The Kalman filter can be applied when the measured values of an object contain probabilistic errors and the object's state at a given time has a linear relationship with its state at the previous time.
  • the Kalman filter is applied to measure the position of the smart toy.
  • The position, velocity, and acceleration generated while the smart toy is driven can be measured, but the measured values may contain errors.
  • the main controller 72 may measure the position of the corresponding smart toy (ie, the smart tank) by applying a Kalman filter to continuously measured values.
  • In the state equation x_k = F_k x_{k-1} + B_k u_k + w_k, F_k denotes the transition matrix from the previous state at the corresponding time, B_k denotes the state-transition matrix applied to the user input u_k, and w_k denotes a noise variable.
  • The state x_k of the smart tank with respect to the movement of the smart toy can be represented by the position and velocity of the smart toy: x_k = [p_k, v_k]^T, where p_k is the position at time k and v_k is the velocity at time k.
  • Letting a_k be the acceleration arbitrarily applied to the smart toy (i.e., the smart tank), the state equation becomes x_k = F_k x_{k-1} + B_k u_k + G a_k.
  • the Kalman filter can be used to estimate the position of the smart toy (smart tank).
  • a virtual battle game can be played using a smart toy.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Toys (AREA)

Abstract

The invention relates to a smart robot game system that provides a virtual-reality battlefield and allows a user to personally control a toy tank as if controlling a real tank. The smart robot game system presented here comprises: a smart robot platform, a type of smart robot in the form of a tank, which moves according to the user's actions and transmits information about its actual position using an optical code; a user device based on a control interface for controlling the smart robot and a view interface for displaying a virtual-reality battlefield; and a battlefield provided to the user by being displayed according to the movement of the smart robot, configured by synchronizing an OID-based mat-type terrain on which the smart robot moves with a virtual-reality battlefield overlaid with 3D objects.
PCT/KR2016/013322 2016-11-18 2016-11-18 Smart robot game system WO2018092947A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2016/013322 WO2018092947A1 (fr) 2016-11-18 2016-11-18 Smart robot game system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2016/013322 WO2018092947A1 (fr) 2016-11-18 2016-11-18 Smart robot game system

Publications (1)

Publication Number Publication Date
WO2018092947A1 true WO2018092947A1 (fr) 2018-05-24

Family

ID=62145239

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/013322 WO2018092947A1 (fr) 2016-11-18 2016-11-18 Smart robot game system

Country Status (1)

Country Link
WO (1) WO2018092947A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112354183A (zh) * 2020-10-21 2021-02-12 天津大学 Tank vehicle control system and control method based on a wireless local area network
CN114868173A (zh) * 2019-12-23 2022-08-05 丹尼斯·维克托洛维奇·帕夫连科 Controller for a virtual armored vehicle

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2003325971A (ja) * 2002-05-08 2003-11-18 Sega Toys:Kk Game device and game input device
US20080004111A1 (en) * 2006-06-30 2008-01-03 Logitech Europe S.A. Non Impact Video Game Controller for Dancing Games
KR20090001683A (ko) * 2007-05-10 2009-01-09 성균관대학교산학협력단 Method for automating vertical lifting management of materials/members using an intelligent hoist
JP2015232783A (ja) * 2014-06-09 2015-12-24 株式会社バンダイナムコエンターテインメント Program and image generating device
JP2016532178A (ja) * 2013-06-08 2016-10-13 株式会社ソニー・インタラクティブエンタテインメント System and method for transitioning between transparent mode and non-transparent mode in a head-mounted display


Similar Documents

Publication Publication Date Title
US20230244299A1 (en) Mixed reality high-simulation battlefield first aid training platform and training method using same
US8634969B2 (en) Teleoperation method and human robot interface for remote control of a machine by a human operator
WO2017065348A1 Method for collaboration using a head-mounted display
Higuchi et al. Flying head: A head-synchronization mechanism for flying telepresence
WO2018131914A1 Apparatus and method for providing guidance in a virtual environment
US20210333807A1 (en) Method and system for controlling aircraft
WO2018038485A1 Method and system for controlling a virtual-reality attraction
KR101713223B1 Virtual reality experience apparatus
CN105573341A Aircraft optical control method and system
CN107045816A Air combat confrontation training simulation device and method based on AR glasses and data gloves
Tsykunov et al. Swarmtouch: Guiding a swarm of micro-quadrotors with impedance control using a wearable tactile interface
WO2021125716A1 Method and system for providing a VR game using information about an autonomous vehicle
CN111716365B Immersive remote interaction system and method based on natural walking
US20200175748A1 (en) Information processing device and image generation method
WO2015083915A1 Combat game relay system using flying robots
US20180299948A1 (en) Method for communicating via virtual space and system for executing the method
US20220083055A1 (en) System and method for robot interactions in mixed reality applications
WO2018092947A1 Smart robot game system
WO2021038834A1 Driving simulation system and method for an unmanned aircraft
CN106075915A Unmanned aerial vehicle air combat device capable of receiving laser beams fired from multiple directions
US20210335145A1 (en) Augmented Reality Training Systems and Methods
WO2022075817A1 Remote robot coding education system
Mangina et al. Drones for live streaming of visuals for people with limited mobility
WO2018034412A1 Haptic device linked to an augmented-reality or virtual-reality implementation device
WO2019124728A1 Apparatus and method for object identification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16921840

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.09.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16921840

Country of ref document: EP

Kind code of ref document: A1