EP3590587A1 - Mixed reality methods and systems applied to collective events - Google Patents

Mixed reality methods and systems applied to collective events

Info

Publication number
EP3590587A1
EP3590587A1 (application EP19182428.3A)
Authority
EP
European Patent Office
Prior art keywords
radio
mixed reality
controlled object
control device
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19182428.3A
Other languages
English (en)
French (fr)
Inventor
Paul-Henri DECAMP
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dws Dyna Wing Sail
Original Assignee
Dws Dyna Wing Sail
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dws Dyna Wing Sail filed Critical Dws Dyna Wing Sail
Publication of EP3590587A1
Current legal status: Withdrawn

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H30/00 Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02 Electrical arrangements
    • A63H30/04 Electrical arrangements using wireless transmission
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H17/00 Toy vehicles, e.g. with self-drive; Cranes, winches or the like; Accessories therefor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H23/00 Toy boats; Floating toys; Other aquatic toy devices
    • A63H23/02 Boats; Sailing boats
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H27/00 Toy aircraft; Other flying toys
    • A63H27/12 Helicopters; Flying tops

Definitions

  • the present invention relates to the technical field of immersion methods and systems, and more particularly to mixed reality methods and systems in the context of collective events.
  • the present invention relates more particularly to competitions of scale models of nautical vehicles, such as for example regattas of radio-controlled sailboats.
  • augmented reality makes it possible to augment the user's senses with content overlaid on the real environment.
  • a regatta course marked by a starting line, virtual buoys and a finishing line can be superimposed on the real body of water of the competition, in a scene intended for a user equipped with an augmented reality headset. Textual or graphic content can also be overlaid on a real radio-controlled nautical vehicle, so that this content appears to the user to exist in the real world.
  • sensors are generally deployed in the real environment and/or directly on the nautical vehicles, to detect in real time a change in the real scene and, consequently, to align the virtual objects that are integrated into it.
  • WO2008145980 proposes the creation of an augmented reality for the remote control of a remote-controlled toy, integrating a video camera configured to acquire video images of the real environment.
  • the scene displayed to the user is augmented with virtual image entities, generated as a function of image entities detected in the video images received from the real environment.
  • virtual reality makes it possible to constitute an immersive virtual environment (a 3D model), corresponding to a real environment and with which one can interact in real time, by means of a remote-controllable virtual object. Piloting a vehicle, such as a virtual drone, in a 3D reconstruction of a real city by immersion in a cockpit, is one example.
  • the document US2006223637 (Ferraz ) describes a system for controlling a radio-controlled toy vehicle incorporating a video camera, and a control device capable of controlling this toy vehicle and comprising a real-time visual display of the video content captured by the on-board camera.
  • a software application running on the control device makes it possible to superimpose, on the visual display of the camera data, virtual functions/features associated with the radio-controlled toy vehicle (a simulated weapon system controllable from the control device).
  • two users, each provided with a control device and a toy vehicle (the two vehicles being real and placed on the same circuit), can control the movement of their respective vehicles, as well as aim and fire the virtual weapon system at the opposing vehicle.
  • An object of the present invention is to improve the immersion effect of mixed reality applied to collective events.
  • Another object of the present invention is to propose methods and systems making it possible to reduce the resources required for the organization of a collective (or collaborative) event.
  • Another object of the present invention is to provide a user in real time with 3D scenes integrating virtual objects and real objects with which he can interact without distinction.
  • Another object of the present invention is to improve the immersion experience of a user participating remotely in a collective event.
  • Another object of the present invention is to enrich the interactivity between the user and objects which are displayed to him in an immersive scene.
  • Another object of the present invention is to allow a coherent coexistence of real objects and virtual objects in an immersive scene of a collective event.
  • the radio-controlled mobile object 1 is a nautical vehicle such as a sailboat.
  • This sailboat is, for example, of radio-controlled naval model making type.
  • the example of a nautical vehicle is in no way limiting: the radio-controlled object 1 can be any other real radio-controlled device, whether a scale model or full-size, such as a radio-controlled toy or a radio-controlled vehicle.
  • the radio-controlled object 1 is, in one embodiment, a land vehicle such as a radio-controlled car or an air vehicle such as a radio-controlled drone.
  • the watercraft includes two actuators (preferably a winch and a servomotor), used for steering and propulsion (sheeting the sail(s)).
  • the watercraft may include a smoke machine, ballasts, a propeller motor and a system of foils retractable into the hull of the watercraft.
  • the watercraft further comprises a plurality of inertial sensors, such as an accelerometer, a gyroscope, a magnetometer, or a compass.
  • the water vehicle includes a nine-axis inertial unit.
  • the radio-controlled object 1 comprises a plurality of sensors such as cameras, microphones, contact sensors, one or more positioning sensors (in particular, a GPS receiver), a wind vane, a tachometer, or an anemometer.
  • the positioning sensor is, in one embodiment, a centimeter-accurate receiver of the "Rover" type (e.g. UBLOX NEO-M8P).
  • the watercraft also includes a wireless communication module such as Bluetooth, Wi-Fi, HiperLAN, LPWAN or ZigBee.
  • This wireless communication module is housed in a waterproof case or a similar system protecting it from water (silicone coating, casing preventing water from coming into contact with the on-board electronics, even if the nautical vehicle capsizes).
  • This wireless communication module is configured to communicate, on request or spontaneously, data measured by the sensors on board the radio-controlled object 1.
  • This wireless communication module is capable of receiving (from a control device or the like) and transferring instructions or configuration data intended for one or more sensors on board the radio-controlled object 1.
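By way of illustration only, the sketch below shows one way the telemetry exchange described above could be implemented; the UDP transport, the server address and the message fields are assumptions made for this example and are not specified by the patent.

```python
import json
import socket
import time

# Hypothetical telemetry sender for the on-board wireless module.
# The field names, the UDP transport and the server address are
# assumptions made for illustration only.
SERVER = ("192.168.1.10", 9000)  # mixed reality server 2 (assumed address)

def read_sensors():
    """Placeholder for reading the on-board sensors (IMU, GPS, anemometer...)."""
    return {
        "boat_id": 1,
        "timestamp": time.time(),
        "lat": 48.8584, "lon": 2.2945,   # GPS fix (centimeter-accurate rover)
        "heading_deg": 112.0,            # from the nine-axis inertial unit
        "heel_deg": 8.5,
        "wind_speed_kt": 11.2,           # on-board anemometer
    }

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        sock.sendto(json.dumps(read_sensors()).encode(), SERVER)
        time.sleep(0.1)  # ~10 Hz telemetry

if __name__ == "__main__":
    main()
```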
  • the radio-controlled object 1 can be used in a collective event, in particular a competition such as a regatta.
  • FIG. 2 shows, by way of illustration, an environment in which a collective event takes place, namely a nautical competition.
  • the radio-controlled object 1 is connected, via its wireless communication module, to a mixed reality server 2.
  • the mixed reality server 2 thus has access to the data collected by the sensors on board the radio-controlled object 1.
  • the mixed reality server 2 also has data captured by sensors 3 deployed in the environment of the collective event.
  • These sensors 3 include, for example, cameras, contact sensors, one or more positioning sensors, an anemometer, a wind vane.
  • the anemometer and the wind vane are, in one embodiment, placed at a height of at least 1.50 m above the level of the body of water.
  • a user (the skipper), present at the site of the collective event, is provided with a control device 4 making it possible to control the radio-controlled object 1 remotely.
  • the control device 4 is mobile in the environment of the collective event. The user equipped with the control device 4 can thus move freely in at least part of the environment of the collective event.
  • the control terminal 4 is, for example, a fixed or mobile computer, a remote control, a smartphone, or a touch tablet.
  • the control device 4 communicates with the radio-controlled object 1 directly (in "training" mode, for example) or via the mixed reality server 2 (in "competition" mode).
  • the control device 4 is also able to receive and display a video stream received from the mixed reality server 2.
  • the user is provided with augmented reality user equipment, such as a headset 5 or augmented reality glasses.
  • This equipment is a device that the user wears on the head (in the form of a helmet or glasses) or in the hands (smartphone, phablet, or tablet).
  • the augmented reality user equipment is connected to the mixed reality server 2 .
  • the augmented reality headset 5 comprises, in one embodiment, a positioning sensor (in particular, a GPS receiver) making it possible to locate this augmented reality headset 5, at least within the environment of the collective event.
  • the mixed reality server 2 is able to integrate one or more predefined virtual objects 6 into video content received from a camera. More generally, the mixed reality server 2 is configured to generate, from the data recovered from the sensors deployed in the environment of the collective event and/or in the radio-controlled object 1, mixed reality scenes integrating the radio-controlled object 1 and/or one or more virtual objects 6.
  • the mixed reality server 2 is connected to a database of virtual objects.
  • the control device 4 and / or the augmented reality user equipment comprises a positioning sensor such as a GPS receiver.
  • This positioning sensor is, in one embodiment, centimeter-accurate. It makes it possible to provide the precise coordinates of the wearer (player, referee, or spectator, for example) as he moves in the environment of the collective event.
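As an illustration of how such centimeter-accurate fixes could be used, the sketch below converts the wearer's GPS coordinates into local metric offsets from a reference point, using a simple equirectangular approximation; the reference point and the approximation itself are assumptions of this example.

```python
import math

# Minimal sketch, assuming an equirectangular approximation, for turning the
# wearer's GPS fixes into local metric coordinates usable for scene generation.
EARTH_R = 6_371_000.0  # mean Earth radius in metres

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Approximate east/north offsets (m) from an assumed reference point."""
    x = math.radians(lon - ref_lon) * EARTH_R * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_R
    return x, y

# Example: wearer roughly 3.2 m east and 1.1 m north of the reference point.
print(to_local_xy(48.858410, 2.294544, 48.858400, 2.294500))
```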
  • the mixed reality server 2 is configured to generate (or regenerate) a scene for the wearer (player, referee, spectator or, more generally, the user) of the control device 4 and/or of the augmented reality user equipment, based on the position data of the control device and/or of the augmented reality user equipment.
  • the augmented reality user equipment, in particular the headset 5 or augmented reality glasses, comprises a three-axis compass making it possible to provide the orientation of this equipment in a coordinate system (for example, XYZ). This orientation makes it possible to determine the field of vision of the user of the augmented reality user equipment.
  • the mixed reality server 2 is configured to generate a mixed reality scene for the user, according to his field of vision.
  • the mixed reality server 2 generates, from data retrieved from the sensors deployed in the environment and/or in the radio-controlled object 1, a multimedia mixed reality scene as a function of the position of the control device 4 and/or the position and/or orientation of the augmented reality user equipment.
  • the multimedia content of the generated scene is obtained from the cameras and microphones closest to the user's position and/or those covering his field of vision, together with the faces of the virtual objects visible from that position and/or orientation.
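A minimal sketch of this selection logic is given below, assuming a flat local coordinate frame and a 90° field of vision; the geometry helpers and camera layout are illustrative, not taken from the patent.

```python
import math

# Illustrative selection of the camera feed closest to a user's position, and
# a test of whether an object lies in the user's field of vision as derived
# from the three-axis compass of the headset. The 90° field-of-vision width
# and the flat-earth metric are simplifying assumptions.
FOV_DEG = 90.0

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def bearing_deg(origin, target):
    """Bearing from origin to target, in degrees, in a local XY frame."""
    return math.degrees(math.atan2(target[1] - origin[1],
                                   target[0] - origin[0])) % 360.0

def nearest_camera(user_pos, cameras):
    """cameras: {name: (x, y)} positions of the sensors 3 in the environment."""
    return min(cameras, key=lambda name: distance(user_pos, cameras[name]))

def in_field_of_vision(user_pos, user_heading_deg, obj_pos):
    """True if obj_pos falls within the user's field of vision."""
    delta = (bearing_deg(user_pos, obj_pos) - user_heading_deg + 180.0) % 360.0 - 180.0
    return abs(delta) <= FOV_DEG / 2.0

# Example: a virtual buoy at (40, 25) seen by a user at the origin facing 0°.
print(nearest_camera((0, 0), {"pontoon": (5, 5), "committee": (80, 0)}))
print(in_field_of_vision((0, 0), 0.0, (40, 25)))  # True: bearing ~32° < 45°
```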
  • the mixed reality server 2 makes it possible to coordinate the simultaneous display of virtual objects for different users (whether spectators, players, or referees, for example), even when they move around and/or change their field of vision.
  • the mixed reality server 2 is accessible via a communications network 7 (commonly the Internet), so that a remote user provided with a user terminal 8 (such as a smartphone, a tablet, or a laptop or desktop computer) can connect to the mixed reality server 2 and take part in the current collective event.
  • the remote user can choose, from a predefined list (of sailboats for example), a virtual object 6 with which he wishes to participate in the collective event.
  • the mixed reality server 2 is responsible for integrating such an object into the mixed reality scene to be broadcast from the collective event.
  • the remote user participates in the collective event with a radio-controlled object 1 made available to him (for example, a real radio-controlled rental sailboat launched by a user or a spectator present at the event).
  • the mixed reality server makes it possible, in fact, to coherently superimpose the virtual object 6 on the perception that users have of the real environment and which is displayed to them (on the screen of the remote user terminal 8, the screen of the control terminal 4, or the augmented reality user equipment).
  • the radio-controlled object 1 has, in fact, a concrete and objective existence which can be observed directly, whereas the virtual object 6 is a digital object which exists only in essence or by effect, and is simulated in the mixed reality scene displayed to users.
  • This mixed reality scene includes, for example, the actual environment of the event (therefore, including the radio-controlled object 1), which is augmented with the virtual object 6, a regatta course and, possibly, ambient effects.
  • the mixed reality server 2 combines the real objects, including the radio-controlled object 1, and the virtual object 6 in the real nautical competition environment.
  • the virtual objects coincide with the real objects, and conversely.
  • imperceptible incidents (for example, contact between a virtual nautical vehicle and a virtual obstacle, or a real obstacle such as a real nautical vehicle) are rendered as perceptible phenomena, visible and/or audible to users.
  • the mixed reality server 2, through which the commands for the radio-controlled object 1 and the virtual object 6 pass, allows real-time interaction with these objects 1, 6, either by means of the control device 4 or via the remote user terminal 8.
  • the mixed reality server 2 serves as a wireless communication bridge between the radio-controlled objects 1, the virtual objects 6, the control devices 4 and the remote user terminals 8.
  • the mixed reality server 2 interprets the commands intended for the objects 1, 6 and, on request or otherwise, allows the destination of a command to be changed (acting as a router).
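A minimal sketch of such a command bridge is given below; the handler interface and the re-routing call are assumptions chosen to illustrate the router role described above.

```python
from typing import Callable, Dict

# Hypothetical server-side command bridge: commands from control devices 4 and
# remote terminals 8 are interpreted and forwarded either to a real
# radio-controlled object 1 (radio link) or to a virtual object 6 (simulation).
class CommandBridge:
    def __init__(self) -> None:
        # Maps an object id to the handler that delivers commands to it.
        self.routes: Dict[str, Callable[[dict], None]] = {}

    def register(self, object_id: str, handler: Callable[[dict], None]) -> None:
        self.routes[object_id] = handler

    def reroute(self, object_id: str, new_handler: Callable[[dict], None]) -> None:
        """Change the destination of subsequent commands (router role)."""
        self.routes[object_id] = new_handler

    def dispatch(self, object_id: str, command: dict) -> None:
        self.routes[object_id](command)

bridge = CommandBridge()
bridge.register("boat-1", lambda cmd: print("radio link ->", cmd))   # real object
bridge.register("boat-v6", lambda cmd: print("simulation ->", cmd))  # virtual object
bridge.dispatch("boat-1", {"rudder_deg": -10, "sheet_pct": 60})
```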
  • the result is a coherent combination of the physical world with virtual objects.
  • the real-time interpretation of user interactions and of the data returned by the sensors makes it possible to animate the virtual object 6 in real time in a manner consistent with the environment.
  • the animation of the virtual object 6 can be performed in real time using any dynamic simulation engine (physics game engine) such as Havok™, Newton Game Dynamics, or Dynamo.
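The toy sketch below illustrates the idea of a dynamic simulation step, advancing a virtual sailboat with simple Euler integration driven by the measured wind; a production system would rely on an engine such as those named above, and the force model and coefficients here are assumptions made for illustration only.

```python
import math

# Illustrative stand-in for one step of a dynamic simulation engine: the
# virtual sailboat 6 is advanced by Euler integration, driven by the wind
# measured by the real sensors 3. Coefficients are assumed, not measured.
DRIVE = 0.8   # assumed sail drive coefficient
DRAG = 0.5    # assumed hydrodynamic drag coefficient

def step(state, wind_speed, wind_dir_deg, dt):
    """state: dict with x, y (m), speed (m/s), heading_deg."""
    # Drive force falls off as the boat points closer to the wind.
    angle_to_wind = math.radians(state["heading_deg"] - wind_dir_deg)
    drive = DRIVE * wind_speed * abs(math.sin(angle_to_wind))
    accel = drive - DRAG * state["speed"]
    state["speed"] = max(0.0, state["speed"] + accel * dt)
    heading = math.radians(state["heading_deg"])
    state["x"] += state["speed"] * math.cos(heading) * dt
    state["y"] += state["speed"] * math.sin(heading) * dt
    return state

boat = {"x": 0.0, "y": 0.0, "speed": 0.0, "heading_deg": 45.0}
for _ in range(100):                      # 10 s of simulated time at 10 Hz
    step(boat, wind_speed=6.0, wind_dir_deg=0.0, dt=0.1)
print(round(boat["x"], 1), round(boat["y"], 1))
```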
  • the mixed reality server 2 interprets user interactions and the data collected from the sensors 3 and generates immersive content (video, audio, text) which it transmits to the remote user terminal 8, to the control device 4 and/or to the augmented reality user equipment.
  • This immersive content includes, for example, the body of water on which the nautical vehicles (virtual and real) sail, as well as other information such as the course, the list, the speed, the pitch, the strength of the communication signal, the energy autonomy level of a nautical vehicle, and the distance between a user and the nautical vehicle.
  • This immersive content also includes information concerning the regatta such as, for example, the position of the beacons, the position of the virtual buoys 9 of the course, the other users' nautical vehicles, the ranking of the users, and the sailing instructions.
  • the mixed reality server 2 broadcasts racing instructions/information (in place of the race committee) to the users, by means of speakers or by means of notifications integrated into the visual content displayed by the mixed reality user equipment. This information is useful for certain regatta maneuvers (rounding distant buoys, or sailing in a pack, for example) as well as for informing spectators about the event.
  • the mixed reality server 2 is, in one embodiment, configured to control one or more radio-controlled nautical vehicles (rental ones, for example), to make them return to a specific point on the body of water when a period of use has expired.
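One possible sketch of this rental time-out, with an assumed command format and return point, is the following:

```python
import time

# Hypothetical rental time-out: when the period of use expires, the server
# takes over the boat and steers it back to a return point. The send_command
# function and the waypoint format are assumptions made for illustration.
RETURN_POINT = (48.8584, 2.2945)  # assumed pick-up point on the body of water

def send_command(boat_id: int, command: dict) -> None:
    print(f"server -> boat {boat_id}: {command}")  # stand-in for the radio link

def watch_rental(boat_id: int, end_time: float) -> None:
    while time.time() < end_time:
        time.sleep(1.0)
    # Rental over: ignore further user commands and autopilot back to the dock.
    send_command(boat_id, {"mode": "autopilot", "waypoint": RETURN_POINT})

watch_rental(boat_id=1, end_time=time.time() + 3.0)  # 3-second demo rental
```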
  • the mixed reality server 2 is, in one embodiment, configured to adapt the regatta course automatically and in real time, as a function of changes in wind direction and strength, this data being collected from the sensors 3 deployed in the environment of the event.
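By way of illustration, the sketch below rotates the virtual buoys 9 around the course centre when the measured wind direction shifts, so that the windward leg keeps facing the wind; the course layout and the update rule are assumptions of this example.

```python
import math

# Illustrative course adaptation: re-anchor every virtual buoy 9 by rotating
# it around the course centre by the change in wind direction.
def rotate(point, centre, angle_deg):
    a = math.radians(angle_deg)
    dx, dy = point[0] - centre[0], point[1] - centre[1]
    return (centre[0] + dx * math.cos(a) - dy * math.sin(a),
            centre[1] + dx * math.sin(a) + dy * math.cos(a))

def adapt_course(buoys, centre, old_wind_deg, new_wind_deg):
    """Return the buoy positions updated for a shift in wind direction."""
    shift = new_wind_deg - old_wind_deg
    return [rotate(b, centre, shift) for b in buoys]

course = [(0.0, 100.0), (0.0, -100.0)]              # windward and leeward marks
print(adapt_course(course, (0.0, 0.0), 0.0, 20.0))  # wind veered by 20°
```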
  • Several views of the scene are made available to users by the mixed reality server 2, such as an objective view (or "third-person view") and a subjective view (or "first-person view", from a camera on board a nautical vehicle or a flying drone, for example).
  • the mixed reality server 2 makes it possible to create a mixed reality for the user present at the regatta location as well as for a user (skipper or viewer) distant from the regatta location. This results, advantageously, in a simplification of the organization of a nautical competition integrating both real radio-controlled sailboats and virtual sailboats.
  • the mixed reality server 2 communicates in real time information concerning the event being broadcast to a remote server. Information concerning, for example, a regatta in progress is thus available, via this remote server.
  • the remote server stores information received from the mixed reality server 2 in a database accessible online.
  • This database includes, for example, the results of competitions, as well as information relating to regattas or participants. This stored data makes it possible to replay (in deferred mode), or to respond to a request concerning, a sequence or all of the immersive content previously broadcast.
  • the system described above can be used for any other collective event, such as a game or recreational boating involving radio-controlled water vehicles (e.g. naval battles in which several sailboats compete against each other with virtual weapons on real bodies of water).
  • An on-board camera, positioned on a support rotating in the three directions of space, can serve as the sight of a cannon for naval combat scenarios.
  • This view is rendered on the display screen of the control device or via the augmented reality user equipment.
  • the skipper can either maneuver the sailboat and the cannon himself, or delegate the use of the "virtual weapon" to a gunner (an additional player) using a second control device. Crewed navigation is thus possible.
  • the methods and systems described above find, in particular, application in the organization of a collective event involving one or more distant participants in addition to at least one participant present at the place of the event.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Ocean & Marine Engineering (AREA)
  • Processing Or Creating Images (AREA)
  • Toys (AREA)
  • Selective Calling Equipment (AREA)
EP19182428.3A 2018-07-03 2019-06-25 Mixed reality methods and systems applied to collective events Withdrawn EP3590587A1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FR1856109A FR3083457B1 (fr) 2018-07-03 2018-07-03 Mixed reality methods and systems applied to collective events

Publications (1)

Publication Number Publication Date
EP3590587A1 true EP3590587A1 (de) 2020-01-08

Family

ID=65031359

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19182428.3A EP3590587A1 (de) 2018-07-03 2019-06-25 Mixed reality methods and systems applied to collective events

Country Status (2)

Country Link
EP (1) EP3590587A1 (de)
FR (1) FR3083457B1 (de)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060223637A1 (en) 2005-03-31 2006-10-05 Outland Research, Llc Video game system combining gaming simulation with remote robot control and remote robot feedback
WO2008145980A1 (en) 2007-05-31 2008-12-04 Sony Computer Entertainment Europe Limited Entertainment system and method
US20160129358A1 (en) * 2014-11-07 2016-05-12 Meeper Technology, LLC Smart Phone Controllable Construction Brick Vehicle
US20180095461A1 (en) * 2016-09-30 2018-04-05 Sony Interactive Entertainment Inc. Uav positional anchors

Also Published As

Publication number Publication date
FR3083457A1 (fr) 2020-01-10
FR3083457B1 (fr) 2020-07-17

Similar Documents

Publication Publication Date Title
US20200254353A1 (en) Synchronized motion simulation for virtual reality
US10692174B2 (en) Course profiling and sharing
US10377484B2 (en) UAV positional anchors
US10977865B2 (en) Augmented reality in vehicle platforms
US9886033B2 (en) System for piloting a drone in immersion
CN111228804B (zh) Method, apparatus, terminal and storage medium for driving a vehicle in a virtual environment
KR101748401B1 (ko) Virtual reality attraction control method and system
KR101736477B1 (ko) Local sensor augmentation of stored content and AR communication
US10486060B2 (en) Tracking core for providing input to peripherals in mixed reality environments
FR2908322A1 (fr) Method for defining a game zone for a video game system
FR2908324A1 (fr) Display adjustment method for a video game system
FR3058238A1 (fr) Autonomous system for capturing moving images from a drone, with target tracking and maintenance of the target's viewing angle
CN105807922A (zh) Method, device and system for implementing virtual reality entertainment driving
CN104781873A (zh) Image display device, image display method, mobile device, image display system, and computer program
KR20100137413A (ko) System for implementing a competition between moving objects
FR2957266A1 (fr) Method and apparatus for remotely controlling a drone, in particular a rotary-wing drone
KR101790592B1 (ko) Augmented reality tourism system and method using a sightseeing drone
FR3054336A1 (fr) Autonomous system for capturing moving images from a drone, with target tracking and improved target localization
US20210192851A1 (en) Remote camera augmented reality system
FR3056921A1 (fr) Autonomous system for capturing moving images from a drone, with target tracking and maintenance of the target's viewing angle
US20220368958A1 (en) Live video distribution method using unmanned moving device, video distribution device used in live video distribution method, and video archive device for storing video data file generated by video distribution device
Smith The photographer's guide to drones
EP3590587A1 (de) Mixed-reality-methoden und -systeme, die bei kollektiven ereignissen angewendet werden
EP4252417A1 (de) Simulationsferngläser sowie vorrichtung und verfahren zur simulation
Avery et al. Outdoor augmented reality gaming on five dollars a day

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200612

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20201001

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20211019

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220301