EP1913764A2 - Einrichtung zum visualisieren eines virtuellen passagierabteils eines fahrzeugs in einer realen situation - Google Patents

Einrichtung zum visualisieren eines virtuellen passagierabteils eines fahrzeugs in einer realen situation

Info

Publication number
EP1913764A2
Authority
EP
European Patent Office
Prior art keywords
video
virtual
passenger compartment
real
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06794315A
Other languages
English (en)
French (fr)
Inventor
Valentin Lefevre
Jean-Marie Vaidie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Total Immersion
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Total Immersion filed Critical Total Immersion
Publication of EP1913764A2

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay

Definitions

  • the present invention relates to the testing and validation of the ergonomics of driving environments such as vehicle cockpits, and more particularly to a method and a device for testing and validating the ergonomics of a virtual passenger compartment in a real situation.
  • more precisely, the invention relates to a viewing device in which a virtual cockpit model is inserted in real time into a video that is itself captured in real time.
  • the testing and validation of a real cockpit requires the manufacture of this cockpit and its mounting within the vehicle for which it is intended.
  • the cockpit can only be tested in a limited number of situations, due in particular to safety issues.
  • the use of a virtual model offers many advantages. In particular, each modification does not require the production of a new physical mock-up, reducing design time and cost.
  • the testing and validation of a virtual cockpit is often done through the use of video projectors and screens forming an immersive room. The use of the virtual cockpit is then simulated in a purely computer-generated environment.
  • the user is seated on a mobile platform driven by electric or hydraulic cylinders, a setup also referred to as cabin motion.
  • although the computer-generated image used to test and validate the cockpit is as close to reality as possible, the tester's actual interaction with the cockpit is not reproduced.
  • the external environment, that is to say the scenery, is likewise entirely simulated.
  • the cabin motion is only approximate.
  • the mobile platform operates at a low frequency, namely about 4 Hz, because of the slowness of the cylinders.
  • a reaction time of about 300 milliseconds or more is thereby introduced.
  • the invention solves at least one of the problems discussed above.
  • the invention thus relates to a device for visualizing a virtual passenger compartment of a vehicle, characterized in that it comprises:
  • mobile video acquisition and display means including:
  • At least one visualization means
  • At least one video capture means capable of capturing the field of view
  • At least one processing means connected to said mobile video acquisition and display means, said at least one processing means being adapted to,
  • the device according to the invention makes it possible to test and validate the ergonomics of a virtual passenger compartment, especially in a real driving situation.
  • a pilot is able to drive a real vehicle whose cockpit is virtual, while driving in a real context.
  • the pilot is immersed in a virtual cockpit within a real vehicle.
  • the actual vehicle may be a simplified vehicle formed, for example, of only the chassis, the engine, the steering wheel and a plexiglass bubble.
  • such a vehicle makes it possible to have a real cabin that is as transparent as possible, and then to visualize a virtual cockpit without this cockpit being constrained by elements of the real cockpit.
  • this device makes it possible to test a passenger compartment in real driving conditions of the vehicle.
  • the virtual cockpit is a virtual cockpit of a previously modeled vehicle.
  • the scanning frequency of said at least one video capture means is substantially equal to the scanning frequency of said at least one viewing means.
  • said at least one visualization means is a virtual reality headset.
  • the mobile video acquisition and display means comprise,
  • At least one visualization means
  • the device further comprising at least two processing means connected to said mobile video acquisition and display means, each of said processing means being adapted to,
  • FIG. 1 schematically shows the environment for testing and validating the ergonomics of a virtual passenger compartment
  • FIG. 2 illustrates the virtual reality helmet worn by the user during the test and validation of the ergonomics of a virtual passenger compartment
  • the device according to the invention makes it possible to mix in real time the images of a virtual cockpit with a real video so as to test and validate new cockpits in a real situation.
  • the co-pilot has the same control means as the pilot, including means to brake, declutch and accelerate.
  • the pilot has a device comprising mobile video acquisition and visualization means, in particular in the form of a virtual reality helmet, also called a head-mounted display (HMD).
  • These means are equipped, for example, with at least one visualization means, at least one video capture means, in particular a camera, capable of capturing the pilot's field of vision, and a motion capture device rigidly linked to the video capture means.
  • a virtual cockpit is chosen from a plurality of available virtual cockpits.
  • this virtual cockpit can also consist of virtual enrichments animating a virtual or real passenger compartment; it can include speed indicators, gauges and a navigation system.
  • the exterior of the passenger compartment is the real world, as seen through the means capturing the pilot's field of vision.
  • the pilot thus has, in real time, a real image of the context in which he is driving, via the means capturing his field of vision and the visualization means rendering this captured field of view.
  • Figure 1 schematically shows the environment for testing and validation of the ergonomics of a virtual passenger compartment in a real driving context.
  • the pilot 12, in charge of performing the test and validation, is installed in the real vehicle 14.
  • the vehicle is for example a real car capable of being driven on a road.
  • the pilot 12 wears a virtual reality helmet 16 equipped with a display screen, two cameras capable of capturing the pilot's field of vision, and a motion capture device rigidly linked to the cameras, for example a sensor of the "Laser Bird" brand.
  • FIG. 2 illustrates the virtual reality helmet 16 worn on the head 20 of the pilot 12.
  • the virtual reality helmet 16 comprises a motion capture device 22, or motion sensor, making it possible to determine in real time the position and the orientation of the virtual reality headset 16.
  • the motion sensor may be, for example, of the "Laser Bird" type.
  • the motion sensor comprises a movable part 22 and a fixed part (not shown), the measured movement being the relative movement of the movable part with respect to the fixed part.
  • the virtual reality headset 16 may be monoscopic or stereoscopic. If the virtual reality headset 16 is monoscopic, it includes a single camera. If it is stereoscopic, it includes two cameras, one associated with the left eye and the other with the right eye, in order to give the pilot a perception of depth.
  • the example of virtual reality headset 16 shown in FIG. 2 is stereoscopic and comprises two cameras 24 and 25.
  • the moving part of the motion capture device is rigidly attached to the cameras so that the displayed computer-generated images remain properly registered with the video images, the computer-generated images here representing the virtual passenger compartment.
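  • As an illustration, a minimal sketch (not taken from the patent; frame names, offsets and values are assumed) of how such a rigid chain of transforms keeps the rendered cockpit registered with the live video:

      import numpy as np

      def pose(R, t):
          """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
          T = np.eye(4)
          T[:3, :3] = R
          T[:3, 3] = t
          return T

      # Fixed part of the motion sensor expressed in the vehicle frame (measured once).
      T_vehicle_sensorbase = pose(np.eye(3), np.array([0.4, 0.0, 1.1]))
      # Relative pose reported by the sensor: moving part w.r.t. fixed part (updated every frame).
      T_sensorbase_moving = pose(np.eye(3), np.array([0.0, 0.05, 0.3]))
      # Rigid offset between the moving part and a camera, obtained by calibration.
      T_moving_camera = pose(np.eye(3), np.array([0.02, 0.0, 0.08]))

      # Camera pose in the vehicle frame: the viewpoint used to render the virtual
      # cockpit so that it stays aligned with the captured video.
      T_vehicle_camera = T_vehicle_sensorbase @ T_sensorbase_moving @ T_moving_camera
      print(T_vehicle_camera[:3, 3])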
  • the virtual reality headset also comprises two display screens 26 and 27 for viewing images transmitted from a computer via a link 28, wired or not.
  • a display screen is placed in front of the right eye of the user (screen 26), the other being placed in front of his left eye (screen 27).
  • the connections to the display screens and the display screens may be, for example, of the Super eXtended Graphics Array (SXGA) type.
  • the principle, according to the invention, is to place a virtual passenger compartment, for example a virtual cockpit, in the real vehicle driven by the pilot.
  • FIG. 3 schematically illustrates the device of the present invention.
  • This device comprises the virtual reality headset 16, the motion sensor, having a fixed portion 22b and a movable portion 22a integral with the cameras 24 and 25 attached to the virtual reality headset, and two computers 31 and 32.
  • One of the computers is associated with one of the virtual reality headset cameras while the other computer is associated with the other camera.
  • the computer connected to the camera associated with the left eye manages the image to be presented to the left eye
  • the other manages the image to be presented to the right eye.
  • if the virtual reality headset includes only one camera, only one computer is needed to manage the image that will be presented to both eyes.
  • the computers 31 and 32 each comprise calculation means and input and output peripherals, in particular:
  • a video input (311 and 321) connected to one of the cameras (24 and 25) of the virtual reality headset 16;
  • a video output (312 and 322) for example of the SXGA type, connected to one of the display screens (26 and 27) of the virtual reality headset 16;
  • a connection (313 and 323) to the other computer, for example an Ethernet port, the computers being interconnected by a network cable;
  • a synchronization connection (314 and 324), for example of the Genlock type, connected to the cameras (24 and 25) of the virtual reality headset 16 and to the synchronization connection of the other computer.
  • one of the computers, 31 or 32, preferably comprises an input 315, for example an RS232 port or a USB port, connected to the motion capture device 22b in order to obtain the position and orientation of the cameras 24 and 25.
  • This information may be transmitted to the second computer via the connection between the two computers, for example via Ethernet.
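  • A minimal sketch of this exchange between the two computers (the UDP transport, port number and packet layout are assumptions for illustration; the patent only specifies that the connection may be Ethernet):

      import socket, struct, time

      POSE_PORT = 5005                    # hypothetical port
      FMT = "!d6f"                        # timestamp + x, y, z, roll, pitch, yaw

      def send_pose(sock, addr, xyzrpy):
          """Forward one pose sample (from the RS232/USB sensor input) to the other computer."""
          sock.sendto(struct.pack(FMT, time.time(), *xyzrpy), addr)

      def recv_pose(sock):
          """Receive one pose sample on the second computer."""
          data, _ = sock.recvfrom(struct.calcsize(FMT))
          values = struct.unpack(FMT, data)
          return values[0], values[1:]

      # Sender (computer with input 315):
      #   s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      #   send_pose(s, ("192.168.0.2", POSE_PORT), (0.0, 0.05, 0.3, 0.0, 0.1, 0.0))
      # Receiver (second computer):
      #   r = socket.socket(socket.AF_INET, socket.SOCK_DGRAM); r.bind(("", POSE_PORT))
      #   timestamp, pose = recv_pose(r)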
  • Each computer is responsible for displaying in real time, on the corresponding display screen, the video image from the corresponding camera mixed with the computer-generated images.
  • the images generated by the computers are at the resolution of the virtual reality headset 16.
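  • A minimal per-eye sketch of this mixing step (assumption: simple alpha compositing, with synthetic data standing in for the camera capture and the cockpit renderer; the actual rendering pipeline is not detailed at this level):

      import numpy as np

      HEADSET_RES = (1024, 1280)          # rows, cols of an SXGA-class display (illustrative)

      def composite(video_rgb, cockpit_rgba):
          """Overlay the rendered virtual cockpit (RGBA) on the captured video frame (RGB)."""
          alpha = cockpit_rgba[..., 3:4].astype(np.float32) / 255.0
          mixed = alpha * cockpit_rgba[..., :3] + (1.0 - alpha) * video_rgb
          return mixed.astype(np.uint8)

      # One iteration of the per-eye loop:
      video_frame = np.full(HEADSET_RES + (3,), 90, dtype=np.uint8)   # stand-in for a captured frame
      cockpit = np.zeros(HEADSET_RES + (4,), dtype=np.uint8)
      cockpit[700:, :, :3] = 30                                       # dashboard pixels
      cockpit[700:, :, 3] = 255                                       # opaque only where the cockpit is drawn
      frame_for_display = composite(video_frame, cockpit)             # sent to screen 26 or 27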
  • the co-pilot may also have an LCD screen to visualize the view of the left eye and the right eye of the pilot.
  • the hardware architecture described in FIG. 3 then adds a VGA-type splitter for each PC and a switch, in order to be able to display on this screen either the view of the left eye or the view of the right eye.
  • each computer may be a personal computer having the following characteristics:
  • a Genlock synchronization card, for example of the G-SYNC type;
  • a PAL or NTSC video capture card that uses the PCI bus, or that is connected to a USB2 port, a firewire port or a cameralink port.
  • the virtual reality headset imposes the refresh rate, for example 60 Hz.
  • the video output of the computers must have a scanning frequency compatible with that of the virtual reality headset.
  • video cameras have a scanning frequency close to that of the virtual reality headset, for example 50 Hz in Europe and 60 Hz in the United States.
  • the video cameras and the video outputs of the computers are "genlockable", that is to say they can be synchronized with one another.
  • the synchronization of the video cameras and the video outputs of the computers makes it possible to optimize the transmission time of the images (transport delay) and to obtain a stereoscopic restitution without artefact, even during the movements of the user.
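  • A back-of-the-envelope sketch (not from the patent) of the delay that genlocking avoids: without synchronization, each captured frame waits a variable time for the next display refresh; with genlock, capture is phase-locked to scan-out and that wait is constant and minimal.

      display_hz = 60.0
      period = 1.0 / display_hz                       # ~16.7 ms refresh period

      def wait_until_next_refresh(arrival_s):
          """Time a frame arriving at 'arrival_s' waits before the next display refresh."""
          return (period - (arrival_s % period)) % period

      for arrival in (0.0101, 0.0168, 0.0249):        # example frame arrival times in seconds
          print(f"frame at {arrival*1000:5.1f} ms waits {wait_until_next_refresh(arrival)*1000:4.1f} ms")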
  • the cameras can be SD resolution, for example Sony XC555 NTSC cameras with a Genlock sync input. Cameras can also be progressive (non-interlaced) with a USB2, firewire or cameralink output, for example a UEYE 2220C camera.
  • the field of view of each camera should be as close as possible to the field of view of the virtual reality headset for each eye, for example a 3.5 mm NF-mount lens with a field of view greater than 100 degrees on a small, fraction-of-an-inch sensor.
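  • A minimal sketch (pinhole approximation, assumed sensor widths) of how focal length and sensor size relate to the horizontal field of view when checking that the camera optics roughly match the headset's per-eye field of view:

      import math

      def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
          """Pinhole-model horizontal field of view in degrees."""
          return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

      focal_mm = 3.5                       # lens focal length mentioned in the text
      for width_mm in (4.8, 6.4, 8.8):     # assumed widths of common small sensors
          print(f"{width_mm:4.1f} mm sensor -> {horizontal_fov_deg(width_mm, focal_mm):5.1f} deg")
      # Very wide-angle lenses deviate from the pinhole model, so the real,
      # distortion-corrected field of view can differ from these figures.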
  • the cameras used can also be HD resolution, using HD-SDI or cameralink output.
  • the video capture cards can have an HD-SDI input, for example on PCI bus, PCI-X or PCI-Express, such as the Decklink HD card, or a cameralink input. It is also possible to acquire non-interlaced signals via the USB2 and cameralink standards, for example.
  • each computer is equipped with a runtime version of the FUSION technology having the following features:
  • at the launch of the software, FUSION loads the plurality of virtual driving environments that can be tested during the driving session;
  • acquisition and display of the real-time video streams with high performance;
  • real-time processing of the video streams to correct the radial optical distortions of the video cameras, thus allowing the video from the cameras to be matched precisely with the virtual cockpit;
  • de-interlacing of the video images before drawing in the rendering loop, if the cameras are interlaced;
  • real-time processing of the motion information (e.g. the "motion capture" stream from the "Laser Bird" sensor) by extrapolation and interpolation, in order to dynamically synchronize the video streams from the cameras with the computer-generated images of the virtual cockpit, by recomputing the positions and orientations of the motion sensor at the reception times of the video images (see the sketch after this list);
  • real-time calculation of the extri
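  • A minimal sketch of the resampling step referred to above (assumption: linear interpolation/extrapolation on position only; orientation would typically use quaternion interpolation):

      import numpy as np

      def pose_at(t_query, samples):
          """samples: list of (timestamp, xyz position ndarray) sorted by timestamp."""
          times = np.array([s[0] for s in samples])
          xyz = np.stack([s[1] for s in samples])
          # Interpolate inside the sampled interval (np.interp clamps outside it).
          out = np.array([np.interp(t_query, times, xyz[:, axis]) for axis in range(3)])
          # Extrapolate beyond the last sample by extending the last observed velocity.
          if t_query > times[-1] and len(samples) >= 2:
              velocity = (xyz[-1] - xyz[-2]) / (times[-1] - times[-2])
              out = xyz[-1] + velocity * (t_query - times[-1])
          return out

      sensor_samples = [(0.000, np.array([0.00, 0.0, 0.0])),
                        (0.010, np.array([0.01, 0.0, 0.0])),
                        (0.020, np.array([0.02, 0.0, 0.0]))]
      video_frame_time = 0.0235            # reception time of a video image
      print(pose_at(video_frame_time, sensor_samples))   # pose used to render that frame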
  • the invention makes it possible to validate the design and ergonomics of a passenger compartment by visualizing a new environment in a real driving context, without any constraint on the viewing angle (the pilot can move his head with the six degrees of freedom corresponding to changes of position and orientation).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Closed-Circuit Television Systems (AREA)
EP06794315A 2005-08-09 2006-08-09 Einrichtung zum visualisieren eines virtuellen passagierabteils eines fahrzeugs in einer realen situation Withdrawn EP1913764A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0552475A FR2889753A1 (fr) 2005-08-09 2005-08-09 Systeme permettant a un utilisateur de visualiser un cockpit virtuel dans un environnement video
PCT/FR2006/001933 WO2007017596A2 (fr) 2005-08-09 2006-08-09 Dispositif de visualisation d'un habitacle virtuel d'un vehicule en situation reelle

Publications (1)

Publication Number Publication Date
EP1913764A2 true EP1913764A2 (de) 2008-04-23

Family

ID=37642025

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06794315A Withdrawn EP1913764A2 (de) 2005-08-09 2006-08-09 Einrichtung zum visualisieren eines virtuellen passagierabteils eines fahrzeugs in einer realen situation

Country Status (3)

Country Link
EP (1) EP1913764A2 (de)
FR (1) FR2889753A1 (de)
WO (1) WO2007017596A2 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11030818B1 (en) 2019-11-19 2021-06-08 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for presenting virtual-reality information in a vehicular environment
CN114785999B (zh) * 2022-04-12 2023-12-15 先壤影视制作(上海)有限公司 一种实时虚拟拍摄同步控制方法及系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0537945A1 (de) * 1991-10-12 1993-04-21 British Aerospace Public Limited Company Vom Computer erzeugte Bilder mit Überlagerung realer Sehwahrnehmung
FR2775814A1 (fr) * 1998-03-06 1999-09-10 Rasterland Sa Systeme de visualisation d'images tridimensionnelles realistes virtuelles en temps reel
EP1124212A1 (de) * 2000-02-10 2001-08-16 Renault 3D-visuelle Präsentationsmethode und Apparat für Autosimulator

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2661061B1 (fr) * 1990-04-11 1992-08-07 Multi Media Tech Procede et dispositif de modification de zone d'images.
JPH11509064A (ja) * 1995-07-10 1999-08-03 サーノフ コーポレイション 画像を表現し組み合わせる方法とシステム
GB2329292A (en) * 1997-09-12 1999-03-17 Orad Hi Tec Systems Ltd Camera position sensing system
JP4878083B2 (ja) * 2001-03-13 2012-02-15 キヤノン株式会社 画像合成装置及び方法、プログラム
JP4022868B2 (ja) * 2002-11-13 2007-12-19 マツダ株式会社 企画支援プログラム、方法、装置並びに記憶媒体
JP4401727B2 (ja) 2003-09-30 2010-01-20 キヤノン株式会社 画像表示装置及び方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0537945A1 (de) * 1991-10-12 1993-04-21 British Aerospace Public Limited Company Vom Computer erzeugte Bilder mit Überlagerung realer Sehwahrnehmung
FR2775814A1 (fr) * 1998-03-06 1999-09-10 Rasterland Sa Systeme de visualisation d'images tridimensionnelles realistes virtuelles en temps reel
EP1124212A1 (de) * 2000-02-10 2001-08-16 Renault 3D-visuelle Präsentationsmethode und Apparat für Autosimulator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2007017596A2 *

Also Published As

Publication number Publication date
WO2007017596A2 (fr) 2007-02-15
FR2889753A1 (fr) 2007-02-16
WO2007017596A3 (fr) 2007-04-05

Similar Documents

Publication Publication Date Title
KR102281026B1 (ko) 홀로그램 앵커링 및 동적 포지셔닝 기법
US20170161939A1 (en) Virtual light in augmented reality
EP2237231B1 (de) Vorrichtung zur Echtzeit-Bestimmung der Positions- und Orientierungsdaten einer Vorrichtung in einer realen Szene
US20170372457A1 (en) Sharp text rendering with reprojection
US20110084983A1 (en) Systems and Methods for Interaction With a Virtual Environment
EP1852828A2 (de) Informationsverarbeitungsvorrichtung und Steuerungsverfahren dafür, Bildverarbeitungsvorrichtung, Computerprogramm und Speichermedium
US20150116354A1 (en) Mixed reality spotlight
CN107636534A (zh) 一般球面捕获方法
GB2481366A (en) 3D interactive display and pointer control
EP1913559A2 (de) Verfahren und einrichtungen zum visualisieren eines digitalen modells in einer realen umgebung
FR3041804A1 (fr) Systeme de simulation tridimensionnelle virtuelle propre a engendrer un environnement virtuel reunissant une pluralite d'utilisateurs et procede associe
EP3692710A1 (de) Beleuchtung für eingefügten inhalt
EP1913764A2 (de) Einrichtung zum visualisieren eines virtuellen passagierabteils eines fahrzeugs in einer realen situation
EP1913765A2 (de) Verfahren und vorrichtungen zur sichtbarmachung einer echten fahrgastzelle in einer synthetischen umgebung
WO2017187095A1 (fr) Dispositif et procede de partage d'immersion dans un environnement virtuel
FR2988962A1 (fr) Procede et dispositif de creation d'images
WO2017149254A1 (fr) Dispositif d'interface homme machine avec des applications graphiques en trois dimensions
WO2020141269A1 (fr) Dispositif de confort visuel
CN212873085U (zh) 一种抬头显示系统
US11656679B2 (en) Manipulator-based image reprojection
EP3996075A1 (de) Bilddarstellungssystem und -verfahren
Chow et al. Human visual perception of region warping distortions
FR2858868A1 (fr) Procede et dispositif de generation d'elements specifiques, et procede et dispositif de generation d'images de synthese comportant de tels elements specifiques
FR2889760A1 (fr) Systeme permettant d'ajouter des elements virtuels a une maquette reelle
FR2967519A1 (fr) Procede d'affichage d'un objet virtuel sur un ecran, procede d'emission de coordonnees d'un objet virtuel, procede d'echange de telles coordonnees, dispositif et programme d'ordinateur associes

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080221

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20081106

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: QUALCOMM CONNECTED EXPERIENCES, INC.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: QUALCOMM INCORPORATED

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20191203