EP1913765A2 - Method and devices for visualizing a real passenger compartment in a synthetic environment - Google Patents

Method and devices for visualizing a real passenger compartment in a synthetic environment

Info

Publication number
EP1913765A2
EP1913765A2 (application number EP06794314A)
Authority
EP
European Patent Office
Prior art keywords
video stream
image
images
screen
synthesis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06794314A
Other languages
English (en)
French (fr)
Inventor
Valentin Lefevre
Jean-Marie Vaidie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Total Immersion
Original Assignee
Total Immersion
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Total Immersion filed Critical Total Immersion
Publication of EP1913765A2 publication Critical patent/EP1913765A2/de
Withdrawn legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay

Definitions

  • the present invention relates to the testing and validation of the ergonomics of vehicle interiors such as cockpits and dashboards, and more particularly to a method and devices for testing and validating the ergonomics of a real passenger compartment in a synthetic environment.
  • testing and validating the ergonomics of a cockpit or dashboard has evolved from the real world to a virtual world.
  • the testing and validation of a cockpit or a real dashboard requires the manufacture of this cockpit or this dashboard and possibly its mounting within the vehicle for which it is intended.
  • the cockpit or dashboard can only be tested in a limited number of situations due to, among other things, safety issues.
  • the use of a virtual model offers many advantages. In particular, a modification does not require building a new physical model, which reduces design time and cost.
  • the testing and validation of a virtual cockpit or dashboard is often done through the use of video projectors and screens forming an immersive room.
  • even if the computer-generated image used to test and validate the cockpit or dashboard is as close to reality as possible, the tester's actual interaction with the cockpit or dashboard is not reproduced. There is therefore a need to optimize the testing and validation of cockpit ergonomics.
  • the invention solves at least one of the problems discussed above.
  • the invention thus relates to a method for visualizing a real cabin in a synthesis environment, this method comprising the following steps,
  • the method according to the invention makes it possible to test and validate a passenger compartment in a synthesis environment, allowing the user real interaction, in real time, with the passenger compartment, without any viewing-angle constraint.
  • the step of inserting the synthesis image into the video stream comprises the step of replacing the pixels having the color of the uniform color screen, in the video stream, with the corresponding pixels of the synthesis image.
  • the step of inserting at least one synthesis image into said at least one video stream implements a chroma-key type algorithm. According to these particular embodiments, the processing volume applied to the images is limited, thus allowing the implementation of the invention with standard calculation means.
  • the invention also relates to a computer program comprising instructions adapted to the implementation of each of the steps of the method described above.
  • the invention also relates to a device for visualizing a real cabin in a synthesis environment, this device comprising the following means, means for acquiring at least one video stream coming from mobile image display and acquisition means comprising at least one display screen, at least one image sensor and a motion capture device, the video stream comprising images of a scene, the scene including the actual cockpit and a uniform color screen at least partially surrounding the cockpit;
  • calculating means for inserting at least one synthesis image into the video stream, according to the data characterizing the movements of the image sensor and according to the presence of the uniform color screen on the images of the video stream;
  • the device according to the invention makes it possible to test and validate a passenger compartment in a synthesis environment, allowing the user real interaction, in real time, with the passenger compartment, without any viewing-angle constraint.
  • the calculation means for inserting at least one synthesis image into the video stream comprise means for replacing the pixels having the color of the uniform color screen, in the video stream, with the corresponding pixels of the synthesis image.
  • the device according to the invention comprises synchronization means between the calculation means and the mobile image display and acquisition means for reducing the delay between the reception of the video images and the availability of the resulting image including the real cockpit and the computer image and removing the artifacts during the user's movements.
  • the scanning frequency of the image sensor is substantially equal to the display screen scanning frequency.
  • the invention also relates to a mobile device for viewing and acquiring images comprising at least one display screen and at least one image sensor, this device comprising the following means,
  • motion capture means rigidly linked to the image sensor
  • the mobile image display and acquisition means comprise two display screens and two image sensors to allow stereoscopic vision.
  • FIG. 1 schematically shows the test and validation environment for the ergonomics of a passenger compartment
  • FIG. 2 illustrates the virtual reality helmet worn by the user during the test and validation of the ergonomics of a passenger compartment
  • FIG. 3 shows an example of the view a user testing the ergonomics of a passenger compartment may have without activating his virtual reality headset
  • FIG. 1 schematically shows the test and validation environment 100 for the ergonomics of a passenger compartment.
  • the user 105 in charge of performing the test and validation is wearing a virtual reality headset 110, also called head-mounted display (HMD).
  • the cabin 115 whose ergonomics must be studied.
  • it may surround the user 105 completely or partially.
  • an aircraft cockpit is more encompassing than a car dashboard.
  • a uniform color screen 120 is placed behind the passenger compartment 115.
  • the user 105 is seated in front of the real dashboard 115 of a vehicle being designed.
  • This dashboard can be more or less complex depending on the needs. It may, for example, include the vehicle's pillars (uprights) or not. It can also be complemented by a cabin surrounding the user 105 to a greater or lesser extent. A cylindrical screen of uniform color surrounds the dashboard.
  • FIG. 2 illustrates the virtual reality headset 110 worn on the head 105 'of the user 105.
  • the virtual reality headset 110 comprises a motion capture device, or motion sensor, making it possible to determine in real time the position and the orientation of the optical centers of the cameras used.
  • the motion sensor may be, for example, of the "LaserBIRD" type.
  • the motion sensor comprises a movable part 200-1 and a fixed part (not shown), the measured movement being the relative movement of the movable part relative to the fixed part.
  • the virtual reality headset 110 may be monoscopic or stereoscopic. If the virtual reality headset 110 is monoscopic, it includes a single camera.
  • the example of the virtual reality headset 110 shown in FIG. 2 is stereoscopic and comprises two cameras 205 and 210.
  • the mobile part of the motion capture device is integral with the cameras so that the displayed synthesis images are correctly registered with the video images.
  • the virtual reality headset also includes two display screens 215 and 220 for viewing images transmitted from a computer.
  • a link 225, wired or not, is used to exchange the necessary data between the virtual reality headset 110 and the computer to which it is connected.
  • a display screen is placed in front of the right eye of the user (screen 215), the other being placed in front of his left eye (screen 220).
  • the display screens and their connections may be, for example, of the Super eXtended Graphics Array (SXGA) type.
  • the principle according to the invention is to place the real passenger compartment, for example a dashboard or a cockpit, in front of a background of uniform color, for example a blue or a green background.
  • the video images from the two cameras 205 and 210 of the virtual reality headset are acquired and then processed according to the chroma-key algorithm.
  • the principle of chroma-key consists in replacing in an image all the points of a given color by the corresponding points of another video source.
  • a threshold having a predetermined value is used: if the distance between the color of a point and the given color is less than this threshold, the point is replaced; otherwise it is not modified.
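The thresholded replacement described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the Euclidean colour distance, the threshold value and all pixel values are assumptions for the example.

```python
import math

def chroma_key(camera, synthetic, key_color, threshold):
    """camera, synthetic: equal-length lists of (r, g, b) pixel tuples."""
    out = []
    for cam_px, syn_px in zip(camera, synthetic):
        # Euclidean distance between the pixel colour and the key colour
        if math.dist(cam_px, key_color) < threshold:
            out.append(syn_px)   # screen pixel: show the synthesis image
        else:
            out.append(cam_px)   # cockpit pixel: keep the real video
    return out

blue_key = (0, 0, 255)                        # uniform screen colour
camera = [(0, 0, 250), (200, 30, 40)]         # near-key pixel, dashboard pixel
landscape = [(90, 140, 60), (90, 140, 60)]    # synthesis image pixels
print(chroma_key(camera, landscape, blue_key, threshold=30))
# [(90, 140, 60), (200, 30, 40)]
```

The first pixel lies within the threshold of the blue key colour and is replaced by the landscape pixel; the dashboard pixel is kept unchanged.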
  • FIG. 3 shows an example of the view that the user 105 may have without activating his virtual reality headset 110.
  • When the user 105 is seated in his place in front of the dashboard 115, he sees this dashboard with its instruments and controls in front of him. Through the uprights 300, the user sees the uniform color screen 120.
  • When the virtual reality headset 110 is activated, the computers receive the images coming from the cameras 205 and 210. Each computer processes these video images, inserts a synthesis image into them, and transmits the result to the display screens 215 and 220. The view transmitted to the user is thus the one he would have without the virtual reality headset, in which the uniform color of the screen 120 is replaced by computer graphics.
  • These computer-generated images may represent, for example, a landscape or a street that scrolls.
  • FIG. 4 schematically illustrates the device of the present invention.
  • This device comprises the virtual reality headset 110, the motion sensor having a fixed portion 200-2 and a movable portion 200-1 integral with the cameras 205 and 210 attached to the virtual reality headset, and two computers 400 and 405 of the personal computer (PC) type.
  • One of the computers is associated with one of the virtual reality headset cameras while the other computer is associated with the other camera.
  • the computer connected to the camera associated with the left eye manages the image to be presented to the left eye
  • the other manages the image to be presented to the right eye.
  • if the virtual reality headset includes only one camera, only one computer is needed to manage the image that will be presented to both eyes.
  • the computers 400 and 405 each comprise calculation means and input and output peripherals.
  • the computers 400 and 405 each comprise,
  • a video input (400-1 and 405-1) connected to one of the cameras (205 and 210) of the virtual reality headset 110;
  • a video output 400-2 and 405-2, for example of the SXGA type, connected to one of the display screens (215 and 220) of the virtual reality headset 110;
  • a connection (400-3 and 405-3) to the other computer, for example an Ethernet port; and a synchronization connection (400-4 and 405-4), for example of the Genlock type, connected to the cameras (205 and 210) of the virtual reality headset 110 and to the synchronization connection of the other computer.
  • one of the computers 400 or 405 preferably comprises an input 400-5 connected to the motion capture device 200-2 to receive the position and orientation of the cameras 205 and 210. This information can be transmitted to the second computer via the connection between the two computers, for example via Ethernet.
  • Each computer is responsible for displaying in real time the video image from the corresponding camera, mixed with the synthesis images, on the corresponding display screen.
  • the images generated by the computers are at the resolution of the virtual reality headset 110.
  • each computer may be a personal computer having the following characteristics,
  • G-SYNC Genlock card
  • the virtual reality headset imposes the refresh rate, for example 60 Hz.
  • the video output of the computers must have a scanning frequency compatible with that of the virtual reality headset.
  • video cameras have a scanning frequency close to that of the virtual reality headset, for example 50 Hz in Europe and 60 Hz in the United States.
  • the video cameras and the video outputs of the computers are "genlockable", making it possible to synchronize the video cameras with the computers' video outputs.
  • the synchronization of the video cameras and the video outputs of the computers makes it possible to optimize the transmission time of the images (transport delay) and to obtain a stereoscopic restitution without artefact, even during the movements of the user.
  • the cameras can be SD resolution, for example NTSC Sony XC555 cameras with a Genlock input. They can also be non-interlaced, for example of the uEye 2220C type connected via USB 2.0.
  • the field of view of the cameras should be as close as possible to the field of view of the virtual reality headset for each eye, for example a 3.5 mm focal length lens on a NF mount (for a half-inch CCD sensor, a field of view greater than 100 degrees is obtained).
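As a rough check of the field-of-view figure above, the rectilinear angle of view can be computed from focal length and sensor size; the half-inch CCD dimensions used here (nominally 6.4 mm x 4.8 mm, 8 mm diagonal) are assumed typical values, not taken from the patent.

```python
import math

def fov_deg(sensor_dim_mm, focal_mm):
    # Rectilinear angle of view: 2 * atan(d / (2 * f))
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_mm)))

horizontal = fov_deg(6.4, 3.5)   # about 85 degrees
diagonal = fov_deg(8.0, 3.5)     # about 98 degrees
```

The rectilinear diagonal figure is just under 100 degrees; the strong distortion of such a short focal length can widen the usable field further, which is broadly consistent with the "greater than 100 degrees" stated above.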
  • Video input cards can have an HD-SDI input on a PCI-X or PCI-Express bus, such as the DeckLink HD card, or a Camera Link input.
  • each computer is equipped with a runtime version of the FUSION technology having the following functionalities,
  • the motion information, for example the motion capture stream delivered by the LaserBIRD sensor
  • interpolation and extrapolation techniques for dynamically synchronizing the video streams from the cameras with the synthesis image streams, by recalculating the positions and orientations of the motion sensor at the reception times of the video images
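A minimal sketch of that re-sampling step: the pose is interpolated linearly between the two motion-sensor samples bracketing the video frame's timestamp. Real systems also handle orientations properly (e.g. quaternion slerp); the timestamps and pose values below are invented for illustration.

```python
def interpolate_pose(t, t0, pose0, t1, pose1):
    """Linearly interpolate a pose (tuple of floats) for t0 <= t <= t1."""
    a = (t - t0) / (t1 - t0)
    return tuple(p0 + a * (p1 - p0) for p0, p1 in zip(pose0, pose1))

# Motion-sensor samples at t = 0 ms and t = 10 ms; a video image arrives
# timestamped at t = 4 ms, so the camera pose is recomputed for that instant.
pose = interpolate_pose(4.0, 0.0, (0.0, 0.0, 0.0), 10.0, (10.0, 0.0, 5.0))
# pose is approximately (4.0, 0.0, 2.0)
```

Extrapolation works the same way with t outside [t0, t1], predicting the pose slightly ahead when the latest sensor sample is older than the video frame.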
  • this second method requires less processor (CPU) time than the first one, so it is usually used.
  • the processing performances according to this method are optimized by the pixel processing modules (or pixel shaders).
  • the following table summarizes the performance of the solution according to the invention in the case of a virtual reality headset having a scanning frequency of 60 Hz and SXGA video inputs,
  • the processing time for video acquisition, the chroma-key algorithm and the video distortion processing is less than 5 ms with the pixel shader optimizations, the goal being to leave about 10 ms per visual cycle to display the polygons of the computer graphics database.
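The frame-time arithmetic implied above, at the 60 Hz refresh rate of the headset (a back-of-the-envelope sketch; only the 60 Hz and 5 ms figures come from the text):

```python
frame_period_ms = 1000 / 60            # about 16.7 ms per visual cycle at 60 Hz
video_processing_ms = 5                # acquisition + chroma-key + distortion
rendering_budget_ms = frame_period_ms - video_processing_ms   # about 11.7 ms
```

The remainder of roughly 11 to 12 ms is consistent with the stated goal of leaving about 10 ms per cycle for drawing the polygons of the synthetic environment, once other per-frame overheads are counted.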
  • the invention makes it possible to validate the design and ergonomics of the interior of a vehicle by visualizing the real passenger compartment within a synthetic environment, such as a road or buildings, without any viewing-angle constraint (the user can move his head with the six degrees of freedom corresponding to changes of position and orientation).

EP06794314A 2005-08-09 2006-08-09 Verfahren und vorrichtungen zur sichtbarmachung einer echten fahrgastzelle in einer synthetischen umgebung Withdrawn EP1913765A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0552476A FR2889754A1 (fr) 2005-08-09 2005-08-09 Systeme permettant a un utilisateur de visualiser un cockpit reel dans un environnement video, notamment un environnement de conduite automobile
PCT/FR2006/001932 WO2007017595A2 (fr) 2005-08-09 2006-08-09 Procede et dispositifs pour visualiser un habitacle reel dans un environnement de synthese

Publications (1)

Publication Number Publication Date
EP1913765A2 true EP1913765A2 (de) 2008-04-23

Family

ID=37663328

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06794314A Withdrawn EP1913765A2 (de) 2005-08-09 2006-08-09 Verfahren und vorrichtungen zur sichtbarmachung einer echten fahrgastzelle in einer synthetischen umgebung

Country Status (3)

Country Link
EP (1) EP1913765A2 (de)
FR (1) FR2889754A1 (de)
WO (1) WO2007017595A2 (de)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010016113A1 (de) 2010-03-24 2011-09-29 Krauss-Maffei Wegmann Gmbh & Co. Kg Verfahren zur Ausbildung eines Besatzungsmitglieds eines insbesondere militärischen Fahrzeugs
EP3379828A1 (de) 2017-03-23 2018-09-26 Thomson Licensing Vorrichtung und verfahren für immersive visuelle darstellungen und individuelle kopfausrüstung
CN112908084A (zh) * 2021-02-04 2021-06-04 三一汽车起重机械有限公司 作业机械的模拟训练系统、方法、装置和电子设备
CN113448445B (zh) * 2021-09-01 2021-11-30 深圳市诚识科技有限公司 一种基于虚拟现实的目标位置跟踪方法和系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9121707D0 (en) * 1991-10-12 1991-11-27 British Aerospace Improvements in computer-generated imagery
JP4878083B2 (ja) * 2001-03-13 2012-02-15 キヤノン株式会社 画像合成装置及び方法、プログラム
JP4022868B2 (ja) * 2002-11-13 2007-12-19 マツダ株式会社 企画支援プログラム、方法、装置並びに記憶媒体
JP4401727B2 (ja) * 2003-09-30 2010-01-20 キヤノン株式会社 画像表示装置及び方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2007017595A3 *

Also Published As

Publication number Publication date
WO2007017595A3 (fr) 2007-04-05
WO2007017595A2 (fr) 2007-02-15
FR2889754A1 (fr) 2007-02-16

Similar Documents

Publication Publication Date Title
US10867432B2 (en) Methods and systems for rendering virtual reality content based on two-dimensional (2D) captured imagery of a three-dimensional (3D) scene
US10990186B2 (en) Virtual reality head-mounted devices having reduced numbers of cameras, and methods of operating the same
CN106662749B (zh) 用于全视差光场压缩的预处理器
US20230111408A1 (en) Techniques for capturing and rendering videos with simulated reality systems and for connecting services with service providers
US9967555B2 (en) Simulation device
US12026833B2 (en) Few-shot synthesis of talking heads
US20240296626A1 (en) Method, apparatus, electronic device and storage medium for reconstructing 3d images
US20150035832A1 (en) Virtual light in augmented reality
CN107636534A (zh) 一般球面捕获方法
CN109074681A (zh) 信息处理装置、信息处理方法和程序
EP2104925A1 (de) Verfahren und vorrichtung zur echtzeiteinbettung von virtuellen objekten in einen bildstrom mithilfe von daten aus einer echten szene, die von den bildern dargestellt wird
CN107562185B (zh) 一种基于头戴vr设备的光场显示系统及实现方法
EP1913765A2 (de) Verfahren und vorrichtungen zur sichtbarmachung einer echten fahrgastzelle in einer synthetischen umgebung
US20130127994A1 (en) Video compression using virtual skeleton
FR2900261A1 (fr) Procede de traitement d'images d'un simulateur
EP2297705A1 (de) Verfahren zur echtzeitzusammenstellung eines videos
FR2988962A1 (fr) Procede et dispositif de creation d'images
EP1913764A2 (de) Einrichtung zum visualisieren eines virtuellen passagierabteils eines fahrzeugs in einer realen situation
FR3071650A1 (fr) Procede de realite augmentee pour la visualisation de plats de restaurant
FR3077910A1 (fr) Procede d'aide a la maintenance d'un systeme complexe
US11295531B1 (en) System and method for generating interactive virtual image frames in an augmented reality presentation
WO2010064479A1 (ja) デジタル映像再生装置
CN117768599A (zh) 处理图像的方法、装置、系统、电子设备和存储介质
JP2011164184A (ja) 画像表示装置、表示制御プログラム、及び画像表示システム
WO2019122440A1 (en) System and method for capturing and visualizing a 3d scene in real-time

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080221

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20081106

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20090519