EP1960159A2 - Betrachtungssystem zur manipulierung eines objekts - Google Patents

Betrachtungssystem zur manipulierung eines objekts

Info

Publication number
EP1960159A2
Authority
EP
European Patent Office
Prior art keywords
space
manipulation
user
image
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06841920A
Other languages
English (en)
French (fr)
Inventor
Olivier Kleindienst
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
KOLPI SARL A CV
Original Assignee
KOLPI SARL A CV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by KOLPI SARL A CV filed Critical KOLPI SARL A CV
Publication of EP1960159A2

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1689 Teleoperation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40122 Manipulate virtual object, for trajectory planning of real object, haptic display
    • G05B2219/40126 Virtual landmarks, reference points for operator

Definitions

  • the present invention relates to the field of stereoscopy and more particularly to a method or a system for acting remotely to modify the properties of an object.
  • Stereoscopy is a method of restoring a sensation of relief by fusing two flat images of the same subject.
  • Stereoscopy thus allows a 3-dimensional representation to be built from two image sources (an illustrative depth-from-disparity sketch is given after this list).
  • Stereoscopy is used in video games to represent the game environment in which the user will play.
  • Stereoscopy is also used in the field of petroleum research to locate deep aquifers.
  • Stereoscopy is a very important tool in the field of mechanics, for design reviews, virtual fluid flows and simulator training.
  • an object of the present invention is to provide a system that makes it possible to perform a precise three-dimensional manipulation, in a volume located around the user's accommodation distance, on an object visualized in a device adapted for this purpose.
  • the manipulation space is a volume spatially coherent with the viewing space.
  • Another advantage is that the manipulation space is free, i.e. empty. Thus neither the hands nor the real, physical sensor-actuators are visualized on the display device. The representation of the hands is thus hidden.
  • this device is a multi-universe device that can combine real and virtual spaces.
  • downscaling is possible in order to adapt, for example, the manipulation of an object of a real space to the manipulation of an object of a virtual space (see the scaling sketch after this list).
  • An additional advantage is the high accuracy of the resolution, since the density of voxels is greater than ten thousand voxels per cubic centimeter, i.e. a voxel pitch finer than about 0.46 mm (the cube root of 1/10000 cm³).
  • the display device is compatible with computer applications.
  • FIG. 1 shows a side view of a user sitting in front of a display device;
  • FIG. 2 represents an exploded view of the display device, oriented along its rear face;
  • FIG. 3 represents a view from above of one of the elements of the display device;
  • FIG. 4 represents the display device oriented along its front face;
  • FIG. 5 shows the connection diagram between the display device, a computer and different peripherals;
  • FIG. 6 represents a sensor-actuator manipulated by the user;
  • FIG. 7 shows the optical process of the manipulation system;
  • FIG. 7bis shows the optical paths in the display device;
  • FIG. 8 represents the top view of FIG. 7bis;
  • FIG. 9 represents the haptic process of the manipulation system.
  • Figure 1 shows a user 1 in front of a device 2.
  • the user can be seated on a seat 3 or standing in front of the device.
  • a support 4 is provided under the device.
  • an articulated arm 5 is fixed to it at one of its ends, the other end of the arm being directly fixed on a location 6 chosen by the user (table, desk, ...).
  • the attachment of this support is located on the rear part of the device.
  • this device 2 is first of all constituted by a base 7.
  • This base has the shape of a half-disk (FIG. 3).
  • the rectilinear side 7a of the half-disc is the front part of the base.
  • the rounded side 7b of the half-disc is the rear part of the base.
  • the rear and front parts of the base are joined, forming at their two ends a slight rectangular offset 7c.
  • In the middle of the rounded side of this half-disc, the edge of the surface describes a rounded shape 7d oriented towards the inside of the half-disc. Facing this rounded shape, a more rectangular shape 7e is cut in the middle of the rectilinear side of the half-disc. On each side of this rectangular shape, the straight side of the half-disc has a slightly bent shape 7f.
  • This central support 8 is composed of two rectangular panels joined along their lengths. Each panel is fixed by its width on the base by means of screws. This central support is fixed in the center of the half-disc 7, between the rounded shape 7d and the rectangular shape 7e. The panels form between them an angle of about 45°. The angular opening of the panels is directed towards the rounded shape 7d of the base 7.
  • the panels of this central support 8 each comprise a central mirror 9.
  • Each lateral support 10 comprises two parallelepipeds connected to each other by a Z-shaped structure. Each lateral support is oriented at approximately 45° with respect to the rectilinear side 7a of the base, towards the interior of this base.
  • Each lateral support also comprises a lateral mirror 11. This mirror has a reflectivity ranging from 0 to 100%. This lateral mirror can be replaced by a screen, a filter or even lenses.
  • the lateral mirror and the parallelepipeds are fixed on the base 7 by means of screws 20 and pins 21.
  • two "Cinch" connectors 19 are arranged on a video plate 23.
  • At the front of the lateral supports 10 and the central support 8 is the screen support 12 of the screen 13.
  • This screen support 12 has a length equal to that of the rectilinear portion 7a of the base.
  • This screen support 12 is fixed on the front part of the base by means of screws 20.
  • At one of its ends, on the side of the "Cinch" connectors 19, the screen support 12 comprises a "Bulgin" type switch 18 mounted on a switch plate 22 by means of screws.
  • a volume part taking up the contours of the base is arranged on the base so as to cover the central support, the lateral supports and the screen support.
  • This volume part comprises a belt 14 which constitutes the lateral surface of the volume part.
  • the belt 14 reproduces the cutout of the rounded shape 7d of the base.
  • facing the "Cinch" connectors 19, the belt 14 has an opening to allow the "Cinch" connectors of the lateral support 10 to engage.
  • the belt reproduces the rectangular shape 7e of the base. As shown in FIG. 4, at a certain height above the base and vertically aligned with this rectangular cutout, the belt 14 comprises a viewing window 15.
  • the volume part also comprises a lid 16 shown in FIG. 3. This lid is identical to the half-disk describing the base 7. On this lid 16 is a decorative plate 17.
  • the device 2 thus only reveals the lid 16, the outer face of the base 7 and the belt 14.
  • On the front part of the device, the belt comprises the viewing window 15, which gives access to the viewing space 15a.
  • On the back of the device, behind the belt 14, is the user's work area 15b, as will be detailed later.
  • the device is directly connected to an electronic central unit 25 such as a computer.
  • This computer includes various image processing software as well as graphics or electronic cards.
  • This computer also includes peripherals such as for example a keyboard 26, a mouse 28 and a monitor 27.
  • this computer also comprises a means for using the device 2: one or more sensor-actuators 29, as shown in FIG. 6.
  • This sensor-actuator 29 is composed of a support base 30 connected to the computer 25.
  • This base 30 comprises a slot 31 in which a pivoting member 32 is disposed.
  • This pivoting element has a length of a few centimeters. It is itself integral with an elongated element 33 about ten centimeters long.
  • This elongated element has an end 34 in the form of an open clamp.
  • This clamp 34 holds the end of an element 35 that looks like a pen. It is a stylus with six degrees of freedom in space. This stylus 35 also has a small slider 35a on its side, as will be detailed later. The user therefore acts by manipulating this stylus in the work area.
  • FIG. 7 shows the optical operating mode of the display device 2 in connection with the computer 25.
  • consider first a real object 36, for example a coffee machine, disposed at a greater or lesser distance from the display device 2. This object is located in a real orthonormal space 37 (xr, yr, zr).
  • the image of this object is visible in the viewing space 15a, in an orthonormal manipulation space 38 (xm, ym, zm). Indeed, at least two video image-capture systems 39 and 40, located on either side of the object, retransmit by a video link 40a the image of the real object 36 directly to the display device 2 on first inputs 41. These image-capture systems are, for example, video cameras, magnetic resonance imaging systems or scanners using lasers. Besides this first, real object 36, consider a second, for example virtual, object such as a coffee cup 42. This object is located in a virtual orthonormal space 43 (xv, yv, zv). The image of this object is visible in the viewing space 15a, in the same orthonormal manipulation space 38 (xm, ym, zm) as previously. Indeed, at least two virtual cameras 44 and 45, located on either side of the virtual object, send the image of the virtual object to the computer 25. This computer processes the images so as to retransmit them to the display device 2 on second inputs 47 via a link 45a.
  • the rear screen 60 of the display device generates two extreme rays 60a and 60b which correspond to an image coming from the rear of the display device.
  • this image is received by the right eye and the left eye of the user.
  • the front screen 13 of the display device generates, for example, a middle ray 13a which corresponds to an image coming from the front of the display device.
  • this image is received by the right eye and the left eye of the user.
  • the user 1 can manipulate the sensor-actuators 35 with his hands 48 without the image of the hands appearing in the viewing space 15a.
  • the display device 2 makes it possible to visualize the superposition of the two images, real and virtual. This is then augmented virtual reality.
  • the haptic operating mode described in FIG. 9 makes it possible to represent the use of the sensor-actuators 35 and the effect of this use on the objects viewed as images in the viewing space 15a.
  • These sensor-actuators 35 have a slider 35a which makes it possible to choose between a positioning action on the object in the displayed image and a movement action on the object in the displayed image (see the stylus-state sketch after this list).
  • the user can apply these actions (positioning or movement) either to the whole object or only to a part of it. The user can thus rotate an object, orient it differently, or exert pressure on a certain area of the object.
  • This manipulation can be performed either via a sensor or without one.
  • the user can, for example, directly use his fingers and perform any type of manipulation. He can also perform these manipulations via a glove. During this manipulation, the user visualizes the two images (real and virtual) superimposed in the viewing space.
  • the user acts, for example, on the real image and on the virtual image.
  • This action is transmitted in real space 50 and virtual space 51 after a processing by the computer 25.
  • a first function of the computer synchronizes over time the images of the real space with those of the virtual space after the changes made by the user on each object in each space (a minimal synchronization sketch is given after this list).
  • sensor-actuators 52, 53, 54 and 55 are located on either side of the real or virtual object.
  • Each of these sensor-actuators is provided with a processing unit 56.
  • the action transmitted in the real space 50 is received by the processing unit 56 of the real sensor-actuators 52 and 53.
  • These sensor-actuators can be for example articulated arms of robots or cylinders.
  • the action transmitted in the virtual space 51 is received by the processing unit 56 of the virtual sensor-actuators 54 and 55.
  • the sensor-actuators 52, 53, 54 and 55 are not visible since they intervene only digitally.
  • the real sensor-actuators 52 and 53 transmit force feedback signals.
  • the force feedback signal from the real sensor-actuator 52 is processed by its processing unit 56. The computer 25 then receives this signal and processes it. Indeed, a second function of the computer performs a scaling between the images of the real space and those of the virtual space before these two images are superimposed in the viewing space 15a (see the force-feedback sketch after this list).
  • the images are then transmitted to the processing unit 56 of the virtual sensor-actuator 54 as well as to the processing unit 56 of the sensor-actuator 35 of the manipulation space.
  • the user can then feel the force feedback in the viewing space 15a.
  • the sensor-actuators thus make it possible to capture the sense of touch.
  • This type of operation also applies if, instead of considering a real object and a virtual object, we consider two virtual objects (this is then dual virtual reality), a single virtual object (this is a stereoscopic screen) or a single real object (this is autostereoscopic television).

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Processing Or Creating Images (AREA)
  • Stereo-Broadcasting Methods (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
EP06841920A 2005-12-14 2006-12-13 Betrachtungssystem zur manipulierung eines objekts Withdrawn EP1960159A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0512676A FR2894684B1 (fr) 2005-12-14 2005-12-14 Systeme de visualisation pour la manipulation d'un objet
PCT/FR2006/002716 WO2007068824A2 (fr) 2005-12-14 2006-12-13 Systeme de visualisation pour la manipulation d'un objet

Publications (1)

Publication Number Publication Date
EP1960159A2 true EP1960159A2 (de) 2008-08-27

Family

ID=36168935

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06841920A Withdrawn EP1960159A2 (de) 2005-12-14 2006-12-13 Betrachtungssystem zur manipulierung eines objekts

Country Status (5)

Country Link
US (1) US8767054B2 (de)
EP (1) EP1960159A2 (de)
JP (1) JP2009519481A (de)
FR (1) FR2894684B1 (de)
WO (1) WO2007068824A2 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9643314B2 (en) * 2015-03-04 2017-05-09 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment
US11285607B2 (en) * 2018-07-13 2022-03-29 Massachusetts Institute Of Technology Systems and methods for distributed training and management of AI-powered robots using teleoperation via virtual spaces
JP7169130B2 (ja) * 2018-09-03 2022-11-10 川崎重工業株式会社 ロボットシステム

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5631973A (en) * 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
DE69332914T2 (de) * 1992-01-21 2004-02-26 Sri International, Menlo Park Chirurgisches System
JPH08280003A (ja) * 1995-04-07 1996-10-22 Olympus Optical Co Ltd 立体撮影装置
JP4100531B2 (ja) * 1998-08-11 2008-06-11 株式会社東京大学Tlo 情報提示方法及び装置
US6181768B1 (en) * 1999-06-04 2001-01-30 Leonard F. Berliner Radiological image acquisition and manipulation system for multiple view stereoscopic imaging
US6803928B2 (en) * 2000-06-06 2004-10-12 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Extended virtual table: an optical extension for table-like projection systems
WO2002005217A1 (en) * 2000-07-07 2002-01-17 Kent Ridge Digital Labs A virtual surgery system with force feedback
GB0211229D0 (en) * 2002-05-16 2002-06-26 Stereoscopic Image Systems Ltd Apparatus for the optical manipulation of a pair of landscape stereoscopic images
US20050285854A1 (en) * 2004-06-29 2005-12-29 Ge Medical Systems Information Technologies, Inc. 3D display system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2007068824A2 *

Also Published As

Publication number Publication date
WO2007068824A3 (fr) 2007-08-09
WO2007068824A2 (fr) 2007-06-21
US8767054B2 (en) 2014-07-01
US20090273665A1 (en) 2009-11-05
WO2007068824B1 (fr) 2007-10-18
JP2009519481A (ja) 2009-05-14
FR2894684A1 (fr) 2007-06-15
FR2894684B1 (fr) 2008-03-21

Similar Documents

Publication Publication Date Title
Ware et al. Visualizing graphs in three dimensions
Ware et al. Reevaluating stereo and motion cues for visualizing graphs in three dimensions
US9684994B2 (en) Modifying perspective of stereoscopic images based on changes in user viewpoint
AU2019206713A1 (en) Systems and methods for rendering data from a 3D environment
US20060250391A1 (en) Three dimensional horizontal perspective workstation
US20060126925A1 (en) Horizontal perspective representation
FR3041804A1 (fr) Systeme de simulation tridimensionnelle virtuelle propre a engendrer un environnement virtuel reunissant une pluralite d'utilisateurs et procede associe
Gerschütz et al. A review of requirements and approaches for realistic visual perception in virtual reality
EP2721444B1 (de) System zur positionierung eines berührungsbildschirms und eines virtuellen objekts sowie vorrichtung zur manipulation von virtuellen objekten mit einem solchen system
WO2007068824A2 (fr) Systeme de visualisation pour la manipulation d'un objet
EP2440975B1 (de) System und verfahren zur erzeugung stereoskopischer bilder unter verschiebung der objekte in der fernzone parametrierbare für ein system und verfahren mit n kameras (n>1)
EP2831847A1 (de) Verfahren und vorrichtung zur erzeugung von bildern
EP2994813B1 (de) Verfahren zur steuerung einer grafischen schnittstelle zur anzeige von bildern eines dreidimensionalen objekts
Garagnani et al. Virtual and augmented reality applications for Cultural Heritage
FR3056770A1 (fr) Dispositif et procede de partage d'immersion dans un environnement virtuel
FR2971864A1 (fr) Equipement de realite virtuelle equipe d'un dispositif de contact et utilisation d'un tel dispositif dans un equipement de realite virtuelle
WO2017149254A1 (fr) Dispositif d'interface homme machine avec des applications graphiques en trois dimensions
Peterka et al. Personal varrier: autostereoscopic virtual reality display for distributed scientific visualization
EP1124212B1 (de) 3D-visuelle Präsentationsmethode und Apparat für Autosimulator
Manferdini et al. virtual exhibition and fruition of archaeological finds
Perelman et al. Designing an input device to interact with multidimensional data: disco
FR2947348A1 (fr) Dispositif pour manipuler et visualiser un objet virtuel
CN201796580U (zh) 立体数码相架和电子照片视觉显示系统
EP4352953A1 (de) System zur erfassung von 2d-digitalbildern und zur simulation von 3d-digitalbildern und -sequenzen
WO2022261105A1 (en) 2d digital image capture system and simulating 3d digital image and sequence

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080613

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20151218

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160429