WO2007114225A1 - Stereoscopic display of two-dimensional images - Google Patents


Info

Publication number
WO2007114225A1
Authority
WO
WIPO (PCT)
Prior art keywords
image display
stereoscopic
dimensional
dimensional image
image
Application number
PCT/JP2007/056816
Other languages
English (en)
Japanese (ja)
Inventor
Isao Tomisawa
Masaru Ishikawa
Original Assignee
Pioneer Corporation
Application filed by Pioneer Corporation
Publication of WO2007114225A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0006Arrays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images

Definitions

  • the present invention relates to a stereoscopic two-dimensional image display device that displays a two-dimensional image in a pseudo-stereoscopic manner.
  • Patent Document 1: JP 2001-255493 A
  • Patent Document 2: JP 2003-98479 A
  • The above-described conventional stereoscopic two-dimensional image display device simply displays a two-dimensional image in three dimensions, and the observer merely views the displayed stereoscopic two-dimensional image from a passive position. That is, the observer cannot act on the displayed stereoscopic two-dimensional image, so the device does not serve as an interactive communication tool. A stereoscopic two-dimensional image display device that can act as an interactive communication tool is therefore desired.
  • The present invention has been made in view of the above circumstances, and one example of the problem it addresses is to provide a stereoscopic two-dimensional image display device capable of presenting a force sensation to an observer.
  • In order to achieve the above object, the stereoscopic two-dimensional image display device according to claim 1 comprises: stereoscopic image display means having a display unit with an image display surface for displaying a two-dimensional image and an image transmission panel disposed at a distance from the image display surface, the means displaying a stereoscopic two-dimensional image by forming the light emitted from the image display surface into an image on an imaging plane in the space on the side of the panel opposite the display unit; position detection means for detecting the position of a detected object in the vicinity of the imaging plane; force sense presentation means for applying a force to an acted-on object that is part or all of the detected object; and force sense control means for controlling, according to the position of the detected object detected by the position detection means, the magnitude and direction of the force applied to the acted-on object by the force sense presentation means.
  • That is, the stereoscopic two-dimensional image display device includes a display unit having an image display surface for displaying a two-dimensional image, and an image transmission panel arranged at a distance from the image display surface.
  • The stereoscopic two-dimensional image is displayed by forming the light emitted from the image display surface into an image on an imaging plane in the space on the side of the image transmission panel opposite the display unit.
  • The device further includes force sense control means for controlling the magnitude and direction of the force applied to the acted-on object.
  • FIG. 1 is a configuration diagram of a stereoscopic two-dimensional image display device according to an embodiment of the present invention.
  • FIG. 2 is a diagram for explaining an example of position detection and force sense presentation in the stereoscopic two-dimensional image display device according to the embodiment of the present invention.
  • FIG. 3 is a diagram illustrating an example of the stereoscopic two-dimensional image display device according to the embodiment of the present invention.
  • FIG. 4 is a diagram for explaining a specific example in which the force sensation is changed according to the position of a finger in the embodiment shown in FIG. 3.
  • FIG. 5 is a diagram for explaining another example of the stereoscopic two-dimensional image display device according to the embodiment of the present invention.
  • FIG. 6 is a diagram for explaining a specific example in which the force sensation is changed according to the displayed image in the embodiment shown in FIG. 5.
  • FIG. 7 is a diagram for explaining another example of the stereoscopic two-dimensional image display device according to the embodiment of the present invention.
  • FIG. 8 is a diagram for explaining a specific example in which the image is changed according to the position of the finger to change the click sensation in the embodiment shown in FIG. 7.
  • FIG. 9 is a diagram showing an example of the shape of an object used in the stereoscopic two-dimensional image display device according to the embodiment of the present invention.
  • FIG. 1 is a configuration diagram of a stereoscopic two-dimensional image display apparatus 100 according to an embodiment of the present invention.
  • The stereoscopic two-dimensional image display device 100 is a pseudo-stereoscopic image display device that displays, on a predetermined plane in space, a two-dimensional image that appears stereoscopic to the observer.
  • The stereoscopic two-dimensional image display device 100 includes, in a housing 1, a stereoscopic image display unit 2 that comprises a display unit 10 and a microlens array 20 (image transmission panel) and displays a two-dimensional image on a plane in space.
  • The stereoscopic two-dimensional image display device 100 also includes, on the housing 1, a sound output unit 50 that outputs sounds such as sound effects and human voices, a position detection sensor 40 that is arranged in association with the position of the two-dimensional image displayed in space and detects the position of a detected object 70 (for example, a finger) of the observer H, and a haptic device 60 that applies a force to an acted-on object 71 (for example, a metal piece) attached to the detected object 70.
  • Here, the force sensation means the sensation that arises when an object is touched, that is, the sensation by which the hardness or shape of the object is recognized. In other words, when the observer H touches the displayed pseudo-stereoscopic image, the stereoscopic two-dimensional image display device 100 applies a force to the observer H and presents the sensation of touching an object.
  • The device further includes a display drive unit 3 that drives and controls the display unit 10, a sensor driving unit 4 that drives the position detection sensor 40 and receives the detection signal obtained through it, an audio driving unit 5 that drives and controls the sound output unit 50, a haptic device driving unit 6 that drives and controls the haptic device 60, an image generation unit 7 that generates the image data to be displayed, and a control unit 8 that performs overall control of image display, audio output, and force presentation; these are provided adjacent to the housing 1.
  • Note that the display drive unit 3, the sensor driving unit 4, the audio driving unit 5, the haptic device driving unit 6, the image generation unit 7, and the control unit 8 may all be housed in the same casing.
  • The display unit 10 is provided along one inner surface of the housing 1 facing the observer H, and has an image display surface 11 for displaying two-dimensional images.
  • The display unit 10 includes a display such as a liquid crystal display, an EL (electroluminescence) panel, or a CRT (cathode-ray tube), and displays an image corresponding to the drive signal from the display drive unit 3 on the image display surface 11. That is, light corresponding to the image is emitted from the image display surface 11.
  • As the display unit 10, a flat-panel display with a thin image display surface, such as a liquid crystal display or an EL panel, is suitable.
  • the microlens array 20 is configured by arranging two lens array halves 21 in parallel with each other.
  • Each lens array half 21 has, on both surfaces of a transparent substrate 22 made of glass or of a resin with excellent translucency, a plurality of micro convex lenses 23 of identical radius of curvature arranged adjacently in a matrix.
  • The optical axis of each micro convex lens 23 formed on one surface is aligned with the optical axis of the corresponding micro convex lens 23 formed on the other surface. That is, the pairs of micro convex lenses 23 adjusted to share an optical axis are arranged two-dimensionally (in the X-Y plane in FIG. 1) so that their optical axes are parallel to one another.
  • The microlens array 20 is arranged at a position parallel to and separated from the image display surface 11 of the display unit 10 by a predetermined distance (the working distance of the microlens array 20).
  • The microlens array 20 forms the light corresponding to the image emitted from the image display surface 11 of the display unit 10 into an image on a stereoscopic image display surface 30, a plane separated from the microlens array by a predetermined distance (its working distance) on the side opposite the image display surface 11. The image displayed on the image display surface 11 is thus displayed on the stereoscopic image display surface 30, a two-dimensional plane in space. Although the formed image is two-dimensional, it appears stereoscopic to the observer because it is displayed on a plane floating in space; in this description, the two-dimensional image displayed on the stereoscopic image display surface 30 is referred to as a stereoscopic two-dimensional image.
  • In this way, the display unit 10 and the microlens array 20 constitute the stereoscopic image display unit 2, which forms the light corresponding to an image on the stereoscopic image display surface 30 (that is, the imaging surface) to display a stereoscopic two-dimensional image.
  • Note that the stereoscopic image display surface 30 is a plane set virtually in space and has no physical substance; it is simply the plane in space defined by the working distance of the microlens array 20. The front surface of the housing 1 is therefore open so that the image displayed on the stereoscopic image display surface 30 can be seen from the front.
  • The position detection sensor 40 is disposed on the front surface of the housing 1 and detects the position of a detected object 70 (for example, a finger) of the observer H present in a predetermined detection area near the stereoscopic image display surface 30. For example, it may be a two-dimensional position detection sensor that detects the position within the stereoscopic image display surface 30 (X-Y coordinates in FIG. 1) of the detected object of an observer H touching the stereoscopic two-dimensional image.
  • Alternatively, a three-dimensional position detection sensor may be used that detects not only the position within the stereoscopic image display surface 30 but also the position (Z coordinate in FIG. 1) in the depth direction (Z direction in FIG. 1) perpendicular to it.
  • When an optical sensor is adopted as the two-dimensional position detection sensor, as shown in FIG. 2, an outer frame is provided around the detection plane (for example, the stereoscopic image display surface 30), and a frame-shaped two-dimensional position detection sensor 40 is arranged on it. That is, of the four sides of the outer frame, the light-emitting part of an X-direction detection line sensor is provided on one of the two sides parallel to the Y direction and its light-receiving part on the other, while the light-emitting part of a Y-direction detection line sensor is provided on one of the two sides parallel to the X direction and its light-receiving part on the other; the position coordinates in the X-Y plane at which the detected object 70 crosses the detection plane can thereby be determined.
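The frame-shaped line-sensor arrangement above could be decoded roughly as in the following sketch. The function names, the beam pitch, and the blocked-beam data layout are illustrative assumptions, not details from the patent.

```python
# Hypothetical decode for a frame-shaped optical line sensor such as
# position detection sensor 40: each side of the frame reports which
# light beams are interrupted, and the finger's coordinate is taken as
# the centre of the blocked run of beams.

def decode_position(blocked_x, blocked_y, pitch_mm=2.0):
    """blocked_x / blocked_y: lists of booleans, one per beam of the
    X- and Y-direction line sensors; True means the beam is interrupted.
    Returns (x, y) in millimetres, or None if nothing crosses the plane."""
    def centre(blocked):
        hits = [i for i, b in enumerate(blocked) if b]
        if not hits:
            return None
        return sum(hits) / len(hits) * pitch_mm
    x, y = centre(blocked_x), centre(blocked_y)
    if x is None or y is None:
        return None
    return (x, y)

# A finger blocking beams 4-6 on the X sensor and beam 10 on the Y sensor:
bx = [i in (4, 5, 6) for i in range(32)]
by = [i == 10 for i in range(32)]
print(decode_position(bx, by))  # (10.0, 20.0)
```

Averaging the blocked beam indices makes the decode tolerant of a finger that interrupts several adjacent beams at once.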
  • A three-dimensional position detection sensor can be configured by stacking multiple layers of the two-dimensional position detection sensor shown in FIG. 2 in the Z direction, so that a space including the stereoscopic image display surface 30 serves as the detection region.
  • In the above description, the detected object 70 of the observer H is detected by the frame-shaped position detection sensor 40, that is, by the two-dimensional or three-dimensional position detection sensor described above; however, it may be detected by other means. For example, the position of the detected object 70 of the observer H may be detected by a camera or the like (the position being detected by image recognition of its color and shape).
  • The haptic device 60 is arranged on the front surface of the housing 1 and applies a force to the acted-on object 71 of the observer H present in a predetermined detection area near the stereoscopic image display surface 30.
  • The acted-on object 71 is, for example, a piece of metal (magnetic material) attached to a finger or the like.
  • The detected object 70 and the acted-on object 71 may be one and the same, for example the finger itself.
  • There are two ways of applying a force to the acted-on object 71: applying it remotely from outside (see FIG. 1), and having a device attached to the detected object 70 itself generate the force (not shown).
  • When the force is applied remotely, the haptic device 60 may be, for example, an electromagnet that generates a magnetic force, a wind generator that generates wind, or the like.
  • When an electromagnet is used as the haptic device 60, a plurality of electromagnets are arranged on the outer frame around the stereoscopic image display surface 30; by passing a current through an electromagnet, a magnetic force can be applied to the acted-on object (metal or the like) 71 attached to the detected object (finger) 70.
  • As a result, the observer H can be given the sensation of being pulled in the direction of the energized electromagnet.
  • Similarly, a plurality of wind generators may be installed on the outer frame around the stereoscopic image display surface 30; by generating wind, a wind force can be applied to the acted-on object 71 (a sail-like wind pressure receiver, or the finger itself) attached to the detected object (finger) 70. As a result, the observer H can be given the sensation of being pushed away from the direction of the wind generator.
  • When a device attached to the detected object 70 itself generates the force, the haptic device 60 may be, for example, an electromagnet that generates a magnetic force, a wind generator that generates wind, or a vibration device.
  • When an electromagnet is used as the haptic device 60 in this configuration, a metal outer frame may be arranged around the stereoscopic image display surface 30, and a current passed through the electromagnet attached to the finger so that it is magnetically attracted to the metal outer frame.
  • When a wind generator is used as the haptic device 60, it may be attached to the finger so that the finger directly feels a reaction force in the direction opposite to the generated wind.
  • When a vibration device is used as the haptic device 60, it may be attached directly to the finger or hand, and a tactile sensation can easily be presented by imparting vibration to them.
  • In this case the direction of the applied force cannot be controlled accurately, but because human perception is strongly influenced by vision, a sensation of touching the stereoscopic two-dimensional image can still be given. It is also possible to let the observer H feel the approximate direction of the force by attaching multiple vibration devices to the finger or hand.
  • The haptic device 60 attached to the detected object 70 is connected to the haptic device driving unit 6 by wire or wirelessly and is driven and controlled by it.
  • The haptic device 60 shown in FIG. 2 has electromagnets 61 arranged on each side of the outer frame around the stereoscopic image display surface 30, and applies a force in a predetermined direction parallel to the stereoscopic image display surface 30, that is, within that surface. For example, when the force Fa in the +X direction is to be applied to the metal 71, the electromagnet 61a is driven; in this case the finger 70 is given the sensation of being pulled in the +X direction. When the electromagnet 61b is driven, the finger 70 is given the sensation of being pulled in the +Y direction. The electromagnets 61a and 61b may also be driven simultaneously, so that the force Fa acts in the +X direction and the force Fb acts in the +Y direction; in this case the finger 70 is given the sensation of being pulled diagonally upward to the right.
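The electromagnet selection just described can be sketched as a small mapping from a desired in-plane force to drive currents. The magnet names beyond 61a/61b, the pull directions, and the linear current scaling are assumptions made for illustration.

```python
# Illustrative sketch: four electromagnets on the frame pull the metal 71
# toward +X, -X, +Y and -Y respectively. To approximate a desired
# in-plane force, energise each magnet whose pull direction has a
# positive component along it, scaling the drive level with that
# component (an electromagnet can only pull, never push).

MAGNET_DIRS = {          # unit pull direction of each electromagnet
    "61a": (1.0, 0.0),   # pulls toward +X
    "61b": (0.0, 1.0),   # pulls toward +Y
    "61c": (-1.0, 0.0),  # pulls toward -X (name assumed)
    "61d": (0.0, -1.0),  # pulls toward -Y (name assumed)
}

def drive_currents(fx, fy, gain=1.0):
    """Return a dict of drive levels approximating force (fx, fy)."""
    currents = {}
    for name, (dx, dy) in MAGNET_DIRS.items():
        c = max(0.0, fx * dx + fy * dy) * gain  # keep only positive projections
        if c > 0.0:
            currents[name] = c
    return currents

print(drive_currents(1.0, 0.0))  # {'61a': 1.0}  -> pulled in +X only
print(drive_currents(1.0, 1.0))  # 61a and 61b together -> diagonal pull
```

Driving two adjacent magnets at once reproduces the diagonal pull of Fa plus Fb described above.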
  • The display drive unit 3 controls the driving of the display unit 10 according to the image sent from the control unit 8. More specifically, the display drive unit 3 receives the image data generated by the image generation unit 7 via the control unit 8, and displays an image corresponding to that image data on the image display surface 11 of the display unit 10.
  • The audio driving unit 5 drives and controls the sound output unit 50, and outputs predetermined sounds from the sound output unit 50 in response to instructions from the control unit 8.
  • The sensor driving unit 4 drives the position detection sensor 40, receives a detection signal indicating the detected position from the position detection sensor 40, and sends the received detection signal to the control unit 8.
  • The control unit 8 instructs the image generation unit 7 to generate an image corresponding to the detection signal and displays the generated image on the image display surface 11 of the display unit 10 via the display drive unit 3, outputs a sound corresponding to the generated image from the sound output unit 50, or applies a force corresponding to the detection signal to the acted-on object 71 via the haptic device 60.
  • The image generation unit 7 generates the image data to be displayed on the display unit 10, for example according to a predetermined program.
  • The image generation unit 7 may also include an image storage unit in which predetermined images are stored in advance, and output a stored image to the control unit 8 in response to an instruction from the control unit 8.
  • The control unit 8 receives the signal output from the position detection sensor 40 via the sensor driving unit 4, and switches the displayed image according to that output signal. That is, the control unit 8 controls the display unit 10 so as to change the image according to the position of the detected object 70 of the observer H, thereby changing the stereoscopic two-dimensional image formed on the stereoscopic image display surface 30 in space.
  • The control unit 8 also changes the force applied to the acted-on object 71 according to the output signal of the position detection sensor 40. That is, the control unit 8 controls the haptic device 60 so as to change the magnitude and direction of the generated force according to the position of the detected object 70 of the observer H, thereby changing the force sensation felt by the observer H.
  • Furthermore, the control unit 8 changes the force applied to the acted-on object 71 according to the content of the image data generated by the image generation unit 7. That is, the control unit 8 controls the haptic device 60 so as to change the magnitude and direction of the generated force according to the content of the image, thereby changing the force sensation felt by the observer H.
  • As described above, the stereoscopic two-dimensional image display device 100 can change the image according to the position of the detected object 70 of the observer H, change the force sensation presented to the observer H according to that position, or change the force sensation according to the content of the image. It can therefore achieve a more effective stereoscopic display and be used as a more realistic, highly entertaining interactive communication tool.
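The control flow attributed to control unit 8 can be summarised as a read-render-actuate loop. The sketch below uses stand-in classes for the sensor, image generation unit, display, and haptic device; every class and method name is an assumption introduced for illustration, as the patent specifies only the flow between the numbered units.

```python
class StubSensor:
    """Stand-in for position detection sensor 40 via sensor driving unit 4."""
    def read(self):
        return (5.0, 5.0)          # detected object 70 at this X-Y position

class StubImageGen:
    """Stand-in for image generation unit 7."""
    def render(self, pos):
        return f"frame@{pos}"      # image changes with the detected position
    def force_at(self, pos):
        return (0.0, -1.0)         # force depends on the image content there

class StubDisplay:
    """Stand-in for display unit 10 via display drive unit 3."""
    def __init__(self):
        self.last_frame = None
    def show(self, frame):
        self.last_frame = frame

class StubHaptic:
    """Stand-in for haptic device 60 via haptic device driving unit 6."""
    def __init__(self):
        self.last_force = None
    def apply(self, fx, fy):
        self.last_force = (fx, fy)

def control_step(sensor, image_gen, display, haptic):
    """One pass of the control flow described for control unit 8."""
    pos = sensor.read()
    if pos is None:                # nothing crossing the detection plane
        return
    display.show(image_gen.render(pos))
    fx, fy = image_gen.force_at(pos)
    haptic.apply(fx, fy)

display, haptic = StubDisplay(), StubHaptic()
control_step(StubSensor(), StubImageGen(), display, haptic)
print(display.last_frame, haptic.last_force)
```

Running the step repeatedly at the display frame rate would give the interactive behaviour the passage describes: image and force both track the finger.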
  • In the embodiments below, the position detection sensor 40 and the haptic device 60 are arranged as shown in FIG. 2: the position detection sensor 40 is mounted on the frame arranged around the stereoscopic image display surface 30, an electromagnet 61 is provided on each side of the outer frame near the periphery of the stereoscopic image display surface 30, and a force is applied to the metal 71 attached to the finger 70.
  • FIG. 3 is a diagram showing a schematic configuration of the stereoscopic two-dimensional image display device 100A of the first embodiment.
  • The stereoscopic two-dimensional image display device 100A includes, as the position detection sensor 40, a two-dimensional position detection sensor that can detect the position within the stereoscopic image display surface 30 (position coordinates on the X-Y plane), and, as the haptic device 60, a haptic device that applies force in directions parallel to the stereoscopic image display surface 30 (the X-Y plane).
  • As shown in FIG. 3, the stereoscopic two-dimensional image display device 100A is controlled so that a force F is applied to the metal 71 when the finger 70 of the observer H touches a predetermined stereoscopic two-dimensional image P displayed on the stereoscopic image display surface 30, and no force F is applied when the finger 70 is away from the stereoscopic two-dimensional image P. That is, the position of the finger 70 is detected by the position detection sensor 40, and when it is determined from the detected position coordinates that the stereoscopic two-dimensional image P is being touched, the force F is applied. When it is determined that the stereoscopic two-dimensional image P is being touched, the image P may also be changed at the same time the force F is applied.
  • The haptic control in the present embodiment can be variously modified. For example, when the finger 70 touching the stereoscopic two-dimensional image P moves, the image P may be moved in accordance with the direction of movement while the force F applied to the metal 71 is changed accordingly. The magnitude and direction of the force applied to the finger 70 touching the stereoscopic two-dimensional image P may also be changed according to the content of the displayed image P: if the image shows blowing wind, a force F is applied in the wind direction; if it shows flowing water, a force F is applied from upstream to downstream on the image, a strong force F for a rapid river and a weak force F for a slow one.
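The content-dependent rule above amounts to a lookup from what is drawn at the touched point to a force vector. The content tags and magnitudes below are invented for the example; the patent states only the principle that the force follows the image content.

```python
# Hedged illustration of "force follows image content": map the content
# at the touched point to the force the haptic device should exert.
# +X here stands for the wind direction, -Y for downstream.

CONTENT_FORCES = {
    "wind":        (1.0,  0.0),   # push along the wind direction
    "rapid_river": (0.0, -2.0),   # strong pull downstream
    "slow_river":  (0.0, -0.5),   # weak pull downstream
}

def force_for_content(tag):
    """Force (fx, fy) for the touched content; none outside the image."""
    return CONTENT_FORCES.get(tag, (0.0, 0.0))

print(force_for_content("rapid_river"))  # (0.0, -2.0)
print(force_for_content("background"))   # (0.0, 0.0)
```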
  • FIG. 4 is an example of a stereoscopic two-dimensional image P1 displayed by the stereoscopic two-dimensional image display device 100A.
  • the stereoscopic two-dimensional image P1 shown in FIG. 4 is an image of a waterfall.
  • As shown in FIG. 4(a), when the finger 70 is placed at the position of the waterfall in the stereoscopic two-dimensional image P1 (the portion where the water flows), the stereoscopic two-dimensional image display device 100A displays an image P1 in which splashes arise at the finger position, and applies a downward force F1 to the metal 71 attached to the finger 70, giving the sensation of putting a finger into water actually falling from above. Note that when the finger 70 is placed somewhere other than the waterfall position, no force is applied to the metal 71.
  • When the finger 70 is then pushed deeper into the waterfall, as shown in FIG. 4(b), the stereoscopic two-dimensional image display device 100A displays an image P1 in which the position where the splashes arise has moved accordingly, and applies a larger downward force F2 (> F1) to the metal 71 attached to the finger 70.
  • Conversely, when the finger 70 is pulled back, as shown in FIG. 4(c), a smaller downward force F3 (< F1) is applied to the metal 71 attached to the finger 70, again reproducing the sensation of water actually falling from above.
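The waterfall behaviour (F3 < F1 < F2 as the finger goes deeper) can be sketched as a force that grows with penetration depth. The linear scaling and all constants are assumptions for illustration; the patent only orders the three forces.

```python
# Sketch of the FIG. 4 waterfall: the downward force grows the deeper
# the finger sits in the falling water, and is zero outside the water.

def waterfall_force(depth_mm, base=0.2, per_mm=0.05):
    """Downward force (negative = downward) for a finger depth_mm into
    the waterfall; zero when the finger is outside the water."""
    if depth_mm <= 0:
        return 0.0
    return -(base + per_mm * depth_mm)

f1 = waterfall_force(10)  # initial position (FIG. 4(a))
f2 = waterfall_force(30)  # pushed deeper (FIG. 4(b)): |F2| > |F1|
f3 = waterfall_force(2)   # pulled back (FIG. 4(c)):   |F3| < |F1|
assert abs(f2) > abs(f1) > abs(f3) > 0
print(f1, f2, f3)
```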
  • As described above, according to the stereoscopic two-dimensional image display device 100A of the first embodiment, by controlling the magnitude and direction of the force F acting on the acted-on object 71 of the observer H according to the position of the detected object 70 and the content of the stereoscopic two-dimensional image P, and applying a force F parallel to the stereoscopic image display surface 30, the observer H can be given the sensation of being pulled or pushed up, down, left, or right.
  • Note that the force applied to the metal 71 by the haptic device 60 need not always match the actual force (the force that would actually be felt if a finger were put into a waterfall); it may be weaker than the actual force. Because human perception is strongly influenced by vision, the observer H can feel a force sensation close to the real one even from a force that merely assists the visual impression.
  • FIG. 5 is a diagram showing a schematic configuration of the stereoscopic two-dimensional image display device 100B of the second embodiment.
  • The stereoscopic two-dimensional image display device 100B includes, as the position detection sensor 40, a two-dimensional position detection sensor capable of detecting the position within the stereoscopic image display surface 30 (position coordinates on the X-Y plane), and, as the haptic device 60, a haptic device that applies force not only in directions parallel to the stereoscopic image display surface 30 (the X-Y plane) but also in the depth direction (Z direction) perpendicular to it.
  • Specifically, haptic devices 60 (the haptic device 60 described in the first embodiment) are arranged both in front of and behind the stereoscopic image display surface 30 (in the Z direction), so that a sensation of being pushed back (a force acting in the +Z direction) and a sensation of being pulled in (a force acting in the -Z direction) can be given to the observer H.
  • By driving the electromagnets 61 of the haptic device 60 arranged on the side facing the observer H, a force in the +Z direction can be applied to the metal 71; by driving those on the opposite side, a force in the -Z direction can be applied.
  • A force F can also be applied in directions inclined up, down, left, or right, depending on the position of the acted-on object 71 and on which electromagnets are driven.
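Extending the in-plane scheme with the front and rear electromagnet banks of the second embodiment can be sketched as a bank selector for a 3D force. The bank names and the simple sign-based selection are assumptions for illustration.

```python
# Illustrative bank selection for the second embodiment: side banks give
# the in-plane pull, while banks in front of and behind the stereoscopic
# image display surface 30 add the Z component (push-back toward the
# observer is +Z, pull-in away from the observer is -Z).

def select_banks(fx, fy, fz):
    """Return which electromagnet banks to energise for force (fx, fy, fz)."""
    banks = []
    if fx > 0:
        banks.append("right")
    elif fx < 0:
        banks.append("left")
    if fy > 0:
        banks.append("top")
    elif fy < 0:
        banks.append("bottom")
    if fz > 0:
        banks.append("front")   # push the finger back toward the observer
    elif fz < 0:
        banks.append("rear")    # pull the finger in, away from the observer
    return banks

print(select_banks(0.0, 0.0, 1.0))   # ['front']
print(select_banks(1.0, 0.0, -0.5))  # ['right', 'rear']
```

Combining a side bank with a front or rear bank yields the inclined forces mentioned above.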
  • FIG. 6 shows an example of a stereoscopic two-dimensional image P2 displayed by the stereoscopic two-dimensional image display device 100B.
  • The stereoscopic two-dimensional image P2 shown in FIG. 6(a) is an image of a cloud, and the stereoscopic two-dimensional image P2 shown in FIG. 6(b) is an image of a box.
  • When the stereoscopic two-dimensional image display device 100B displays the cloud image P2 shown in FIG. 6(a) and the finger 70 of the observer H touches it, a weak force in the +Z direction is applied to the metal 71 attached to the finger 70, gently pushing the finger 70 back so that the cloud feels light and springy.
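The soft-versus-hard feel contrasted by the cloud and box images can be sketched as a spring-like push-back whose stiffness depends on the touched object. The stiffness values are invented, and the assumption that the box is meant to feel stiff in contrast to the cloud follows from the soft/hard contrast the embodiment draws.

```python
# Spring-like sketch of the FIG. 6 behaviour: the +Z push-back force is
# proportional to how far the finger has penetrated the image, with a
# per-object stiffness (illustrative values, not from the patent).

STIFFNESS = {"cloud": 0.1, "box": 2.0}   # assumed stiffness per object

def push_back(obj, penetration_mm):
    """+Z force pushing the finger back out of the touched image."""
    if penetration_mm <= 0:
        return 0.0
    return STIFFNESS[obj] * penetration_mm

print(push_back("cloud", 5))  # 0.5  -> soft, yielding
print(push_back("box", 5))    # 10.0 -> hard, resistant
```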
  • As described above, according to the stereoscopic two-dimensional image display device 100B of the second embodiment, by controlling the magnitude and direction of the force F acting on the acted-on object 71 of the observer H according to the position of the detected object 70 and the content of the stereoscopic two-dimensional image P, and applying forces up, down, left, right, and in the depth direction with respect to the stereoscopic image display surface 30, the observer H can be given the sensations of being pulled in and pushed back, and of softness or hardness when touching an object.
  • FIG. 7 is a diagram illustrating a schematic configuration of a stereoscopic two-dimensional image display device 100C according to the third embodiment.
  • The stereoscopic two-dimensional image display device 100C includes, as the position detection sensor 40, a three-dimensional position detection sensor capable of detecting positions in a three-dimensional space including the stereoscopic image display surface 30 (or detects the three-dimensional position using a camera or the like), and, as the haptic device 60, a haptic device that, as in the second embodiment, applies force in the depth direction (Z direction) perpendicular to the stereoscopic image display surface 30.
  • the display unit 10A includes a plurality of display units having parallel image display surfaces.
  • In addition to the first display unit 10, the second display unit 90 (the display unit located in the center) and the third display unit 90 (the display unit located on the rightmost side) each include an image display surface 91 for displaying a two-dimensional image, and are arranged on the optical axis of the image transmission panel 20, between the first display unit 10 and the image transmission panel 20.
  • The second display unit 90 and the third display unit 90 are light-transmissive from the surface facing the image display surface 11 toward their image display surfaces 91, and transmit the light emitted from the first display unit 10 (the third display unit 90 transmits both the light emitted from the first display unit 10 and the light emitted from the second display unit 90).
  • As the second display unit 90, for example, a light-transmitting organic EL display device, which is a self-luminous display device, is suitable.
  • Three parallel stereoscopic image display surfaces 30A, 30B, and 30C are defined according to the working distance of the microlens array 20, and a stereoscopic two-dimensional image P is displayed on each of the stereoscopic image display surfaces 30A, 30B, and 30C. That is, the image emitted from the image display surface 11 of the first display unit 10 is displayed on the first stereoscopic image display surface 30C, the image emitted from the image display surface 91 of the second display unit 90 is displayed on the second stereoscopic image display surface 30B, and the image emitted from the image display surface 91 of the third display unit 90 is displayed on the third stereoscopic image display surface 30A.
  • FIG. 8 shows an example of a stereoscopic two-dimensional image P4 displayed by the stereoscopic two-dimensional image display device 100C.
  • The stereoscopic two-dimensional image P4 shown in FIG. 8(a) is an image of a box displayed on the stereoscopic image display surface 30C (the surface closest to the observer H), and the stereoscopic two-dimensional image P4 shown in FIG. 8(b) is an image of a box displayed on the stereoscopic image display surface 30A (the surface farthest from the observer H).
  • First, the stereoscopic two-dimensional image display device 100C displays the stereoscopic two-dimensional image P4 on the stereoscopic image display surface 30C, which is the front surface. When the finger 70 of the observer H touches the stereoscopic two-dimensional image P4, a force in the +Z direction is applied to the subject 71 attached to the finger 70, making the observer H feel a repulsive force. When the observer H then pushes the finger 70 slightly further in (the position detection sensor 40 detects the movement of the finger 70 in the Z-axis direction), the stereoscopic two-dimensional image P4 displayed on the stereoscopic image display surface 30C is switched to the stereoscopic image display surface 30A, which is the back surface, and no force is applied to the subject 71 attached to the finger 70. That is, in the example shown in FIG. 8, when the observer H pushes the stereoscopic two-dimensional image P4, the image moves to the back, and as a result the finger 70 of the observer H can no longer touch it, producing the effect that the observer H no longer feels any rebound.
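The push-through interaction of FIG. 8 amounts to a small state machine: while the image is on the front surface 30C, touching it produces a repulsive force; once the finger passes a threshold, the image jumps to the back surface 30A and the force is cut off. The following sketch is an assumption-laden illustration (the threshold, force value, and class are not from the patent):

```python
FRONT_Z = 0.0          # z of stereoscopic image display surface 30C (assumed)
PUSH_THROUGH = -0.02   # depth past 30C that triggers the jump (assumed, metres)

class PushThroughController:
    def __init__(self):
        self.surface = "30C"          # image P4 starts on the front surface

    def update(self, finger_z):
        """Return the +Z force to apply to the subject 71 for this frame."""
        if self.surface == "30C":
            if finger_z <= PUSH_THROUGH:
                self.surface = "30A"  # image jumps to the back surface
                return 0.0            # no rebound once pushed through
            if finger_z <= FRONT_Z:
                return 0.5            # repulsive force while touching P4
        return 0.0                    # image on 30A: out of reach, no force

ctrl = PushThroughController()
ctrl.update(0.05)    # approaching: no force
ctrl.update(-0.01)   # touching P4 on 30C: repulsive force
ctrl.update(-0.03)   # pushed through: image moves to 30A, force stops
```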
  • According to the stereoscopic two-dimensional image display device 100C of the third embodiment, the three-dimensional position of the detected object 70 (the finger of the observer H) can be detected, and the stereoscopic two-dimensional image P can be displayed on any of the plurality of stereoscopic image display surfaces 30. More complex image control and haptic control, based on accurate position detection of the detected object 70, therefore become possible.
  • As described above, the stereoscopic two-dimensional image display device 100 includes: the display unit 10 having the image display surface 11 that displays a two-dimensional image; the image transmission panel 20, arranged apart from the image display surface 11; the stereoscopic image display unit 2, which displays a stereoscopic two-dimensional image by imaging the light emitted from the image display surface 11 on the imaging surface 30 in the space on the side of the image transmission panel 20 opposite the display unit 10; the haptic device 60, which applies a force to the subject 71 near the imaging surface 30; and the control unit 8, which controls the magnitude and direction of the force that the haptic device 60 applies to the subject 71 according to the content of the two-dimensional image displayed on the image display surface 11. A force sensation can thus be given to the observer, enabling more effective stereoscopic display and improving the sense of presence and amusement.
  • A glasses-free three-dimensional stereoscopic system can also be realized by methods other than the stereoscopic two-dimensional image display apparatus according to the embodiments and examples of the present invention. With those methods, however, the stereoscopic image cannot be touched directly and no tactile sensation can be obtained, unlike with the stereoscopic two-dimensional image display device according to the embodiments.
  • For example, the parallax barrier method and the lenticular method are known ways of realizing a glasses-free three-dimensional stereoscopic system. In both methods, the right eye is shown a right image and the left eye a left image.
  • With these methods, the focus position of the observer's eyes (the image display surface) differs from the position where the 3D image (virtual image) appears. If the observer tries to touch the 3D image, the focus position of the eyes shifts to the finger reaching toward the 3D image (virtual image), and the 3D image can no longer be viewed correctly. In other words, with the parallax barrier method or the lenticular method, it is not possible to directly touch and feel the stereoscopic image.
  • In contrast, the stereoscopic two-dimensional image displayed by the stereoscopic two-dimensional image display apparatus according to the embodiments and examples of the present invention is a real image formed by the microlens array 20. Even when the observer brings a hand to the position of the stereoscopic two-dimensional image, it can be touched directly without any sense of incongruity, and a tactile sensation can be obtained.
  • The magnitude of the force may also be controlled by feeding back the position of the detected object 70 in this way. That is, the current flowing through the electromagnet 61 is controlled according to the distance between the subject 71 and the electromagnet 61. If the same current were always applied to the electromagnet 61, the magnitude of the magnetic force would vary with the distance between the electromagnet 61 and the subject 71 (the magnetic force is inversely proportional to the square of the distance). Therefore, to apply a constant force to the subject 71, the position of the detected object 70 must be fed back to the device driving unit 6, and the current flowing through the electromagnet 61 controlled according to the distance between the subject 71 and the electromagnet 61.
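A minimal sketch of this feedback, under the simplifying assumption that the force of the electromagnet 61 on the subject 71 follows F = k·I/d² (inversely proportional to the square of the distance d, as the text states, and taken here as proportional to the coil current I). The constant k, the current limit, and all numbers are illustrative assumptions:

```python
K = 4.0e-4        # device constant relating current, distance, and force (assumed)
I_MAX = 2.0       # maximum safe coil current in amperes (assumed)

def coil_current(target_force, distance):
    """Current to drive through electromagnet 61 so that the subject 71
    feels roughly `target_force` at `distance` metres from the magnet."""
    i = target_force * distance**2 / K   # invert F = K * I / d**2 for I
    return min(i, I_MAX)                 # clamp: large distances would demand huge currents

# The same 0.1 N target needs four times the current at twice the distance:
print(coil_current(0.1, 0.02))   # at 2 cm
print(coil_current(0.1, 0.04))   # at 4 cm -> 4x the current of the 2 cm case
```

This is why the device driving unit 6 needs the fed-back position: without knowing d, it cannot pick the current that keeps the felt force constant.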
  • Further, an alarm or warning may be given. For example, when a vibration device having a vibration function is attached to the finger 70 as the haptic device 60, a strong vibration may be applied if the finger 70 is inserted beyond the allowable depth, and the vibration stopped once the finger 70 is returned to the allowable position.
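The warning behaviour can be sketched as a simple threshold rule; the allowable depth and the amplitude scale are assumptions for illustration only:

```python
ALLOWED_DEPTH = 0.03   # finger 70 may go at most 3 cm past the surface (assumed)

def vibration_amplitude(depth):
    """Amplitude (0.0 = off, 1.0 = strong) for the vibration device worn on
    the finger 70, given how far the finger is inserted past the
    stereoscopic image display surface 30."""
    return 1.0 if depth > ALLOWED_DEPTH else 0.0
```

Driving the vibrator each frame with this function reproduces the described behaviour: strong vibration past the allowable depth, silence once the finger returns.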
  • The form of the subject 71 used in the above embodiments and examples is not particularly limited, and various forms can be adopted depending on the purpose.
  • For example, as shown in FIG. 9(a), the subject 71 may be shaped like a cap, that is, fitted over the finger so that it is fixed to the finger 70 to some extent. In this case, the observer H can feel even fine movements of the subject 71 as a haptic sensation.
  • Alternatively, the ring shape shown in FIG. 9(b) allows the subject 71 to move along the finger 70; that is, the ring diameter is made slightly larger than the thickness of the finger 70 so that the subject 71 can slide along the finger 70.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A stereoscopic two-dimensional image display (100) comprising a display section (10) having an image display surface (11) for displaying a two-dimensional image, and a microlens array (20) placed at a distance from the image display surface (11). The stereoscopic two-dimensional image display further comprises a stereoscopic image display section (2) for displaying a stereoscopic two-dimensional image by imaging the light emitted from the image display surface (11) onto an imaging surface (30) in a space located on the side of the microlens array (20) opposite the display section (10), a position sensor (40) for detecting the position of a detection object (70) of an observer touching the imaging surface (30), a force-sense device (60) for applying a force to a target body (71) attached to the detection object (70), and a control section (8) for controlling the magnitude and direction of the force applied to the body (71) by the force-sense device (60) according to the position of the detection object (70) detected by the position sensor (40).
PCT/JP2007/056816 2006-03-31 2007-03-29 Affichage stereo d'images bidimensionnelles WO2007114225A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-098040 2006-03-31
JP2006098040 2006-03-31

Publications (1)

Publication Number Publication Date
WO2007114225A1 true WO2007114225A1 (fr) 2007-10-11

Family

ID=38563493

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/056816 WO2007114225A1 (fr) 2006-03-31 2007-03-29 Affichage stereo d'images bidimensionnelles

Country Status (1)

Country Link
WO (1) WO2007114225A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012022639A (ja) * 2010-07-16 2012-02-02 Ntt Docomo Inc 表示装置、映像表示システムおよび映像表示方法
JP2012032964A (ja) * 2010-07-29 2012-02-16 Olympus Imaging Corp 表示装置
JP2014044662A (ja) * 2012-08-28 2014-03-13 Nec Casio Mobile Communications Ltd 電子機器、その制御方法及びプログラム
JP2019121388A (ja) * 2017-12-28 2019-07-22 イマージョン コーポレーションImmersion Corporation 仮想現実の長距離相互作用のためのシステム及び方法
KR20210050251A (ko) * 2019-10-28 2021-05-07 방신웅 자성체를 이용한 감각 제공장치
WO2023042489A1 (fr) * 2021-09-17 2023-03-23 株式会社Jvcケンウッド Dispositif de génération de sensation tactile, procédé de génération de sensation tactile et programme

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1062717A (ja) * 1996-08-20 1998-03-06 Nippon Sheet Glass Co Ltd 画像浮上表示装置
JP2000172430A (ja) * 1998-12-09 2000-06-23 Sony Corp 情報入力装置、その位置認識装置、その位置認識方法、仮想画像立体合成装置及び記憶媒体
JP2000250689A (ja) * 1999-03-03 2000-09-14 Minolta Co Ltd 仮想物体呈示システム
JP2000259074A (ja) * 1999-03-11 2000-09-22 Minolta Co Ltd 道具媒介型の力覚呈示システム
JP2002304246A (ja) * 2001-04-04 2002-10-18 Nippon Telegr & Teleph Corp <Ntt> 力覚提示装置及び仮想空間システム
JP2006048386A (ja) * 2004-08-05 2006-02-16 Nippon Telegr & Teleph Corp <Ntt> 力覚提示装置、仮想オブジェクト算出方法、および仮想オブジェクト算出プログラム

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1062717A (ja) * 1996-08-20 1998-03-06 Nippon Sheet Glass Co Ltd 画像浮上表示装置
JP2000172430A (ja) * 1998-12-09 2000-06-23 Sony Corp 情報入力装置、その位置認識装置、その位置認識方法、仮想画像立体合成装置及び記憶媒体
JP2000250689A (ja) * 1999-03-03 2000-09-14 Minolta Co Ltd 仮想物体呈示システム
JP2000259074A (ja) * 1999-03-11 2000-09-22 Minolta Co Ltd 道具媒介型の力覚呈示システム
JP2002304246A (ja) * 2001-04-04 2002-10-18 Nippon Telegr & Teleph Corp <Ntt> 力覚提示装置及び仮想空間システム
JP2006048386A (ja) * 2004-08-05 2006-02-16 Nippon Telegr & Teleph Corp <Ntt> 力覚提示装置、仮想オブジェクト算出方法、および仮想オブジェクト算出プログラム

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012022639A (ja) * 2010-07-16 2012-02-02 Ntt Docomo Inc 表示装置、映像表示システムおよび映像表示方法
KR101298848B1 (ko) * 2010-07-16 2013-08-23 엔티티 도꼬모 인코퍼레이티드 표시 장치, 영상 표시 시스템, 및 영상 표시 방법
US8866739B2 (en) 2010-07-16 2014-10-21 Ntt Docomo, Inc. Display device, image display system, and image display method
JP2012032964A (ja) * 2010-07-29 2012-02-16 Olympus Imaging Corp 表示装置
JP2014044662A (ja) * 2012-08-28 2014-03-13 Nec Casio Mobile Communications Ltd 電子機器、その制御方法及びプログラム
JP2019121388A (ja) * 2017-12-28 2019-07-22 イマージョン コーポレーションImmersion Corporation 仮想現実の長距離相互作用のためのシステム及び方法
US10558267B2 (en) 2017-12-28 2020-02-11 Immersion Corporation Systems and methods for long-range interactions for virtual reality
US10747325B2 (en) 2017-12-28 2020-08-18 Immersion Corporation Systems and methods for long-range interactions for virtual reality
KR20210050251A (ko) * 2019-10-28 2021-05-07 방신웅 자성체를 이용한 감각 제공장치
KR102268293B1 (ko) 2019-10-28 2021-06-23 방신웅 자성체를 이용한 감각 제공장치
WO2023042489A1 (fr) * 2021-09-17 2023-03-23 株式会社Jvcケンウッド Dispositif de génération de sensation tactile, procédé de génération de sensation tactile et programme

Similar Documents

Publication Publication Date Title
US8866739B2 (en) Display device, image display system, and image display method
US8854331B2 (en) Method and apparatus for providing haptic feedback utilizing multi-actuated waveform phasing
JP5087632B2 (ja) 画像表示装置
JP2005141102A (ja) 立体的二次元画像表示装置及び方法
JP2008015188A (ja) 画像提示装置および画像提示方法
US20170150108A1 (en) Autostereoscopic Virtual Reality Platform
WO2007114225A1 (fr) Affichage stereo d'images bidimensionnelles
JP4083829B2 (ja) 立体画像表示装置
JP4939543B2 (ja) 画像表示装置
JPWO2009025034A1 (ja) 画像表示装置
WO2007100204A1 (fr) Dispositif de réalité virtuelle fondé sur la stéréovision
KR102082781B1 (ko) 폴더블 디스플레이 장치 및 이의 제어 방법 및 장치.
KR20120071895A (ko) 촉감 제시 장치, 촉감 셀, 및 촉감 제시 장치를 제어하는 방법
WO2011086874A1 (fr) Dispositif d'affichage et procédé d'affichage
JP4284158B2 (ja) 立体的二次元画像表示システム及び画像表示方法
Bohdal Devices for Virtual and Augmented Reality
US10664103B2 (en) Curved display apparatus providing air touch input function
JP6807144B1 (ja) 画像表示装置
WO2015196877A1 (fr) Plate-forme à réalité virtuelle autostéréoscopique
CN216052966U (zh) 视触觉反馈系统
JP2007206519A (ja) 画像表示装置
KR101409845B1 (ko) 영상 표시 장치 및 방법
WO2007088779A1 (fr) Affichage d'image bidimensionnelle stereoscopique
Kim et al. A tangible floating display system for interaction
US20240094817A1 (en) Provision of feedback to an actuating object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07740254

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 07740254

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)