WO2012053032A1 - Three-dimensional stereoscopic display device
- Publication number: WO2012053032A1 (application PCT/JP2010/006219)
- Authority: WIPO (PCT)
- Prior art keywords
- stereoscopic display
- image
- screen
- eye
- dimensional stereoscopic
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
Definitions
- The present invention relates to a three-dimensional (3D) stereoscopic display device for displaying a 3D stereoscopic image or a 3D stereoscopic video.
- The conventional stereoscopic display device disclosed in Patent Document 1 provides three-dimensional stereoscopic images mainly intended for home use.
- This stereoscopic display device is highly convenient because the viewer can see stereoscopic images without wearing dedicated glasses.
- It is therefore well suited as a content playback device for the passenger seat or as a display device for rear-seat entertainment (RSE: Rear Seat Entertainment).
- However, when the conventional technology represented by Patent Document 1 is applied to a display device that presents in-vehicle information to the driver, or to a meter panel, the convenience of the HMI (Human Machine Interface) must be considered. For example, if information displays, icons, and the like are placed on a three-dimensional stereoscopic image or video without taking the driver's state and situation into account, the operability of the display screen is impaired.
- An object of the present invention is to provide a three-dimensional stereoscopic display device capable of providing, by means of three-dimensional stereoscopic display, an HMI that the user can operate intuitively.
- The three-dimensional stereoscopic display device includes an operation input unit that receives user operations, and a reproduction processing unit that reproduces the right-eye and left-eye images or video for three-dimensional stereoscopic display of an input display target image or video.
- It further includes a stereoscopic display monitor unit that stereoscopically displays the right-eye and left-eye images or video reproduced by the reproduction processing unit, and a screen composition processing unit that, depending on the presence or absence of a user operation on the operation input unit, generates right-eye and left-eye images or video for three-dimensional stereoscopic display in which the position of the apparent display surface of the display target is moved relative to the screen of the stereoscopic display monitor unit, and outputs them to the reproduction processing unit.
- FIG. 4 is a flowchart showing the flow of the screen composition processing of the three-dimensional stereoscopic display device according to Embodiment 1. FIG. 5 is a diagram for explaining the screen composition processing that places the display position of the planar map behind the screen of the stereoscopic display monitor, and FIG. 6 shows the data flow in that processing.
- FIG. 7 is a diagram illustrating an overview of the screen composition processing by the three-dimensional stereoscopic display device according to Embodiment 1. FIG. 8 is a diagram for explaining the screen composition processing that places the display position of the planar map in front of the screen of the stereoscopic display monitor, and FIG. 9 shows the data flow in that processing. FIG. 10 is a block diagram showing another configuration of the in-vehicle information system using the three-dimensional stereoscopic display device according to Embodiment 1. FIG. 11 is a block diagram showing the configuration of an in-vehicle information system using the three-dimensional stereoscopic display device according to Embodiment 2 of the present invention.
- FIG. 14 is a diagram illustrating an outline of the screen composition processing by the three-dimensional stereoscopic display device according to Embodiment 2, and FIG. 15 shows a display example of icons when an operation is detected.
- FIG. 1 is a block diagram showing a configuration example of a stereoscopic display system using a three-dimensional stereoscopic display device according to the present invention.
- FIG. 1A shows a stereoscopic display system 1A that displays a stereoscopic image from left and right images captured by a binocular camera.
- the stereoscopic display system 1A includes a left-eye camera 2a, a right-eye camera 2b, a recording / photographing device 3, a screen composition processing unit 4, a video reproduction device 5, and a stereoscopic display monitor 6.
- The left-eye camera 2a and the right-eye camera 2b are arranged side by side at an interval that takes binocular parallax into account, and photograph the scenery A under the control of the recording/photographing device 3.
- the left and right video data of the landscape A photographed by the left-eye camera 2a and the right-eye camera 2b are recorded in the recording / photographing device 3.
- the screen composition processing unit 4 subjects the left and right video data read from the recording / photographing device 3 to a three-dimensional stereoscopic video composition process unique to the present invention, and outputs the resultant to the video reproduction device 5.
- the video reproduction device 5 reproduces the left and right video data processed by the screen composition processing unit 4 and outputs it to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproduction device 5 in a stereoscopic manner as viewed from the viewer.
- A stereoscopic display system 1B shown in FIG. 1B includes a stereoscopic video content receiver 7 that communicates with an external device via an antenna 7a, a screen composition processing unit 4, a video reproduction device 5, and a stereoscopic display monitor 6.
- the stereoscopic video content receiver 7 is a receiver that receives the stereoscopic video content including the left and right video data as described above from the external device via the antenna 7a.
- The screen composition processing unit 4 subjects the left and right video data of the stereoscopic video content received by the stereoscopic video content receiver 7 to the three-dimensional stereoscopic video composition processing unique to the present invention, and outputs the result to the video reproduction device 5.
- the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproduction device 5 in a stereoscopic manner as viewed from the viewer.
- A stereoscopic display system 1C shown in FIG. 1C includes a storage device 8 that stores stereoscopic display content, a screen composition processing unit 4, a video reproduction device 5, and a stereoscopic display monitor 6.
- the stereoscopic display content is content data including the left and right video data as described above.
- the storage device 8 may be an HDD (Hard Disk Drive) or a semiconductor memory that stores stereoscopic display content. Further, it may be a drive device that reproduces a storage medium such as a CD or DVD that stores stereoscopic display content.
- the screen composition processing unit 4 subjects the left and right video data of the stereoscopic display content read from the storage device 8 to the three-dimensional stereoscopic video composition processing unique to the present invention, and outputs it to the video reproduction device 5.
- the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproducing device 5 in a stereoscopic manner as viewed from the viewer.
- So-called three-dimensional data (for example, three-dimensional map data) may be stored as the stereoscopic display content, and the screen composition processing unit 4 may generate the left and right video data by calculating how the image indicated by the three-dimensional data is viewed from the left and right viewpoints.
- FIG. 2 is a diagram for explaining the principle of stereoscopic display on the stereoscopic display monitor, and shows an example of stereoscopic display with the naked eye.
- the stereoscopic display monitor 6 shown in FIG. 2 includes a liquid crystal display element group 6a and a parallax barrier unit 6b.
- The liquid crystal display element group 6a includes a right-eye liquid crystal element group having directivity such that the right-eye image reaches the right eye, and a left-eye liquid crystal element group having directivity such that the left-eye image reaches the left eye.
- the parallax barrier unit 6b is a visual field barrier that blocks light from a backlight (not shown in FIG. 2) in order to alternately display a right-eye image and a left-eye image.
- The left and right video data reproduced by the video reproduction device 5 are input to the stereoscopic display monitor 6 as alternating left-eye (L) and right-eye (R) video signals: L, R, L, R, and so on.
- the liquid crystal display element group 6a operates the left-eye liquid crystal element group when the left-eye (L) video signal is input, and operates the right-eye liquid crystal element group when the right-eye (R) video signal is input.
- The parallax barrier unit 6b blocks the backlight that has passed through the right-eye liquid crystal element group while the left-eye liquid crystal element group operates, and blocks the backlight that has passed through the left-eye liquid crystal element group while the right-eye liquid crystal element group operates.
- In this way, the right-eye video and the left-eye video are alternately displayed on the screen of the stereoscopic display monitor 6, and the stereoscopic video can be viewed from the viewpoint of the viewer shown in FIG. 2.
- the present invention is not limited to the stereoscopic display monitor 6 having the configuration shown in FIG. 2, and may be a monitor that realizes stereoscopic vision by another mechanism.
- For example, a method may be used in which a stereoscopic image is obtained by wearing dedicated glasses with different polarizing plates attached to the left and right lenses.
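The alternating L, R, L, R signal sequence described for the stereoscopic display monitor 6 can be sketched as follows. This is an illustrative model only; the function name and the frame representation are assumptions, not from the patent:

```python
def interleave_lr(left_frames, right_frames):
    """Merge per-eye frame lists into the alternating L, R, L, R, ... stream
    that the video reproduction device feeds to the stereoscopic monitor."""
    stream = []
    for l, r in zip(left_frames, right_frames):
        stream.append(("L", l))  # drives the left-eye liquid crystal element group
        stream.append(("R", r))  # drives the right-eye liquid crystal element group
    return stream

stream = interleave_lr(["L0", "L1"], ["R0", "R1"])
# stream == [("L", "L0"), ("R", "R0"), ("L", "L1"), ("R", "R1")]
```

The parallax barrier then ensures each tagged frame reaches only the matching eye.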
- FIG. 3 is a block diagram showing the configuration of the in-vehicle information system using the three-dimensional stereoscopic display device according to Embodiment 1 of the present invention.
- the in-vehicle information system 1 is a system that functions as the stereoscopic display system shown in FIG.
- The in-vehicle information system 1 includes a main CPU 4a, a video reproduction device 5, a stereoscopic display monitor 6, a GPS (Global Positioning System) receiver 9, a vehicle speed sensor 10, an internal memory 11, a CD/DVD drive device 12, an HDD 13, a radio receiver 14, a DTV receiver 15, an in-vehicle LAN_I/F unit 16, an operation input unit 18, an amplifier 19, and a speaker 20.
- The main CPU 4a is a CPU that controls each component in the in-vehicle information system 1.
- By executing the program 13d (an application program for in-vehicle information processing), the main CPU 4a functions as the screen composition processing unit 4.
- The video reproduction device 5 is a device that reproduces the left and right video data synthesized by the screen composition processing unit 4 of the main CPU 4a and outputs them to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 is a monitor that displays the left and right video data reproduced by the video reproduction device 5 in a stereoscopic manner when viewed from the viewer.
- The GPS receiver 9 is a receiver that receives position information of the host vehicle from GPS satellites.
- the vehicle speed sensor 10 is a sensor that detects a vehicle speed pulse for calculating the vehicle speed of the host vehicle.
- the internal memory 11 is a memory serving as a work area when the main CPU 4a executes an application program for in-vehicle information processing.
- the CD / DVD drive device 12 is a device that plays back an AV source stored in a storage medium 12a such as a CD or a DVD.
- When the AV source stored in the storage medium 12a includes stereoscopic display video data, the CD/DVD drive device 12 functions as the stereoscopic video content receiver 7 shown in FIG. 1B, and the in-vehicle information system 1 functions as the stereoscopic display system 1B shown in FIG. 1B.
- The HDD (Hard Disk Drive) 13 is a large-capacity storage device mounted in the in-vehicle information system 1, and stores a map database (hereinafter abbreviated as map DB) 13a, icon data 13b, and a program 13d.
- map DB 13a is a database in which map data used in navigation processing is registered.
- the map data also includes POI information in which the location of POI (Point Of Interest) on the map or detailed information related thereto is described.
- the icon data 13b is data indicating an icon to be displayed on the screen of the stereoscopic display monitor 6. There are icons for operation buttons for performing various operations on the screen.
- the program 13d is an application program for in-vehicle information processing executed by the main CPU 4a.
- the radio receiver 14 is a receiver that receives a radio broadcast.
- the radio receiver 14 is tuned according to an operation of a channel selection button (not shown).
- The DTV receiver 15 is a receiver that receives digital television broadcasts and, like the radio receiver 14, is tuned according to the operation of a channel selection button (not shown).
- The DTV receiver 15 functions as the stereoscopic video content receiver 7 shown in FIG. 1B; in this case, the in-vehicle information system 1 functions as the stereoscopic display system 1B shown in FIG. 1B.
- The in-vehicle LAN_I/F unit 16 is an interface between the in-vehicle LAN (Local Area Network) 17 and the main CPU 4a, and relays data communication between the main CPU 4a and other devices connected to the in-vehicle LAN 17. When the storage device 8 shown in FIG. 1C is connected to the in-vehicle LAN 17 and the in-vehicle LAN_I/F unit 16 relays between the storage device 8 and the screen composition processing unit 4 of the main CPU 4a, the in-vehicle information system 1 functions as the stereoscopic display system 1C illustrated in FIG. 1C.
- the operation input unit 18 is a configuration unit for a user to input an operation.
- Examples of the operation input unit 18 include a key switch (operation switch) provided near the screen of the stereoscopic display monitor 6 and a touch switch when a touch panel is provided on the screen of the stereoscopic display monitor 6.
- the audio signal reproduced by the CD / DVD drive device 12, the radio receiver 14 and the DTV receiver 15 and the audio signal from the main CPU 4a are amplified by the amplifier 19 and output through the speaker 20 as audio.
- The audio signal from the main CPU 4a includes the guidance voice used in navigation processing.
- In the in-vehicle information system 1, depending on whether or not the user is operating the system, either a three-dimensional stereoscopic image in which the apparent display position of the planar image lies behind the screen of the stereoscopic display monitor 6 is synthesized, or a three-dimensional stereoscopic image in which the apparent display position of the planar image coincides with the screen of the stereoscopic display monitor 6 is synthesized and displayed.
- While the user is not operating the system, the apparent display position of the planar map is placed far behind the screen of the stereoscopic display monitor 6 (far from the driver).
- FIG. 4 is a flowchart showing the flow of the screen composition process of the 3D stereoscopic display device according to the first embodiment, and shows the process of setting the apparent display position of the planar map to the back side of the screen of the stereoscopic display monitor 6.
- FIG. 5 is a diagram for explaining a screen composition process in which the apparent display position of the planar map is set behind the screen of the stereoscopic display monitor.
- FIG. 6 is a diagram showing a data flow in the screen composition process of FIG.
- FIGS. 5 and 6 will be referred to as appropriate.
- the main CPU 4a reads map data from the map DB 13a stored in the HDD 13 as shown in FIG. 6, and generates planar map data Pic_plane according to a predetermined map drawing algorithm.
- The planar map data Pic_plane is assumed to represent a planar map as depicted on the left side of FIG. 5.
- the planar map indicated by the planar map data Pic_plane is displayed on the apparent map display surface P behind the screen Q of the stereoscopic display monitor 6.
- the distance from the driver's eye position to the screen Q of the stereoscopic display monitor 6 is Z0
- the distance from the driver's eye position to the apparent map display surface P is z.
- the position of the right eye of the driver is the point Or (xr, yr, 0)
- the position of the left eye is the point Ol (xl, yl, 0)
- The distance between the left and right eyes is d.
- The right-eye image data Pic_R(x, y) of the planar map is represented by the set of points pr at which straight lines (vectors Vr) connecting each point p(x, y, z) on the apparent map display surface P with the right-eye position Or(xr, yr, 0) intersect the screen Q of the stereoscopic display monitor 6.
- Likewise, the left-eye image data Pic_L(x, y) of the planar map is represented by the set of points pl at which straight lines connecting each point p(x, y, z) on the apparent map display surface P with the left-eye position Ol(xl, yl, 0) intersect the screen Q of the stereoscopic display monitor 6.
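The points pr and pl above follow from similar triangles: with the eyes at depth 0 and the screen Q at depth Z0, the line from an eye to a point p(x, y, z) on the surface P crosses the screen at the fraction t = Z0 / z of the way to p. A minimal sketch of that construction (the numeric values are illustrative, not from the patent):

```python
def screen_point(p, eye, Z0):
    """Intersection with the screen plane Q (depth Z0) of the straight line
    from the eye position `eye` (depth 0) to the point p(x, y, z) on the
    apparent map display surface P. Similar triangles give t = Z0 / z."""
    x, y, z = p
    ex, ey, _ = eye
    t = Z0 / z
    return (ex + t * (x - ex), ey + t * (y - ey))

# Right- and left-eye projections of one map point, with z > Z0
# (apparent surface behind the screen):
d = 0.065                           # interocular distance, illustrative value
Or = (d / 2, 0.0, 0.0)              # right-eye position
Ol = (-d / 2, 0.0, 0.0)             # left-eye position
p = (0.0, 0.0, 1.2)                 # point on P at apparent distance z = 1.2
pr = screen_point(p, Or, Z0=0.8)    # right-eye image point on Q
pl = screen_point(p, Ol, Z0=0.8)    # left-eye image point on Q
```

Because z > Z0, pr lands on the right-eye side and pl on the left-eye side of the point, producing the uncrossed disparity that makes the map appear behind the screen.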
- The screen composition processing unit 4 first inputs the planar map data Pic_plane generated in the main CPU 4a as described above (step ST1). Next, the screen composition processing unit 4 determines whether or not the driver is operating, based on the operation input signal from the operation input unit 18 (step ST2). If the driver is not operating (step ST2; NO), the screen composition processing unit 4 inputs the parameters Z0, z, and d (step ST3). Here, the relationship between the distance Z0 from the driver's eye position to the screen Q of the stereoscopic display monitor 6 and the distance z from the driver's eye position to the apparent map display surface P is z > Z0.
- In step ST5, the screen composition processing unit 4 calculates the points pr and pl such that the distance between the apparent map display surface P and the driver's eye positions Or and Ol is z. That is, the screen composition processing unit 4 generates the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) described above, using the planar map data Pic_plane and the parameters Z0, z, and d. Thereafter, the screen composition processing unit 4 outputs Pic_R(x, y) and Pic_L(x, y) to the video reproduction device 5 (step ST6).
- the video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6.
- The stereoscopic display monitor 6 stereoscopically displays the planar map using the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) reproduced by the video reproduction device 5 (step ST7).
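The control flow above (input, operation check, parameter setup, pr/pl computation, output) can be sketched as follows. The helper `render` is a hypothetical stand-in for the per-point projection, and only the branching mirrors the flowchart of FIG. 4; the real unit operates on full image data:

```python
def render(frame, z, Z0, d):
    """Placeholder for the projection step: returns (right, left) views
    tagged with the apparent depth actually used (illustrative only)."""
    return (frame, "R", z), (frame, "L", z)

def compose(pic_plane, operating, Z0, z_back, d):
    # ST1: input the planar map data Pic_plane
    # ST2: determine whether the driver is operating
    z = z_back if not operating else Z0  # not operating -> surface behind screen (z > Z0)
    assert z >= Z0
    # ST5: generate Pic_R / Pic_L from Pic_plane and the parameters Z0, z, d
    pic_r, pic_l = render(pic_plane, z, Z0, d)
    # ST6: output both to the video reproduction device (ST7: monitor displays them)
    return pic_r, pic_l
```

When `operating` is true the apparent surface is placed on the screen itself, matching the behavior described for the operating case.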
- FIG. 7 is a diagram showing an outline of screen composition processing by the three-dimensional stereoscopic display device of the first embodiment.
- the operation input unit 18 is an operation switch provided near the screen of the stereoscopic display monitor 6.
- The screen composition processing unit 4 determines whether or not the driver is operating based on whether the menu switch among the operation switches has been operated. That is, when the menu switch is pressed, the driver is determined to be operating.
- FIG. 8 is a diagram for explaining a screen composition process in which the apparent display position of the planar map is placed in front of the screen of the stereoscopic display monitor.
- FIG. 9 is a diagram showing the data flow in the screen composition processing of FIG. 8. As shown in FIG. 8, when z < Z0, the planar map indicated by the planar map data Pic_plane appears to the driver, by stereoscopic viewing, to be displayed on the apparent map display surface P in front of the screen Q of the stereoscopic display monitor 6.
- The distance z between the driver and the apparent map display surface P of the planar map may be set in the screen composition processing unit 4 by a user operation, or an already-set value may be changed by a user operation.
- As described above, the three-dimensional stereoscopic display device according to Embodiment 1 includes the operation input unit 18 that receives user operations; the video reproduction device 5 that reproduces the right-eye and left-eye images or video for three-dimensional stereoscopic display of an input display target image or video; the stereoscopic display monitor 6 that stereoscopically displays the reproduced right-eye and left-eye images or video; and the screen composition processing unit 4 that, depending on the presence or absence of a user operation on the operation input unit 18, generates right-eye and left-eye images for three-dimensional stereoscopic display in which the position of the apparent display surface P of the display target is moved relative to the screen Q of the stereoscopic display monitor 6, and outputs them to the video reproduction device 5.
- When there is no user operation, right-eye and left-eye images for three-dimensional stereoscopic display are generated in which the apparent display surface P of the display target image is moved behind the screen Q of the stereoscopic display monitor unit; when a user operation is performed using the operation input unit 18, right-eye and left-eye images are generated in which the apparent display surface P is moved close to, or to the same position as, the screen Q of the stereoscopic display monitor unit.
- In the above, the three-dimensional stereoscopic image is displayed with the apparent display position of the planar image behind, at the same position as, or in front of the screen of the stereoscopic display monitor depending on whether or not the user is operating the system; alternatively, it may be detected that the user is about to operate the system, and the three-dimensional stereoscopic image may be switched before the operation.
- FIG. 10 is a block diagram illustrating another configuration of the in-vehicle information system using the three-dimensional stereoscopic display device according to the first embodiment.
- The in-vehicle information system 1A includes a user operation detection sensor 21 in addition to the configuration illustrated in FIG. 3.
- the user operation detection sensor 21 is a sensor that detects the behavior of a user who is operating the operation input unit 18.
- It can be realized by a camera device including a camera and an arithmetic processing unit that performs image recognition on the captured image. That is, the arithmetic processing unit analyzes the user image captured by the camera and thereby recognizes the behavior of a user who intends to operate the operation input unit 18.
- Alternatively, the user operation detection sensor 21 may be realized by a proximity sensor that detects the approach of an indicator such as a finger to the touch surface based on a change in capacitance.
- the same reference numerals are given to the same or corresponding components as those in FIG.
- FIG. 11 is a block diagram showing the configuration of an in-vehicle information system using a three-dimensional stereoscopic display device according to Embodiment 2 of the present invention.
- The in-vehicle information system 1B includes a touch panel 22 in place of the operation input unit 18 in the configuration of Embodiment 1 shown in FIG. 3.
- The touch panel 22 is a device that detects touch operations on the operation screen displayed on the stereoscopic display monitor 6. It is configured to detect a "touch", in which an indicator such as the user's finger contacts the touch surface, and a "push", in which the indicator further presses the touch surface after touching it.
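The two-stage detection can be sketched as a simple classifier. The pressure threshold and function name are assumptions for illustration; the patent does not specify how the panel distinguishes the two states internally:

```python
def classify(contact, pressure, push_threshold=0.5):
    """Classify a touch-panel sample: no contact -> None, light contact ->
    'touch', contact with pressure at or above the (assumed) threshold ->
    'push'."""
    if not contact:
        return None
    return "push" if pressure >= push_threshold else "touch"

# A light contact reports "touch"; pressing further reports "push".
states = [classify(True, 0.2), classify(True, 0.9), classify(False, 0.9)]
```

In the screen composition described below, "touch" and "push" trigger different depth behavior for the displayed icons.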
- the same reference numerals are given to the same or corresponding components as those in FIG.
- FIG. 12 is a diagram for explaining the screen composition processing in which the apparent map display surface P of the planar map is set behind the screen Q of the stereoscopic display monitor and the apparent display surface R of the icons is set in front of the apparent map display surface P.
- FIG. 13 is a diagram showing a data flow in the screen composition process of FIG.
- the main CPU 4a reads map data from the map DB 13a stored in the HDD 13, and generates planar map data Pic_plane according to a predetermined map drawing algorithm.
- The planar map data Pic_plane represents, for example, the planar map depicted on the left side of FIG. 12.
- The main CPU 4a reads, from the icon data 13b stored in the HDD 13, the icon data of the icons to be displayed on the planar map indicated by the planar map data Pic_plane.
- the planar map indicated by the planar map data Pic_plane is displayed on the apparent map display surface P behind the screen Q of the stereoscopic display monitor 6. Further, a determination button and a return button that are operation icons are displayed on the apparent display surface R in front of the apparent map display surface P of the planar map.
- The distance between the apparent map display surface P of the planar map and the icon display surface R is dz. That is, to the driver, the icons of the decision button and the return button appear, by stereoscopic viewing, to float above the planar map by the distance dz.
- The relationship between the distance Z0 from the driver's eye position to the screen Q of the stereoscopic display monitor 6 and the distance z from the driver's eye position to the apparent map display surface P is z > Z0.
- The right-eye image data Pic_R(x, y) is represented by the set of points pr at which straight lines (vectors Vr) connecting each point p(x, y, z) on the apparent map display surface P, or each point p(x, y, z-dz) on the icon display surface R, with the right-eye position Or(xr, yr, 0) intersect the screen Q of the stereoscopic display monitor 6.
- Likewise, the left-eye image data Pic_L(x, y) is represented by the set of points pl at which straight lines connecting each point p(x, y, z) on the apparent map display surface P, or each point p(x, y, z-dz) on the display surface R, with the left-eye position Ol(xl, yl, 0) intersect the screen Q.
- The icons of the decision button and the return button are thus represented by a set of points pr in the right-eye image of the planar map and by a set of points pl in the left-eye image of the planar map.
- The screen composition processing unit 4 uses the planar map data Pic_plane, the parameters Z0, d, and z, and the icon data to calculate the points pr and pl such that the distance between the apparent map display surface P and the driver's eye position is z and the distance between the icon display surface R and the driver's eye position is (z - dz), generates the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y), and outputs them to the video reproduction device 5.
- the video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6.
- The planar map and the icons are stereoscopically displayed using the right-eye and left-eye images reproduced by the video reproduction device 5. At this time, the icon images appear to the driver, by stereoscopic viewing, to float above the planar map by the distance dz.
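The two apparent depths correspond to two different on-screen disparities. From the similar-triangle construction used for pr and pl, the horizontal separation between the right- and left-eye projections of a point at apparent depth z works out to d * (1 - Z0 / z): zero at the screen, positive (uncrossed) behind it, negative (crossed) in front. A brief illustrative check with assumed values:

```python
def disparity(z, Z0, d):
    """On-screen horizontal separation between the right- and left-eye
    projections of a point at apparent depth z (eyes at depth 0, screen
    plane Q at depth Z0, interocular distance d)."""
    return d * (1.0 - Z0 / z)

d, Z0, z, dz = 0.065, 0.8, 1.2, 0.2   # illustrative distances in metres
map_disp = disparity(z, Z0, d)        # planar map surface P (farther)
icon_disp = disparity(z - dz, Z0, d)  # icon surface R (closer to the driver)
```

The icon surface, being closer to the screen, has the smaller uncrossed disparity, which is exactly what makes the icons appear to float in front of the map.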
- When an operation is detected, the screen composition processing unit 4 increases the distance dz until (z - dz) becomes Z0, generates the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y), and outputs them to the video reproduction device 5.
- the video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6.
- the planar map and the icon are stereoscopically displayed using the right-eye image and the left-eye image reproduced by the video reproduction device 5.
- the icon image is displayed so as to be focused on the screen Q of the stereoscopic display monitor 6 by stereoscopic viewing.
- FIG. 14 is a diagram illustrating an outline of screen composition processing by the 3D stereoscopic display device according to the second embodiment.
- A planar map is displayed on the apparent map display surface P behind the screen Q of the stereoscopic display monitor 6, and the decision button and the return button are displayed on the apparent display surface R in front of the apparent map display surface P.
- In the stereoscopic display, the planar map appears in focus at the most distant position, and the decision button and the return button appear in focus at a position closer than the planar map by the distance dz.
- When an operation is detected, the decision button and the return button are displayed so as to appear in focus at the screen Q of the stereoscopic display monitor 6, which improves the visibility of the operation target icons. Moreover, before the touch surface is touched, the icons are displayed so as to appear farther from the user than the screen Q; therefore, even when a user who has been looking into the distance looks at the screen Q, the distance the focal position must move is small and the display is easy to see.
- Instead of setting the distance dz to (z - Z0) at once, the distance dz may be brought gradually closer to (z - Z0) by a predetermined amount at a time, so that the display surface R of the icon images is gradually moved to coincide with the screen Q of the stereoscopic display monitor 6.
- the distance dz may be gradually returned every predetermined value even when returning to the display state shown in FIG.
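The gradual movement described above can be sketched as a per-frame update that steps dz toward its target (z − Z0) by a fixed increment and back again when the operation ends. This is only an illustrative sketch; the function name and step size are assumptions, not taken from the patent:

```python
def step_dz(dz, target, step):
    """Move dz toward `target` by at most `step` per frame, without overshoot.

    Used both when raising the icon surface R toward the screen Q
    (target = z - Z0) and when returning it to its idle depth (target = 0).
    """
    if dz < target:
        return min(dz + step, target)
    return max(dz - step, target)

# Example: animate dz from 0 toward (z - Z0) in fixed increments per frame.
z, Z0 = 900.0, 600.0
target = z - Z0          # dz at which surface R coincides with screen Q
dz, frames = 0.0, 0
while dz != target:
    dz = step_dz(dz, target, step=50.0)
    frames += 1
```

The same helper serves the return animation by calling it with the idle depth as the target.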
- FIG. 12 and FIG. 14(a) show the case where the decision button and the return button are displayed so as to be in focus at a distance dz from the planar map.
- however, dz may be set to 0 so that the buttons are displayed in focus on the planar map itself.
- alternatively, the apparent map display surface P may be made to coincide with the screen Q of the stereoscopic display monitor 6. Either way, effects similar to those described above can be obtained.
- control may be performed so that the focus position of an icon in stereoscopic viewing is not changed even if a user operation is detected.
- an icon of a function whose operation is not permitted due to the state of the vehicle is, for example, an icon that does not accept the operation assigned to it because of operation restrictions while the vehicle is traveling.
- such an icon may be displayed in a color or shape different from that of icons whose operation is permitted while the vehicle is traveling, thereby indicating to the user that it cannot be operated.
- FIG. 15 is a diagram illustrating a display example of an icon when an operation is detected.
- FIGS. 15(a) and 15(b) show, as described above, the case where the display surface R of the icon image is made to coincide with the screen Q of the stereoscopic display monitor 6 when an operation is detected.
- in this case, the icons of the decision button and the return button are displayed so as to be in focus on the screen Q of the stereoscopic display monitor 6 in stereoscopic viewing, but, as shown in FIG. 15(b), they appear to float above the planar map.
- therefore, as shown in FIGS. 15(c) and 15(d), the icons may be displayed so as to extend from the apparent map display surface P of the planar map to the screen Q of the stereoscopic display monitor 6. Displaying them in this way further improves the visibility of the icons.
- in FIGS. 15(a) and 15(b), when an icon is displayed as if raised to the screen of the stereoscopic display monitor 6 upon detection of an operation, the shape of the icon may also be changed. Changing the icon's shape between before and during the operation improves the visibility of the icon during the operation.
- as described above, the device includes: the touch panel 22 that receives user operations; the video reproduction device 5 that reproduces right-eye and left-eye images for 3D stereoscopic display of an input display target image; the stereoscopic display monitor 6 that stereoscopically displays the right-eye and left-eye images reproduced by the video reproduction device 5; and the screen composition processing unit 4 that, when stereoscopically displaying an icon image for user operation and a base image as the display target images, generates right-eye and left-eye images or videos for 3D stereoscopic display in which the apparent display surface R on which the icon image is displayed, the apparent display surface P on which the base image is displayed, and the screen Q of the stereoscopic display monitor 6 are different surfaces, and that, in response to a user operation on the icon image, generates right-eye and left-eye images or videos in which the position of the surface R is moved, outputting them to the video reproduction device 5. This configuration improves the visibility of the icon image for user operation, making it possible to provide an HMI based on 3D stereoscopic display that allows operation matching the user's intuition.
- in the above description, the touch panel 22 detects "contact" and "push" of a pointing object on the touch surface, but the present invention is not limited to this.
- for example, by using a three-dimensional touch panel that can detect, capacitively and without contact, both the distance of a pointing object from the touch surface and its contact, the display surface R of the icons may be moved gradually from the display state of FIG. 14(a) according to the distance of the pointing object from the touch surface, and, when the pointing object touches the touch surface, the display state of FIG. 14(b) may be set and the function of the icon executed.
- in the above description, only the icons are displayed stereoscopically, but the route guidance screen may also be displayed stereoscopically in accordance with navigation processing operations.
- the apparent display position of the planar map is set farther from the driver than the screen of the stereoscopic display monitor 6, and the route guidance screen is displayed in front of the apparent display surface of the planar map.
- events such as the vehicle position, the route, guidance points, a cursor, a 3D agent display, and other traffic information may be displayed, in accordance with user operations, so as to be raised forward from the apparent map display surface of the planar map.
- important characters such as the destination may also be displayed stereoscopically; other examples include a highway schematic diagram and POIs.
- a balloon display describing the information of a designated POI may be displayed so that, viewed stereoscopically by the driver, it protrudes toward the near side from the apparent map display surface of the planar map.
- in the above description, a planar map is displayed stereoscopically, but the invention may be applied to anything generally displayed on an in-vehicle information system, such as an AV system menu screen, vehicle information, or safety information. For example, it may be used to display air-conditioner control icons, the dashboard meter panel, vehicle fuel consumption, preventive safety information, VICS (trademark) information, and the like.
- in the above description, a stereoscopic display viewed with the naked eye is shown, but a stereoscopic display method that obtains a stereoscopic image using polarized glasses may also be used.
- in that case, the present invention can be applied to any display device having such a stereoscopic display monitor.
- the present invention may be applied not only to a vehicle-mounted navigation device but also to a display device of a mobile phone terminal or a personal digital assistant (PDA).
- the present invention may be applied to a display device such as a PND (Portable Navigation Device) that is carried and used by a person on a moving body such as a vehicle, a railway, a ship, or an aircraft.
- as described above, the 3D stereoscopic display device according to the present invention can provide an HMI based on 3D stereoscopic display that allows operation matching the user's intuition, and is therefore suitable as a display device of an in-vehicle information system.
Description
Embodiment 1.
FIG. 1 is a block diagram showing a configuration example of a stereoscopic display system using the 3D stereoscopic display device according to the present invention. FIG. 1(a) shows a stereoscopic display system 1A that displays stereoscopic video from the left and right videos captured by binocular cameras. In FIG. 1(a), the stereoscopic display system 1A includes a left-eye camera 2a, a right-eye camera 2b, a recording/imaging device 3, a screen composition processing unit 4, a video reproduction device 5, and a stereoscopic display monitor 6.
The left-eye camera 2a and the right-eye camera 2b are arranged side by side at an interval that takes the parallax of both eyes into account, and capture the scene A under the control of the recording/imaging device 3. The left and right video data of the scene A captured by the left-eye camera 2a and the right-eye camera 2b are recorded in the recording/imaging device 3. The screen composition processing unit 4 applies the 3D stereoscopic video composition processing specific to the present invention to the left and right video data read from the recording/imaging device 3 and outputs the result to the video reproduction device 5.
The video reproduction device 5 reproduces the left and right video data processed by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6. The stereoscopic display monitor 6 displays the reproduced left and right video data stereoscopically as seen from the viewer.
The screen composition processing unit 4 applies the 3D stereoscopic video composition processing specific to the present invention to the left and right video data of the stereoscopic video content received by the stereoscopic video content receiver 7 and outputs the result to the video reproduction device 5. As in FIG. 1(a), the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproduction device 5 stereoscopically as seen from the viewer.
Note that so-called three-dimensional data (for example, 3D map data) may be stored as stereoscopic display content, and the screen composition processing unit 4 may generate the left and right video data by computing how the image represented by the 3D data appears from the left and right viewpoints.
The map DB 13a is a database in which the map data used in navigation processing are registered. The map data also include POI (Point Of Interest) information describing the locations of POIs on the map or detailed information related to them.
The icon data 13b are data representing the icons displayed on the screen of the stereoscopic display monitor 6, such as the icons of operation buttons for performing various operations on the screen.
The program 13d is an application program for in-vehicle information processing executed by the main CPU 4a, for example a map display application program including a program module that implements the functions of the screen composition processing unit 4.
The DTV receiver 15 is a receiver for digital television broadcasts and, like the radio receiver 14, is tuned in response to operation of a tuning button (not shown). When the received digital television broadcast contains 3D stereoscopic display video data, the DTV receiver 15 functions as the stereoscopic video content receiver 7 shown in FIG. 1(b), and the in-vehicle information system 1 functions as the stereoscopic display system 1B shown in FIG. 1(b).
In Embodiment 1, when a planar image is displayed stereoscopically, depending on whether the user is operating the system, either a 3D stereoscopic image is composed in which the apparent display position of the planar image is placed behind the screen of the stereoscopic display monitor 6, or a 3D stereoscopic image is composed and displayed in which the apparent display position of the planar image coincides with the screen of the stereoscopic display monitor 6.
For example, in the map display of an in-vehicle navigation device, when a planar map is displayed and the driver is not operating the system, the apparent display position of the planar map is placed behind the screen of the stereoscopic display monitor 6 (farther from the driver).
When a point p(x, y) on the planar map represented by the planar map data Pic_plane is projected onto the apparent map display surface P, it becomes a point p(x, y, z) on the map display surface P.
Similarly, the left-eye image data Pic_L(x, y) of the planar map are represented by the set of points pl at which the straight lines (vectors Vl) connecting the points p(x, y, z) on the apparent map display surface P to the left-eye position Ol(xl, yl, 0) intersect the screen Q of the stereoscopic display monitor 6.
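The mapping from a point on the apparent display surface P to its left-eye and right-eye image points on the screen Q follows from similar triangles: the ray from the eye position to p(x, y, z) is intersected with the screen plane at depth Z0. A minimal sketch of this computation (the coordinate convention, function name, and numeric values are illustrative assumptions, not taken from the patent):

```python
def project_to_screen(p, eye, Z0):
    """Intersect the ray from `eye` to point `p` on the apparent display
    surface with the screen plane at depth Z0.

    p   = (x, y, z): point on the apparent display surface P (z = depth from the eyes)
    eye = (ex, ey):  eye position, taken to lie in the z = 0 plane
    Z0:              depth of the monitor screen Q from the eyes
    """
    x, y, z = p
    ex, ey = eye
    t = Z0 / z  # fraction of the way along the ray where it crosses the screen
    return (ex + (x - ex) * t, ey + (y - ey) * t)

# A point on a map surface placed behind the screen (z > Z0) projects to
# slightly different screen positions for the left and right eyes; this
# horizontal difference (disparity) is what makes it appear behind the screen.
Z0 = 600.0                                       # mm, eyes to screen (illustrative)
p = (0.0, 0.0, 900.0)                            # point on apparent map display surface P
left = project_to_screen(p, (-32.5, 0.0), Z0)    # left eye, 65 mm interocular baseline
right = project_to_screen(p, (32.5, 0.0), Z0)
disparity = right[0] - left[0]                   # positive: surface appears behind Q
```

When z = Z0 the two projections coincide with the point itself, which is exactly the "apparent map display surface P coincides with the screen Q" case described below.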
Next, the screen composition processing unit 4 determines, based on the operation input signal from the operation input unit 18, whether the driver is performing an operation (step ST3).
If the driver is not performing an operation (step ST2; NO), the screen composition processing unit 4 inputs the parameters Z0, z, and d (step ST2). Here, the distance Z0 from the driver's eye position to the screen Q of the stereoscopic display monitor 6 and the distance z from the driver's eye position to the apparent map display surface P satisfy z > Z0.
When the driver is performing an operation, z = Z0, so the apparent map display surface P coincides with the screen Q of the stereoscopic display monitor 6, and the planar map appears to the driver to be displayed on the screen Q.
Thus, when an operation switch is operated, z = Z0 is set as shown in FIG. 7(b); that is, the apparent map display surface P is made to coincide with the screen Q of the stereoscopic display monitor 6. At this time, as shown on the left side of FIG. 7(b), the display content is shown so as to be in focus on the actual screen Q of the stereoscopic display monitor 6. A display screen that is easy to operate can therefore be provided.
When the operation ends, z > Z0 is set again, returning to the state in which the apparent map display surface P is displayed farther from the driver than the screen Q of the stereoscopic display monitor 6.
FIG. 8 is a diagram for explaining screen composition processing that places the apparent display position of the planar map in front of the screen of the stereoscopic display monitor. FIG. 9 is a diagram showing the data flow in the screen composition processing of FIG. 8. As shown in FIG. 8, when z < Z0, the planar map represented by the planar map data Pic_plane appears to the driver, through stereoscopic viewing, to be displayed on the apparent map display surface P in front of the screen Q of the stereoscopic display monitor 6.
For example, it can be realized by a camera device comprising a camera and an arithmetic processing unit that performs image recognition on the captured images. That is, the arithmetic processing unit analyzes the user images captured by the camera and thereby recognizes the behavior of a user who is about to operate the operation input unit 18.
When the operation input unit 18 is a touch panel, the user operation detection sensor 21 may instead be realized by a proximity sensor that detects the approach of a pointing object such as a finger to the touch surface based on changes in capacitance. In FIG. 10, components identical or equivalent to those in FIG. 3 are given the same reference numerals, and their description is omitted.
FIG. 11 is a block diagram showing the configuration of an in-vehicle information system using the 3D stereoscopic display device according to Embodiment 2 of the present invention. In FIG. 11, the in-vehicle information system 1B includes a touch panel 22 in place of the operation input unit 18 in the configuration shown in FIG. 3 of Embodiment 1. The touch panel 22 is a device that detects touch operations on the operation screen displayed on the stereoscopic display monitor 6, and is configured to detect both "contact", in which a pointing object such as the user's finger touches the touch surface, and "push", in which the pointing object presses the touch surface further after touching it. In FIG. 11, components identical or equivalent to those in FIG. 3 are given the same reference numerals, and their description is omitted.
FIG. 12 is a diagram for explaining screen composition processing in which the apparent map display surface P of the planar map is placed behind the screen Q of the stereoscopic display monitor and the apparent display surface R of the icons is placed in front of the apparent map display surface P. FIG. 13 is a diagram showing the data flow in the screen composition processing of FIG. 12.
Meanwhile, the icons of the decision button and the return button are represented, in the right-eye image of the planar map, by a set of points pr on that right-eye image, and, in the left-eye image of the planar map, by a set of points pl on that left-eye image.
At this time, as shown on the left side of FIG. 14(a), the planar map appears to the user, through stereoscopic viewing, to be in focus at the most distant position, and the decision button and the return button appear to be in focus at a distance dz closer than the planar map.
FIG. 15 is a diagram showing a display example of the icons when an operation is detected. FIGS. 15(a) and 15(b) show, as above, the case in which the display surface R of the icon images is made to coincide with the screen Q of the stereoscopic display monitor 6 when an operation is detected. In this case, through stereoscopic viewing, the icons of the decision button and the return button appear to the user to be in focus on the screen Q of the stereoscopic display monitor 6, but, as shown in FIG. 15(b), they appear to float above the planar map.
Therefore, as shown in FIGS. 15(c) and 15(d), the icons may be displayed so as to extend from the apparent map display surface P of the planar map to the screen Q of the stereoscopic display monitor 6. Displaying them in this way further improves the visibility of the icons.
Also, in FIGS. 15(a) and 15(b), when an icon is displayed so as to appear raised to the screen of the stereoscopic display monitor 6 upon detection of an operation, the shape of the icon may be changed. Changing the icon's shape between before and during the operation in this way improves the visibility of the icon during the operation.
For example, by using a three-dimensional touch panel that can detect, capacitively and without contact, both the distance of a pointing object from the touch surface and its contact, control may be performed so that the display surface R of the icons is moved gradually from the display state of FIG. 14(a) according to the distance of the pointing object from the touch surface, and, when the pointing object touches the touch surface, the display state of FIG. 14(b) is set and the function of the icon is executed.
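The control described in this paragraph, moving the icon surface R according to the pointing object's distance from the touch surface, amounts to interpolating the icon depth between the idle state of FIG. 14(a) and the contact state of FIG. 14(b). A hedged sketch follows; the function name, detection range, and depth values are illustrative assumptions, not from the patent:

```python
def icon_surface_depth(finger_dist, detect_range, z_icon_far, Z0):
    """Interpolate the apparent depth of the icon display surface R based on
    the pointing object's distance from the touch surface.

    finger_dist:  detected distance of the pointing object (0 = contact)
    detect_range: distance at which the 3D touch panel first senses the object
    z_icon_far:   icon surface depth in the idle state (FIG. 14(a))
    Z0:           screen depth, i.e. the icon depth at contact (FIG. 14(b))
    """
    if finger_dist >= detect_range:
        return z_icon_far                  # out of range: keep idle display state
    frac = finger_dist / detect_range      # 1.0 when far away -> 0.0 at contact
    return Z0 + (z_icon_far - Z0) * frac   # linear interpolation toward screen Q

# At contact (distance 0) the icon surface coincides with the screen Q,
# and the function assigned to the icon would then be executed.
```

A real implementation would combine this with the per-frame stepping described earlier so the surface cannot jump when the finger moves quickly.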
Claims (18)
- An operation input unit that receives a user operation;
a reproduction processing unit that reproduces right-eye and left-eye images or videos for three-dimensional stereoscopic display of an input image or video to be displayed;
a stereoscopic display monitor unit that three-dimensionally displays the right-eye and left-eye images or videos for three-dimensional stereoscopic display of the display target image or video reproduced by the reproduction processing unit; and
a screen composition processing unit that, in accordance with the presence or absence of a user operation using the operation input unit, generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the position of the apparent display surface on which the display target image or video is three-dimensionally displayed is moved with respect to the screen of the stereoscopic display monitor unit, and outputs them to the reproduction processing unit: a three-dimensional stereoscopic display device comprising the above. - The three-dimensional stereoscopic display device according to claim 1, wherein the display target image is a planar image or an image in which a stereoscopic image is displayed on a planar image.
- The three-dimensional stereoscopic display device according to claim 1, wherein, when there is no user operation using the operation input unit, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the display target image or video is three-dimensionally displayed is moved behind the screen of the stereoscopic display monitor unit.
- The three-dimensional stereoscopic display device according to claim 1, wherein, when there is a user operation using the operation input unit, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the display target image or video is three-dimensionally displayed is moved close to or to the same position as the screen of the stereoscopic display monitor unit.
- The three-dimensional stereoscopic display device according to claim 1, wherein the operation input unit is a touch panel provided on the screen of the stereoscopic display monitor unit, and
the screen composition processing unit, when contact of a pointing object with the touch surface of the touch panel is detected, generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the display target image or video is three-dimensionally displayed is moved close to or to the same position as the screen of the stereoscopic display monitor unit. - The three-dimensional stereoscopic display device according to claim 1, comprising a user operation detection unit that detects behavior indicating that a user is about to operate the operation input unit, wherein
the screen composition processing unit, when the user operation detection unit detects behavior indicating that a user is about to perform an operation, generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the display target image or video is three-dimensionally displayed is moved close to or to the same position as the screen of the stereoscopic display monitor unit. - The three-dimensional stereoscopic display device according to claim 1, wherein the operation input unit is a touch panel provided on the screen of the stereoscopic display monitor unit and capable of detecting that a pointing object has approached within a predetermined distance of the touch surface, and
the screen composition processing unit, when it is detected that the pointing object has approached within the predetermined distance of the touch surface of the touch panel, generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the display target image or video is three-dimensionally displayed is moved close to or to the same position as the screen of the stereoscopic display monitor unit. - The three-dimensional stereoscopic display device according to claim 1, wherein the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the position of the apparent display surface on which the display target image or video is three-dimensionally displayed is moved gradually in predetermined increments.
- An operation input unit that receives a user operation;
a reproduction processing unit that reproduces right-eye and left-eye images or videos for three-dimensional stereoscopic display of an input image or video to be displayed;
a stereoscopic display monitor unit that three-dimensionally displays the right-eye and left-eye images or videos for three-dimensional stereoscopic display of the display target image or video reproduced by the reproduction processing unit; and
a screen composition processing unit that, when three-dimensionally displaying, as the display target images, an icon image for user operation and a base image on which the icon image is displayed, generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the icon image for user operation is three-dimensionally displayed, the apparent display surface on which the base image is three-dimensionally displayed, and the screen of the stereoscopic display monitor unit are respectively different surfaces, and that, in response to a user operation on the icon image for user operation using the operation input unit, generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the position of the apparent display surface on which the icon image is three-dimensionally displayed is moved with respect to the apparent display surface on which the base image is three-dimensionally displayed or the screen of the stereoscopic display monitor unit, and outputs them to the reproduction processing unit: a three-dimensional stereoscopic display device comprising the above. - The three-dimensional stereoscopic display device according to claim 9, wherein, when there is no user operation on the icon image for user operation using the operation input unit, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the icon image is three-dimensionally displayed is moved to a position behind the screen of the stereoscopic display monitor unit.
- The three-dimensional stereoscopic display device according to claim 9, wherein, when there is no user operation on the icon image for user operation using the operation input unit, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which both the apparent display surface on which the icon image is three-dimensionally displayed and the apparent display surface on which the base image is three-dimensionally displayed are moved to positions behind the screen of the stereoscopic display monitor unit.
- The three-dimensional stereoscopic display device according to claim 10, wherein, when there is a user operation on the icon image for user operation using the operation input unit, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the apparent display surface on which the icon image is three-dimensionally displayed is moved closer to or to the same position as the screen of the stereoscopic display monitor unit.
- The three-dimensional stereoscopic display device according to claim 12, wherein the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display such that the apparent display surface on which the icon image is three-dimensionally displayed gradually approaches the screen of the stereoscopic display monitor unit in predetermined increments.
- The three-dimensional stereoscopic display device according to claim 10, wherein, when there is a user operation on the icon image for user operation using the operation input unit, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which both the apparent display surface on which the icon image is three-dimensionally displayed and the apparent display surface on which the base image is three-dimensionally displayed are moved closer to or to the same position as the screen of the stereoscopic display monitor unit.
- The three-dimensional stereoscopic display device according to claim 14, wherein the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display such that the apparent display surface on which the icon image is three-dimensionally displayed and the apparent display surface on which the base image is three-dimensionally displayed gradually approach the screen of the stereoscopic display monitor unit in predetermined increments.
- The three-dimensional stereoscopic display device according to claim 10, wherein, for an icon image for user operation whose operation is not permitted due to the state of the moving body on which or by which the device is mounted or held, the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the position of the apparent display surface on which that icon image is three-dimensionally displayed is not moved even when there is a user operation using the operation input unit.
- The three-dimensional stereoscopic display device according to claim 16, wherein an icon whose operation is not permitted due to the state of the moving body on which or by which the device is mounted or held is an icon that does not accept the operation assigned to it due to operation restrictions while the moving body is moving.
- The three-dimensional stereoscopic display device according to claim 9, wherein the screen composition processing unit generates right-eye and left-eye images or videos for three-dimensional stereoscopic display in which the position of the apparent display surface on which the display target image or video is three-dimensionally displayed is moved gradually in predetermined increments.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/006219 WO2012053032A1 (ja) | 2010-10-20 | 2010-10-20 | 3次元立体表示装置 |
JP2012539478A JP5465334B2 (ja) | 2010-10-20 | 2010-10-20 | 3次元立体表示装置 |
DE112010005948.9T DE112010005948B4 (de) | 2010-10-20 | 2010-10-20 | Stereskopische Dreidimensionen-Anzeigevorrichtung |
US13/701,395 US9083962B2 (en) | 2010-10-20 | 2010-10-20 | 3Dimension stereoscopic display device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/006219 WO2012053032A1 (ja) | 2010-10-20 | 2010-10-20 | 3次元立体表示装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012053032A1 true WO2012053032A1 (ja) | 2012-04-26 |
Family
ID=45974772
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/006219 WO2012053032A1 (ja) | 2010-10-20 | 2010-10-20 | 3次元立体表示装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9083962B2 (ja) |
JP (1) | JP5465334B2 (ja) |
DE (1) | DE112010005948B4 (ja) |
WO (1) | WO2012053032A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014136380A1 (ja) * | 2013-03-04 | 2014-09-12 | 株式会社デンソー | タッチパネル操作装置及びタッチパネル操作装置における操作イベント判定方法 |
JP2016535516A (ja) * | 2013-07-26 | 2016-11-10 | サムスン エレクトロニクス カンパニー リミテッド | 多視点映像処理装置及びその映像処理方法 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012053030A1 (ja) * | 2010-10-19 | 2012-04-26 | 三菱電機株式会社 | 3次元立体表示装置 |
CN108616719B (zh) * | 2016-12-29 | 2021-04-27 | 杭州海康威视数字技术股份有限公司 | 监控视频展示的方法、装置及系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05147456A (ja) * | 1991-11-27 | 1993-06-15 | Nippon Seiki Co Ltd | 車両用表示装置 |
JPH09113839A (ja) * | 1995-10-23 | 1997-05-02 | Kansei Corp | 車両用表示装置 |
WO2004038486A1 (ja) * | 2002-10-23 | 2004-05-06 | Pioneer Corporation | 画像表示装置及び画像表示方法 |
WO2006035816A1 (ja) * | 2004-09-30 | 2006-04-06 | Pioneer Corporation | 立体的二次元画像表示装置 |
JP2006293878A (ja) * | 2005-04-14 | 2006-10-26 | Nippon Telegr & Teleph Corp <Ntt> | 画像表示システムおよび画像表示方法、ならびに画像表示プログラム |
JP2008040596A (ja) * | 2006-08-02 | 2008-02-21 | Mazda Motor Corp | 車両用情報表示装置 |
JP2008538037A (ja) * | 2005-04-14 | 2008-10-02 | フオルクスヴアーゲン アクチエンゲゼルシヤフト | 交通手段における情報表示方法及び自動車用コンビネーションインストルメント |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003280812A (ja) | 2002-03-20 | 2003-10-02 | Hitachi Ltd | タッチパネル付きディスプレイ装置及び表示方法 |
JP2004280496A (ja) | 2003-03-17 | 2004-10-07 | Kyocera Mita Corp | 操作パネル装置 |
JP2005175566A (ja) | 2003-12-08 | 2005-06-30 | Shinichi Hirabayashi | 立体表示システム |
DE102006032117A1 (de) | 2006-07-12 | 2008-01-24 | Volkswagen Ag | Informationssystem für ein Verkehrsmittel und Verfahren zum Steuern eines solchen Informationssystems |
US8169501B2 (en) * | 2006-12-05 | 2012-05-01 | Fujifilm Corporation | Output apparatus, output method and program |
DE102008035090B4 (de) * | 2008-07-28 | 2010-09-23 | Airbus Deutschland Gmbh | Flexibel einsetzbare Passagierkabinenbedieneinheit zur Steuerung von Kabinenfunktionen, Flugzeug damit und deren Verwendung |
US8294766B2 (en) * | 2009-01-28 | 2012-10-23 | Apple Inc. | Generating a three-dimensional model using a portable electronic device recording |
- 2010
- 2010-10-20 JP JP2012539478A patent/JP5465334B2/ja active Active
- 2010-10-20 WO PCT/JP2010/006219 patent/WO2012053032A1/ja active Application Filing
- 2010-10-20 DE DE112010005948.9T patent/DE112010005948B4/de active Active
- 2010-10-20 US US13/701,395 patent/US9083962B2/en not_active Expired - Fee Related
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05147456A (ja) * | 1991-11-27 | 1993-06-15 | Nippon Seiki Co Ltd | 車両用表示装置 |
JPH09113839A (ja) * | 1995-10-23 | 1997-05-02 | Kansei Corp | 車両用表示装置 |
WO2004038486A1 (ja) * | 2002-10-23 | 2004-05-06 | Pioneer Corporation | 画像表示装置及び画像表示方法 |
WO2006035816A1 (ja) * | 2004-09-30 | 2006-04-06 | Pioneer Corporation | 立体的二次元画像表示装置 |
JP2006293878A (ja) * | 2005-04-14 | 2006-10-26 | Nippon Telegr & Teleph Corp <Ntt> | 画像表示システムおよび画像表示方法、ならびに画像表示プログラム |
JP2008538037A (ja) * | 2005-04-14 | 2008-10-02 | フオルクスヴアーゲン アクチエンゲゼルシヤフト | 交通手段における情報表示方法及び自動車用コンビネーションインストルメント |
JP2008040596A (ja) * | 2006-08-02 | 2008-02-21 | Mazda Motor Corp | 車両用情報表示装置 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014136380A1 (ja) * | 2013-03-04 | 2014-09-12 | 株式会社デンソー | タッチパネル操作装置及びタッチパネル操作装置における操作イベント判定方法 |
JP2014170390A (ja) * | 2013-03-04 | 2014-09-18 | Denso Corp | タッチパネル操作装置及びタッチパネル操作装置における操作イベント判定方法 |
US9720593B2 (en) | 2013-03-04 | 2017-08-01 | Denso Corporation | Touch panel operation device and operation event determination method in touch panel operation device |
JP2016535516A (ja) * | 2013-07-26 | 2016-11-10 | サムスン エレクトロニクス カンパニー リミテッド | 多視点映像処理装置及びその映像処理方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2012053032A1 (ja) | 2014-02-24 |
JP5465334B2 (ja) | 2014-04-09 |
DE112010005948B4 (de) | 2018-08-09 |
DE112010005948T5 (de) | 2013-07-25 |
US20130070065A1 (en) | 2013-03-21 |
US9083962B2 (en) | 2015-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5781080B2 (ja) | 3次元立体表示装置および3次元立体表示処理装置 | |
JP5709886B2 (ja) | 3次元立体表示装置および3次元立体表示信号生成装置 | |
JP5726201B2 (ja) | 3次元立体視表示装置、3次元立体視表示制御装置、およびlsi回路 | |
US9030465B2 (en) | Vehicle user interface unit for a vehicle electronic device | |
EP2672459B1 (en) | Apparatus and method for providing augmented reality information using three dimension map | |
US7825991B2 (en) | Multi-video display system | |
KR101830966B1 (ko) | 전자 기기 및 전자 기기의 컨텐츠 생성 방법 | |
US20120038626A1 (en) | Method for editing three-dimensional image and mobile terminal using the same | |
JP5914114B2 (ja) | 駐車支援装置、及び駐車支援方法 | |
JP6121131B2 (ja) | 多重表示装置 | |
JP5465334B2 (ja) | 3次元立体表示装置 | |
JP2004246455A (ja) | 操作画面表示装置 | |
JP5677168B2 (ja) | 画像表示システム、画像生成装置及び画像生成方法 | |
JP5955373B2 (ja) | 3次元立体表示装置および3次元立体表示信号生成装置 | |
KR101678447B1 (ko) | 이동 단말기 및 이동 단말기의 영상 표시 방법 | |
WO2024069779A1 (ja) | 制御システム、制御方法、および記録媒体 | |
JP2015161930A (ja) | 表示制御装置、表示制御方法、および表示制御システム | |
KR101864698B1 (ko) | 전자 기기 및 전자 기기의 제어 방법 | |
CN115824249A (zh) | Arhud显示方法、arhud显示装置及存储介质 | |
KR20120022352A (ko) | 이동 단말기 및 그 제어방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10858590 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2012539478 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 13701395 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 112010005948 Country of ref document: DE Ref document number: 1120100059489 Country of ref document: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 10858590 Country of ref document: EP Kind code of ref document: A1 |