WO2012053029A1 - Three-dimensional stereoscopic display device - Google Patents
- Publication number
- WO2012053029A1 PCT/JP2010/006186
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- eye
- video
- stereoscopic display
- processing unit
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/31—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
- H04N13/315—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers the parallax barriers being time-variant
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
- H04N13/359—Switching between monoscopic and stereoscopic modes
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
Definitions
- The present invention relates to a three-dimensional (3D) stereoscopic display device for displaying a 3D stereoscopic image or 3D stereoscopic video.
- The conventional stereoscopic display device disclosed in Patent Document 1 mainly provides three-dimensional stereoscopic images for home use.
- This stereoscopic display device is highly convenient because the viewer can see stereoscopic images without wearing dedicated glasses for stereoscopic viewing.
- Therefore, it is suitable as a content playback device for the passenger seat or as a display device for rear-seat entertainment (RSE: Rear Seat Entertainment).
- However, when the conventional technology represented by Patent Document 1 is applied to a display device that presents in-vehicle information or a meter panel to a driver, it cannot be used as it is without consideration for safety. For example, while driving the vehicle, the driver may be shown a moving 3D stereoscopic image, or a 3D stereoscopic image or video whose features, such as the 3D presentation, change in a disorderly manner. Therefore, safety considerations are necessary.
- The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a three-dimensional stereoscopic display device that ensures safety and is easy to see and easy to use.
- A three-dimensional stereoscopic display device according to the present invention is mounted on or carried into a moving body, and includes: a moving body state detection unit that detects the state of the moving body;
- a screen composition processing unit that generates right-eye and left-eye images or video for planar display or for three-dimensional stereoscopic display of the image or video to be displayed; a playback processing unit that plays back the images or video generated by the screen composition processing unit;
- and a three-dimensional display monitor unit that receives the reproduced display target images or video and displays them. The screen composition processing unit switches, according to the state of the moving body detected by the moving body state detection unit, between outputting right-eye and left-eye images or video for planar display to the playback processing unit and outputting right-eye and left-eye images or video for three-dimensional stereoscopic display to the playback processing unit.
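The switching described in the claim above can be sketched as follows. This is an illustrative reading of the claim, not code from the patent; the function and parameter names are assumptions.

```python
def compose_frames(stopped, render_stereo_pair, render_flat):
    """Select the image pair handed to the playback processing unit.

    stopped:            state reported by the moving body state detection unit
    render_stereo_pair: callable returning (right_eye, left_eye) with parallax
    render_flat:        callable returning a single planar image
    """
    if stopped:
        # Three-dimensional stereoscopic display: distinct right/left images.
        return render_stereo_pair()
    # Planar display: the same flat image serves as both eye images.
    flat = render_flat()
    return flat, flat
```

In planar mode the same object is returned for both eyes, which also reflects the load-reduction idea discussed later for planar display.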
- FIG. 4 is a flowchart showing the flow of the screen composition processing of the three-dimensional stereoscopic display device according to Embodiment 1. FIG. 5 is a diagram for explaining the screen composition processing for three-dimensional stereoscopic display, and FIG. 6 is a diagram showing the flow of data in the screen composition processing of FIG. 5.
- FIG. 1 is a block diagram showing a configuration example of a stereoscopic display system using a three-dimensional stereoscopic display device according to the present invention.
- FIG. 1A shows a stereoscopic display system 1A that displays a stereoscopic image from left and right images captured by a binocular camera.
- The stereoscopic display system 1A includes a left-eye camera 2a, a right-eye camera 2b, a recording/photographing device 3, a screen composition processing unit 4, a video reproduction device (playback processing unit) 5, and a stereoscopic display monitor (three-dimensional display monitor unit) 6.
- The left-eye camera 2a and the right-eye camera 2b are arranged side by side at an interval that takes binocular parallax into account, and photograph the scenery A under the control of the recording/photographing device 3.
- The left and right video data of the scenery A photographed by the left-eye camera 2a and the right-eye camera 2b are recorded in the recording/photographing device 3.
- the screen composition processing unit 4 subjects the left and right video data read from the recording / photographing device 3 to a three-dimensional stereoscopic video composition process unique to the present invention, and outputs the resultant to the video reproduction device 5.
- the video reproduction device 5 reproduces the left and right video data processed by the screen composition processing unit 4 and outputs it to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproduction device 5 in a stereoscopic manner as viewed from the viewer.
- A stereoscopic display system 1B shown in FIG. 1B includes a stereoscopic video content receiver 7 that communicates with an external device via an antenna 7a, a screen composition processing unit 4, a video reproduction device 5, and a stereoscopic display monitor 6.
- The stereoscopic video content receiver 7 is a receiver that receives stereoscopic video content including the left and right video data as described above from the external device via the antenna 7a.
- The screen composition processing unit 4 subjects the left and right video data of the stereoscopic video content received by the stereoscopic video content receiver 7 to the three-dimensional stereoscopic video composition processing unique to the present invention, and outputs the result to the video reproduction device 5.
- the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproduction device 5 in a stereoscopic manner as viewed from the viewer.
- A stereoscopic display system 1C shown in FIG. 1C includes a storage device 8 that stores stereoscopic display content, a screen composition processing unit 4, a video reproduction device 5, and a stereoscopic display monitor 6.
- the stereoscopic display content is content data including the left and right video data as described above.
- the storage device 8 may be an HDD (Hard Disk Drive) or a semiconductor memory that stores stereoscopic display content. Further, it may be a drive device that reproduces a storage medium such as a CD or DVD that stores stereoscopic display content.
- the screen composition processing unit 4 subjects the left and right video data of the stereoscopic display content read from the storage device 8 to the three-dimensional stereoscopic video composition processing unique to the present invention, and outputs it to the video reproduction device 5.
- the stereoscopic display monitor 6 displays the left and right video data reproduced by the video reproducing device 5 in a stereoscopic manner as viewed from the viewer.
- Alternatively, so-called three-dimensional data (for example, three-dimensional map data) may be stored as the stereoscopic display content, and the screen composition processing unit 4 may calculate how the image indicated by the three-dimensional data looks from the left and right viewpoints to generate the left and right video data.
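Generating left and right views from one set of three-dimensional data amounts to rendering from two viewpoints. The patent only states that the two views are computed; splitting a nominal viewpoint by ±d/2, as sketched below, is a common convention and an assumption here, not something the text specifies.

```python
def eye_positions(cx, cy, d):
    """Split one nominal viewpoint at (cx, cy) into right/left eye
    positions separated by the eye distance d (a common convention)."""
    right = (cx + d / 2.0, cy)
    left = (cx - d / 2.0, cy)
    return right, left
```

The three-dimensional scene would then be projected once from each of the two returned positions to obtain the right-eye and left-eye images.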
- FIG. 2 is a diagram for explaining the principle of stereoscopic display on the stereoscopic display monitor, and shows an example of stereoscopic display with the naked eye.
- the stereoscopic display monitor 6 shown in FIG. 2 includes a liquid crystal display element group 6a and a parallax barrier unit 6b.
- The liquid crystal display element group 6a includes a right-eye liquid crystal element group having directivity such that the right-eye image reaches the right eye, and a left-eye liquid crystal element group having directivity such that the left-eye image reaches the left eye.
- the parallax barrier unit 6b is a visual field barrier that blocks light from a backlight (not shown in FIG. 2) in order to alternately display a right-eye image and a left-eye image.
- The left and right video data reproduced by the video reproduction device 5 are input to the stereoscopic display monitor 6 as left-eye (L) and right-eye (R) video signals alternating as L, R, L, R, ...
- the liquid crystal display element group 6a operates the left-eye liquid crystal element group when the left-eye (L) video signal is input, and operates the right-eye liquid crystal element group when the right-eye (R) video signal is input.
- The parallax barrier unit 6b blocks the backlight light that has passed through the right-eye liquid crystal element group while the left-eye liquid crystal element group operates, and blocks the backlight light that has passed through the left-eye liquid crystal element group while the right-eye liquid crystal element group operates.
- In this way, the right-eye video and the left-eye video are alternately displayed on the screen of the stereoscopic display monitor 6, and a stereoscopic video can be viewed from the viewpoint of the viewer shown in FIG. 2.
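The alternating L, R, L, R, ... signal order fed to the monitor can be modeled with a short sketch. This is illustrative only; in the actual device this sequencing happens in the monitor hardware.

```python
def interleave_fields(left_frames, right_frames):
    """Yield (label, frame) pairs in the L, R, L, R, ... order in which
    the video signals are fed to the stereoscopic display monitor."""
    for l, r in zip(left_frames, right_frames):
        yield ("L", l)
        yield ("R", r)
```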
- the present invention is not limited to the stereoscopic display monitor 6 having the configuration shown in FIG. 2, and may be a monitor that realizes stereoscopic vision by another mechanism.
- For example, a method may be used in which a stereoscopic image is obtained by wearing dedicated glasses with different polarizing plates attached to the left and right lenses.
- FIG. 3 is a block diagram showing the configuration of the in-vehicle information system using the three-dimensional stereoscopic display device according to Embodiment 1 of the present invention.
- The in-vehicle information system 1 is a system that functions as the stereoscopic display systems shown in FIG. 1.
- The in-vehicle information system 1 includes a main CPU 4a, a video reproduction device 5, a stereoscopic display monitor 6, a GPS (Global Positioning System) receiver 9, a vehicle speed sensor (moving body state detection unit) 10, an internal memory 11, a CD/DVD drive device 12, an HDD 13, a radio receiver 14, a DTV receiver 15, an in-vehicle LAN_I/F unit 16, an operation input unit 18, an amplifier 19, and a speaker 20.
- The main CPU 4a is a CPU that controls each component in the in-vehicle information system 1, and executes the program 13d (an application program for in-vehicle information processing).
- The video reproduction device 5 is a device that reproduces the left and right video data synthesized by the screen composition processing unit 4 of the main CPU 4a and outputs it to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 is a monitor that displays the left and right video data reproduced by the video reproduction device 5 in a stereoscopic manner when viewed from the viewer.
- The GPS receiver 9 is a receiver that receives position information of the host vehicle from GPS satellites.
- the vehicle speed sensor 10 is a sensor that detects a vehicle speed pulse for calculating the vehicle speed of the host vehicle.
- the internal memory 11 is a memory serving as a work area when the main CPU 4a executes an application program for in-vehicle information processing.
- the CD / DVD drive device 12 is a device that plays back an AV source stored in a storage medium 12a such as a CD or a DVD.
- When the AV source stored in the storage medium 12a includes stereoscopic display video data, the CD/DVD drive device 12 functions as the stereoscopic video content receiver 7 shown in FIG. 1B, and the in-vehicle information system 1 functions as the stereoscopic display system 1B shown in FIG. 1B.
- The HDD (Hard Disk Drive) 13 is a large-capacity storage device mounted in the in-vehicle information system 1, and stores a map database (hereinafter abbreviated as map DB) 13a, icon data 13b, and a program 13d.
- map DB 13a is a database in which map data used in navigation processing is registered.
- The map data also includes POI information in which the locations of POIs (Points Of Interest) on the map and detailed information related to them are described.
- The icon data 13b is data indicating icons to be displayed on the screen of the stereoscopic display monitor 6; for example, stereoscopic image icons indicating features such as landmarks and buildings.
- the stereoscopic image icon includes icons indicating a vehicle position mark, a destination mark, a waypoint mark, and a route mark in navigation processing.
- the program 13d is an application program for in-vehicle information processing executed by the main CPU 4a.
- an application program for map display including a program module that realizes the function of the screen composition processing unit 4.
- the radio receiver 14 is a receiver that receives a radio broadcast.
- the radio receiver 14 is tuned according to an operation of a channel selection button (not shown).
- The DTV receiver 15 is a receiver that receives digital television broadcasts and, like the radio receiver 14, is tuned according to the operation of a channel selection button (not shown).
- When the DTV receiver 15 receives stereoscopic video content, it functions as the stereoscopic video content receiver 7 shown in FIG. 1B, and the in-vehicle information system 1 functions as the stereoscopic display system 1B shown in FIG. 1B.
- The in-vehicle LAN_I/F unit 16 is an interface between the in-vehicle LAN (Local Area Network) 17 and the main CPU 4a, and relays data communication between the main CPU 4a and other devices connected to the in-vehicle LAN 17. When the storage device 8 shown in FIG. 1C is connected to the in-vehicle LAN 17 and the in-vehicle LAN_I/F unit 16 relays between the storage device 8 and the screen composition processing unit 4 of the main CPU 4a, the in-vehicle information system 1 functions as the stereoscopic display system 1C shown in FIG. 1C.
- the operation input unit 18 is a configuration unit for a user to input an operation.
- Examples of the operation input unit 18 include a key switch (operation switch) or a remote controller provided near the screen of the stereoscopic display monitor 6, and a touch switch when a touch panel is provided on the screen of the stereoscopic display monitor 6.
- The audio signals reproduced by the CD/DVD drive device 12, the radio receiver 14, and the DTV receiver 15, as well as the audio signal from the main CPU 4a, are amplified by the amplifier 19 and output through the speaker 20.
- The audio signal from the main CPU 4a includes route guidance voice in navigation processing.
- Next, the operation will be described.
- The three-dimensional stereoscopic display device according to Embodiment 1 switches, according to the state of the moving body on which it is mounted, between a display in which a stereoscopic image appears to rise from a planar image by stereoscopic viewing and a display in which a three-dimensional image is drawn flat on the planar image.
- For example, when the host vehicle is stopped, a three-dimensional stereoscopic display is performed in which a stereoscopic image of a building or the like appears raised above the planar map; when the host vehicle is traveling, a three-dimensional image of the building or the like is simply drawn on the planar map.
- Examples of display targets include three-dimensionally expressed map information (three-dimensional contour lines, terrain models), in-vehicle information (screen icon images), and information outside the vehicle (vehicle information).
- FIG. 4 is a flowchart showing the flow of the screen composition processing of the 3D stereoscopic display device according to Embodiment 1. It shows the screen composition processing for selecting, depending on whether or not the host vehicle is stopped, between three-dimensional stereoscopic display and display of a three-dimensional image on the planar image.
- FIG. 5 is a diagram for explaining a screen composition process for three-dimensional stereoscopic display.
- FIG. 6 is a diagram showing a data flow in the screen composition process of FIG.
- FIGS. 5 and 6 will be referred to as appropriate.
- the main CPU 4a reads map data from the map DB 13a stored in the HDD 13, and generates planar map data Pic_plane according to a predetermined map drawing algorithm.
- The planar map data Pic_plane represents, for example, the planar map shown on the left side of FIG. 5.
- The main CPU 4a reads, from the icon data 13b stored in the HDD 13, icon data indicating a stereoscopic image icon of a three-dimensional landmark to be stereoscopically displayed on the planar map indicated by the planar map data Pic_plane.
- Assume that the position of the driver's right eye is a point Or (xr, yr, 0), the position of the left eye is a point Ol (xl, yl, 0), and the distance between the left and right eyes is d.
- The right-eye image data Pic_R (x, y) of the planar map is obtained where the straight line (vector Vr) connecting the point p2 on the apparent stereoscopic display surface P1 and the point Or (xr, yr, 0), the position of the right eye, intersects the screen Q of the stereoscopic display monitor 6 at a point pr. That is, the right-eye image data Pic_R (x, y) is a set of points pr obtained by mapping the points p2 at which the stereoscopic image icon of the “XX building” is located onto the screen Q of the stereoscopic display monitor 6.
- Similarly, the left-eye image data Pic_L (x, y) of the planar map is obtained where the straight line (vector Vl) connecting the point p2 on the apparent stereoscopic display surface P1 and the point Ol (xl, yl, 0), the position of the left eye, intersects the screen Q at a point pl. That is, the left-eye image data Pic_L (x, y) is a set of points pl obtained by mapping the points p2 at which the stereoscopic image icon of the “XX building” is located onto the screen Q of the stereoscopic display monitor 6.
- The screen composition processing unit 4 receives the above-described planar map data Pic_plane and icon data (step ST1). Next, the screen composition processing unit 4 determines whether or not the host vehicle is stopped based on the vehicle speed pulse input from the vehicle speed sensor 10 (step ST2). When the host vehicle is stopped (step ST2; YES), the screen composition processing unit 4 uses the parameters Z0 and d, which indicate the positional relationship between the driver and the screen Q of the stereoscopic display monitor 6 and the distance between the driver's eyes, together with the planar map data Pic_plane and the icon data, to calculate the points pr and pl so that the distance between the apparent map display surface P and the driver's eye positions Or, Ol becomes Z0 and the distance between the apparent stereoscopic display surface P1 and the driver's eye positions Or, Ol becomes (Z0 - z1). That is, the screen composition processing unit 4 generates right-eye image data Pic_R (x, y) and left-eye image data Pic_L (x, y) in which the stereoscopic image icon of the “XX building” is displayed stereoscopically by stereoscopic viewing.
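The mapping of a point p2 on the apparent display surface onto the screen Q follows directly from similar triangles. The sketch below is one interpretation of the geometry described above (eyes at z = 0, screen Q at z = Z0, apparent surface P1 at z = Z0 - z1), not code from the patent.

```python
def project_to_screen(eye_xy, p2_xy, z0, z1):
    """Intersect the ray from an eye through p2 with the screen plane.

    eye_xy: (x, y) of the eye position (Or or Ol) at z = 0
    p2_xy:  (x, y) of the point p2 on the apparent surface at z = z0 - z1
    z0:     distance from the eyes to the screen Q
    z1:     apparent pop-out height of the stereoscopic icon
    Returns the (x, y) of the corresponding point pr or pl on the screen.
    """
    ex, ey = eye_xy
    px, py = p2_xy
    t = z0 / (z0 - z1)  # similar-triangle scale factor along the ray
    return (ex + (px - ex) * t, ey + (py - ey) * t)
```

For example, with z0 = 100 and z1 = 50 the ray is stretched by a factor of 2, so a point p2 at (10, 5) maps to (20, 10) for an eye at the origin; evaluating the function for Or and Ol separately yields the point sets pr and pl.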
- On the other hand, when the host vehicle is traveling (step ST2; NO), the screen composition processing unit 4 uses the planar map data Pic_plane and the icon data to generate right-eye image data Pic_R (x, y) and left-eye image data Pic_L (x, y) for displaying a three-dimensional image of the “XX building” (an image drawn three-dimensionally on the planar map) in planar form (step ST4). In this case, the same three-dimensional image of the “XX building” for planar display is superimposed on the right-eye image and the left-eye image, respectively.
- Thereafter, the screen composition processing unit 4 outputs the generated right-eye image data Pic_R (x, y) and left-eye image data Pic_L (x, y) to the video reproduction device 5 (step ST5).
- the video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6.
- The stereoscopic display monitor 6 displays the left-eye image and the right-eye image using the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) reproduced by the video reproduction device 5 (step ST6). At this time, if the host vehicle is stopped, the driver sees the stereoscopic image icon of the “XX building” as if it were raised above the planar map by stereoscopic viewing (perceived at the height z1). On the other hand, if the host vehicle is traveling, a three-dimensional image of the “XX building” is simply displayed on the planar map.
- As the host vehicle travels, the processing shown in FIG. 4 is repeated for the planar map data that is continuously updated accordingly, so that a continuous moving image of the map can be obtained.
- Note that the determination that the vehicle is stopped is not limited to the vehicle speed pulse. For example, it may be determined that the vehicle is stopped, and a three-dimensional stereoscopic image may be displayed, only when the parking brake is applied and the vehicle speed is “0”.
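The stricter stop determination suggested above can be sketched as follows; the signal names are illustrative assumptions, not identifiers from the patent.

```python
def is_safely_stopped(vehicle_speed, parking_brake_applied):
    """Allow three-dimensional stereoscopic display only when the
    vehicle speed is 0 AND the parking brake is applied."""
    return vehicle_speed == 0 and parking_brake_applied
```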
- HMI (Human Machine Interface)
- Each of the images represented by the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) that express the three-dimensional image for planar display may be either the right-eye image or the left-eye image. In this case, a map for planar display and a three-dimensional icon image are displayed on the screen Q of the stereoscopic display monitor 6. By making the left and right images for planar display the same image in this way, the image generation load of the screen composition processing unit 4 can be reduced.
- As described above, according to Embodiment 1, the three-dimensional stereoscopic display device includes: the moving body state detection unit, such as the vehicle speed sensor 10, that detects the state of the host vehicle; the screen composition processing unit 4 that generates right-eye and left-eye images or video for planar display or three-dimensional stereoscopic display of the image or video to be displayed; the video reproduction device 5 that reproduces the images or video generated by the screen composition processing unit 4; and the stereoscopic display monitor 6 that receives the display target images or video reproduced by the video reproduction device 5 and displays them three-dimensionally. Since three-dimensional stereoscopic display and planar image display are switched according to the state of the host vehicle, it is possible to provide an in-vehicle information system 1 with three-dimensional stereoscopic display that ensures safety and is easy to see and easy to use.
- Embodiment 2. In Embodiment 2, when the moving body on which the device is mounted is stopped, a three-dimensional stereoscopic map display (a so-called street view) of the road on which the moving body travels is performed, and when the moving body is moving, a three-dimensional image for planar display is displayed.
- the screen composition processing unit generates a stereoscopic display screen that three-dimensionally displays the road on which the host vehicle travels when the host vehicle is stopped.
- The configuration is the same as that of Embodiment 1. Therefore, in the following description, the three-dimensional stereoscopic display device according to Embodiment 2 is assumed to be applied to the in-vehicle information system, and FIG. 3 is referred to for the configuration.
- FIG. 7 is a diagram for explaining a screen composition process for three-dimensionally displaying a three-dimensional image of a building or the like on a map display surface Pa of a three-dimensional map.
- FIG. 8 is a diagram showing the data flow in the screen composition process of FIG. 7. In the example of FIG. 7, the road on which the buildings 1 and 2 stand is displayed three-dimensionally.
- the main CPU 4a reads map data from the map DB 13a stored in the HDD 13 as shown in FIG. 8, and generates three-dimensional map data according to a predetermined map drawing algorithm.
- The three-dimensional map data represents, for example, the three-dimensional map shown on the left side of FIG. 7.
- The main CPU 4a reads icon data of stereoscopic image icons, such as buildings, to be displayed on the three-dimensional map from the icon data 13b stored in the HDD 13.
- The 3D map indicated by the 3D map data is displayed on the 3D map display surface Pa on the screen Q of the stereoscopic display monitor 6. Furthermore, display is performed so that the front surface (the side closest to the driver) of the stereoscopic image icon indicating each of the building 1 and the building 2 is included in the icon stereoscopic display surface Pb in front of the stereoscopic map display surface Pa.
- The icon stereoscopic display surface Pb is a display surface on which the front of the feature indicated by the stereoscopic image icon appears to be displayed by stereoscopic viewing, and is set in front of the stereoscopic map display surface Pa.
- The icon stereoscopic display surface Pb is separated from the stereoscopic map display surface Pa by a distance corresponding to the degree to which the icon appears to protrude from the stereoscopic map display surface Pa.
- In FIG. 7, when viewed from the driver, the building 1 appears to protrude from the stereoscopic map display surface Pa by a distance z3.
- The right-eye image data Pic_R (x, y) of the three-dimensional map is represented by a set of points pr at which the extension line (vector Vr) of the straight line connecting the point p2 (x, y, Z0-z3) on the icon stereoscopic display surface Pb and the point Or (xr, yr, 0), the position of the right eye, intersects the three-dimensional map display surface Pa.
- Similarly, the left-eye image data Pic_L (x, y) of the three-dimensional map is represented by a set of points pl at which the extension line (vector Vl) of the straight line connecting the point p2 (x, y, Z0-z3) on the icon stereoscopic display surface Pb and the point Ol (xl, yl, 0), the position of the left eye, intersects the three-dimensional map display surface Pa.
- The screen composition processing unit 4 uses the three-dimensional map data, the parameters Z0 and d, and the icon data, as in Embodiment 1, to calculate the points pr and pl so that the distance between the three-dimensional map display surface Pa and the driver's eye positions becomes Z0 and the distance between the icon stereoscopic display surface Pb of the buildings 1 and 2 and the driver's eye positions becomes (Z0 - z3).
- The video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 as described above, and outputs them to the stereoscopic display monitor 6.
- The stereoscopic display monitor 6 uses the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) reproduced by the video reproduction device 5 to three-dimensionally display the 3D map and the stereoscopic image icons of the buildings. At this time, to the driver, the icon images of the buildings 1 and 2 appear to protrude from the three-dimensional map by stereoscopic viewing.
- On the other hand, when the host vehicle is traveling, the screen composition processing unit 4 generates a right-eye image and a left-eye image that display the three-dimensional images of the buildings 1 and 2 on the planar map, using the planar map data and the icon data as in Embodiment 1. In this case, the stereoscopic display monitor 6 displays the three-dimensional images of the buildings 1 and 2 on the planar map.
- FIG. 9 is a diagram for explaining a screen composition process for three-dimensionally displaying a stereoscopic image of a mountain or the like, represented by contour lines, on the map display surface P of a planar map.
- FIG. 10 is a diagram showing the data flow in the screen composition process of FIG. 9. In FIG. 9, the “XX mountain” is three-dimensionally displayed on the planar map when the host vehicle is stopped.
- the main CPU 4a reads map data from the map DB 13a stored in the HDD 13, and generates planar map data Pic_plane according to a predetermined map drawing algorithm.
- The planar map data Pic_plane represents, for example, the planar map shown on the left side of FIG. 9.
- The main CPU 4a reads icon data of features such as mountains to be displayed on the planar map from the icon data 13b stored in the HDD 13.
- The planar map indicated by the planar map data is displayed on the map display surface P on the screen Q of the stereoscopic display monitor 6. Furthermore, display is performed so that the summit (the side closest to the driver) of the stereoscopic image icon indicating the “XX mountain” is included in the icon stereoscopic display surface Pb, which is separated from the map display surface P by a distance h corresponding to the height of the “XX mountain”. In FIG. 9, when viewed from the driver, the “XX mountain” appears to protrude from the map display surface P by the distance h by stereoscopic viewing.
- The right-eye image data Pic_R (x, y) of the planar map is represented by a set of points pr at which the extension line (vector Vr) of the straight line connecting the point p2 (x, y, Z0-h) on the icon stereoscopic display surface Pb and the point Or (xr, yr, 0), the position of the right eye, intersects the map display surface P.
- Similarly, the left-eye image data Pic_L (x, y) of the planar map is represented by a set of points pl at which the extension line (vector Vl) of the straight line connecting the point p2 (x, y, Z0-h) on the icon stereoscopic display surface Pb and the point Ol (xl, yl, 0), the position of the left eye, intersects the map display surface P.
- The screen composition processing unit 4 uses the planar map data, the parameters Z0, d, and h, and the icon data, as in Embodiment 1, to calculate the points pr and pl so that the distance between the map display surface P and the driver's eye positions becomes Z0 and the distance between the icon stereoscopic display surface Pb of the “XX mountain” and the driver's eye positions becomes (Z0 - h), thereby generating the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y), and outputs them to the video reproduction device 5.
- the video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 as described above, and outputs them to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 three-dimensionally displays the stereoscopic image icon of "△△ mountain" using the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) reproduced by the video reproduction device 5. At this time, the icon image of "△△ mountain" appears to the driver to rise above the planar map through stereoscopic viewing.
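The mapping from a point on the apparent icon display surface to the on-screen points pr and pl described above is a ray–plane intersection: extend the line from each eye through the point until it meets the screen at depth Z0. A minimal sketch of this geometry, in Python; the variable names follow the text, while the numeric values and the function itself are illustrative assumptions, not part of the patent:

```python
def project_to_screen(eye, p, z0):
    """Intersect the ray from the eye position through point p
    with the screen plane z = z0 (the map display surface P)."""
    ex, ey, ez = eye
    px, py, pz = p
    t = (z0 - ez) / (pz - ez)          # ray parameter at the screen plane
    return (ex + t * (px - ex), ey + t * (py - ey), z0)

# Eyes at z = 0, screen at depth Z0, icon top raised by h in front of the screen.
Z0, h, d = 100.0, 20.0, 6.0
Or_ = (d / 2, 0.0, 0.0)                # right eye position Or
Ol = (-d / 2, 0.0, 0.0)                # left eye position Ol
p2 = (0.0, 0.0, Z0 - h)                # top of the mountain icon on surface Pb

pr = project_to_screen(Or_, p2, Z0)    # right-eye image point
pl = project_to_screen(Ol, p2, Z0)     # left-eye image point
```

Because pl lands to the right of pr (crossed disparity), the fused point is perceived in front of the screen by the distance h.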
- when the moving body state detection unit detects that the host vehicle is moving, the screen composition processing unit 4 generates right-eye and left-eye images or video for planar display of the image or video to be displayed; when it is detected that the host vehicle is stopped, it generates right-eye and left-eye images or video for three-dimensional stereoscopic display of the image or video to be displayed.
- In this way, the map is displayed as a three-dimensional stereoscopic image through stereoscopic viewing while the vehicle is stopped, and the normal planar map and the stereoscopic image icons are displayed as planar images while traveling; safety is thus ensured, and an in-vehicle information system 1 with three-dimensional stereoscopic display that is easy to see and safe can be provided.
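The switching behaviour just described can be summarised as a small dispatch on the detected vehicle state. The following Python sketch is an illustration only; the function names and the stand-in renderer are assumptions, not the patent's implementation:

```python
def compose_frames(stopped, scene):
    """Return a (right, left) frame pair: a true stereo pair while
    the vehicle is stopped, identical planar renderings while moving."""
    if stopped:
        right = render_view(scene, eye="right")   # 3D stereoscopic pair
        left = render_view(scene, eye="left")
    else:
        flat = render_view(scene, eye="center")   # planar 3D rendering
        right = left = flat                       # same image to both eyes
    return right, left

def render_view(scene, eye):
    # stand-in renderer: tag the scene with the viewpoint
    return f"{scene}@{eye}"
```

Sending the same image to both eyes while moving yields an ordinary flat display on the same stereoscopic monitor, which is how the planar mode can share the hardware path of the stereoscopic mode.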
- the three-dimensional building has been described as an icon in the first and second embodiments, but the three-dimensional building may instead be stored in the icon data 13b or the map DB 13a as three-dimensional data.
- This three-dimensional data is stored in, for example, the OpenGL format, a standard data format for storing three-dimensional images.
- the image for three-dimensional stereoscopic display may be composed by the screen composition processing unit 4 based on this three-dimensional data.
- while the host vehicle is traveling, the three-dimensional building may be three-dimensionally rendered on the plane in the OpenGL format for planar display.
- Embodiment 3.
- in Embodiment 3, software buttons for operation input, such as icons, are three-dimensionally displayed on another apparent display surface parallel to the map display surface of the planar map.
- the screen composition processing unit generates a stereoscopic display screen that stereoscopically displays software keys for operation input, such as icons, on another apparent display surface parallel to the apparent map display surface, but the basic configuration is the same as in the first embodiment. Therefore, in the following description, the 3D stereoscopic display device according to Embodiment 3 is applied to the in-vehicle information system, and FIG. 3 is referred to for the configuration.
- FIG. 11 is a diagram for explaining a screen composition process in which the apparent display surface of the icons is located in front of the map display surface of the planar map.
- FIG. 12 is a diagram showing the data flow in the screen composition process of FIG. 11. In the example of FIG. 11, the icons of the determination button and the return button are displayed on the icon stereoscopic display surface Pb in front of the apparent map display surface P of the planar map.
- the main CPU 4a reads map data from the map DB 13a stored in the HDD 13, and generates planar map data Pic_plane according to a predetermined map drawing algorithm.
- the planar map data Pic_plane represents, for example, the planar map described on the left side of FIG.
- the main CPU 4 a reads icon data of icons to be displayed on the plane map indicated by the plane map data Pic_plane from the icon data 13 b stored in the HDD 13.
- the determination button and the return button that are operation icons are displayed on the icon solid display surface Pb in front of the map display surface P of the planar map.
- the distance between the map display surface P of the planar map and the icon stereoscopic display surface Pb is dz. That is, to the driver, the icons of the determination button and the return button appear to float above the planar map by the distance dz through stereoscopic viewing.
- the right-eye image data Pic_R (x, y) of the planar map is represented by the set of points pr at which the extension (vector Vr) of the straight line connecting a point p (x, y, Z0-dz) on the icon stereoscopic display surface Pb and the point Or (xr, yr, 0), the position of the right eye, intersects the map display surface P of the planar map (the same as the screen Q of the stereoscopic display monitor 6).
- Similarly, the left-eye image data Pic_L (x, y) of the planar map is represented by the set of points pl (xl, yl, Z0) at which the straight line connecting the point p (x, y, Z0-dz) on the icon stereoscopic display surface Pb and the point Ol (xl, yl, 0), the position of the left eye, intersects the map display surface P.
- the screen composition processing unit 4 uses the planar map data Pic_plane, the parameters Z0, d, and dz, and the icon data, as described in the first embodiment. The points pr and pl are calculated such that the distance between the map display surface P and the driver's eye position is Z0 and the distance between the icon stereoscopic display surface Pb and the driver's eye position is (Z0-dz), generating right-eye image data Pic_R (x, y) and left-eye image data Pic_L (x, y), which are output to the video reproduction device 5.
- the video reproduction device 5 reproduces the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) generated by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 stereoscopically displays the planar map and the icons using the right-eye image data Pic_R (x, y) and the left-eye image data Pic_L (x, y) reproduced by the video reproduction device 5. At this time, the icon images appear to the driver to float above the planar map indicated by the planar map data Pic_plane through stereoscopic viewing.
- the icon selected by a user operation may be emphasized by making its distance from the map display surface P larger than the distance dz used for the stereoscopic display of the other icons, and additionally by changing its color.
- alternatively, the distance dz of the icons other than the one selected by the user operation may be made shorter than that of the selected icon.
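The apparent floating distance dz maps directly to an on-screen disparity between pr and pl: with the eyes a distance d apart and the screen at depth Z0, the horizontal offset between the two image points of an icon raised by dz works out to d·dz/(Z0−dz). A sketch of that relation follows; the parameter names d, Z0, and dz come from the text, while the function and the numeric values are illustrative assumptions:

```python
def screen_disparity(d, z0, dz):
    """Horizontal offset between the left-eye and right-eye image
    points of an icon floating dz in front of the screen at depth z0."""
    t = z0 / (z0 - dz)             # projection scale from surface Pb to the screen
    return d * (t - 1.0)           # equals d * dz / (z0 - dz)

# A selected icon can be emphasised by giving it a larger dz,
# which increases its crossed disparity and hence its pop-out.
d, Z0 = 6.0, 60.0
normal = screen_disparity(d, Z0, dz=10.0)     # unselected icons
selected = screen_disparity(d, Z0, dz=15.0)   # selected icon floats higher
```

The disparity grows faster than linearly in dz, so raising only the selected icon gives it a visibly distinct depth without moving the others.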
- as described above, the device includes: a moving body state detection unit, such as the vehicle speed sensor 10, that detects the state of the host vehicle;
- a screen composition processing unit 4 that generates right-eye and left-eye images or video for planar display or three-dimensional stereoscopic display of the image or video to be displayed;
- a video reproduction device 5 that reproduces the images or video generated by the screen composition processing unit 4; and
- a stereoscopic display monitor 6 that receives the three-dimensional image and performs three-dimensional stereoscopic display.
- the screen composition processing unit 4 generates, according to the state of the host vehicle detected by the moving body state detection unit, right-eye and left-eye images or video for three-dimensional stereoscopic display in which the position of the apparent display surface for three-dimensional stereoscopic display of the image or video to be displayed is changed.
- in this way, a stereoscopic image icon is displayed on the front side (driver side) of the map display surface P of the underlying planar map.
- Embodiment 4. In the first to third embodiments, the screen composition process is realized by software processing on the main CPU 4a. In the fourth embodiment, a configuration in which the screen composition process is performed at high speed by hardware logic is described.
- FIG. 13 is a block diagram showing a configuration of an in-vehicle information system using a three-dimensional stereoscopic display device according to Embodiment 4 of the present invention.
- the in-vehicle information system 1a differs from that of the first embodiment shown in FIG. 3 in that a stereoscopic image generation processing core 40, which performs the screen composition process at high speed with hardware logic, is configured on the same LSI as the main CPU 4a.
- the hardware logic for performing the screen composition process may instead be configured as a dedicated LSI connected to the main CPU 4a as the stereoscopic image generation processing core 40, rather than residing on the same LSI.
- FIG. 14 is a block diagram showing the configuration of the stereoscopic image generation processing core.
- the stereoscopic image generation processing core 40 includes a stereoscopic image calculation circuit 41, a planar image memory plane 42, a three-dimensional stereoscopic image data plane 43, a right-eye image memory plane 44a, and a left-eye image memory plane 44b.
- the planar image memory plane 42 is a memory plane that receives and stores planar image data indicating a planar image serving as display content.
- planar map data is input from the map DB 13a of the HDD 13.
- the three-dimensional stereoscopic image data plane 43 is a memory plane that inputs and stores stereoscopic image data indicating a stereoscopic image as display contents.
- icon stereoscopic image data is input from the icon data 13 b of the HDD 13.
- the stereoscopic image calculation circuit 41 is a circuit that receives the planar map data from the planar image memory plane 42, the stereoscopic image data from the three-dimensional stereoscopic image data plane 43, and the parameters Z0, d, and dz from the internal memory 11 and the like, and uses them to execute the same screen composition processing as in the first to third embodiments.
- the right-eye image memory plane 44 a is a memory plane that inputs and stores right-eye image data obtained as a result of the screen composition processing of the stereoscopic image calculation circuit 41.
- the left-eye image memory plane 44b is a memory plane for inputting and storing left-eye image data obtained as a result of the screen composition processing of the stereoscopic image calculation circuit 41.
- the right-eye image memory plane 44a and the left-eye image memory plane 44b output the right-eye image data and the left-eye image data to the video reproduction device 5 at a predetermined timing.
- the stereoscopic image calculation circuit 41 selects either three-dimensional stereoscopic display or a three-dimensional image for planar display according to the speed information of the host vehicle given as vehicle information. If the host vehicle is stopped, three-dimensional stereoscopic display, in which features appear to rise above the map, is selected; if the vehicle is traveling, the three-dimensional image that displays the features on the flat map is selected.
- the planar image data stored in the planar image memory plane 42 may be expressed in a two-dimensional drawing library such as OpenVG, for example.
- the stereoscopic image data stored in the three-dimensional stereoscopic image data plane 43 may be expressed by a three-dimensional drawing library such as OpenGL, for example. Since the expression in these libraries is a standard I / F, the convenience of the stereoscopic image generation processing core 40 can be improved.
- the stereoscopic image generation processing core 40 which is a dedicated LSI for performing the hardware synthesis for the screen synthesis process, is provided, the image synthesis process can be executed at high speed. .
- control is performed so that three-dimensional stereoscopic images are not displayed while the host vehicle is traveling; however, specific icons, such as operation input icons or direction icons, may be displayed three-dimensionally even while traveling. For example, only icons whose associated functions may be operated while traveling are displayed three-dimensionally. That is, when an operation on an icon image causes the function corresponding to that icon image to be executed, the screen composition processing unit 4 generates right-eye and left-eye images for three-dimensional stereoscopic display only of the icon images corresponding to functions permitted to be executed while the host vehicle is traveling, and the stereoscopic display monitor 6 three-dimensionally displays the right-eye and left-eye images of those icon images reproduced by the video reproduction device 5.
- FIG. 15 is a block diagram showing a configuration of an in-vehicle information system using a three-dimensional stereoscopic display device according to Embodiment 5 of the present invention.
- the in-vehicle information system 1 includes a left-eye camera 2a, a right-eye camera 2b, and a camera signal input device 2c in addition to the configuration of FIG. 3 described in the first embodiment, and functions as the stereoscopic display system 1A shown in FIG. 1.
- this in-vehicle information system 1 operates in substantially the same manner as the system described in the first embodiment, but differs in that the back monitor image is automatically displayed as a three-dimensional stereoscopic image when the host vehicle moves backward.
- when it is detected that the host vehicle is moving backward, the screen composition processing unit 4 instructs the camera signal input device 2c to capture the right-eye and left-eye images of the area behind the host vehicle taken by the right-eye camera 2b and the left-eye camera 2a.
- reverse movement of the host vehicle may be detected, for example, by a sensor that recognizes the shift position of the shift lever, or by a circuit that detects the lighting of a back lamp illuminating the rear of the host vehicle. These sensors function as the moving body state detection unit by notifying the screen composition processing unit 4 of the detection result via the in-vehicle LAN_I/F unit 16.
- the screen composition processing unit 4 applies three-dimensional stereoscopic image composition to the right-eye image data and left-eye image data of the area behind the host vehicle acquired from the camera signal input device 2c to produce the back monitor image, and outputs the resulting video to the video reproduction device 5.
- character information indicating the distance between the host vehicle and an obstacle behind it is subjected to the three-dimensional stereoscopic image composition process so as to be displayed two-dimensionally.
- the video reproduction device 5 reproduces the back monitor image input from the screen composition processing unit 4 and outputs it to the stereoscopic display monitor 6.
- the stereoscopic display monitor 6 uses the right-eye image data and the left-eye image data behind the host vehicle reproduced by the video playback device 5 to display the captured image behind the host vehicle in a three-dimensional manner.
- the screen composition processing unit 4 performs the composition processing of the three-dimensional stereoscopic image so that the character information indicating the distance between the host vehicle and an obstacle behind it is displayed two-dimensionally on the screen of the stereoscopic display monitor 6. In this way, the real-time video and the character information are clearly separated, and an easy-to-use parking support system can be realized.
- the distance between the host vehicle and the obstacle can be obtained using a back sensor (not shown) that detects obstacles behind the host vehicle, or by image recognition on the video captured by the cameras 2a and 2b.
- a guideline that is used for parking assistance and that indicates the direction in which the host vehicle should move may be displayed in a three-dimensional manner. That is, when it is detected that the host vehicle is moving backward, the screen composition processing unit 4 superimposes a three-dimensional solid line guideline on the image acquired from the camera signal input device 2c.
- the stereoscopic display monitor 6 three-dimensionally displays the three-dimensional stereoscopic image, reproduced by the video reproduction device 5, on which the guideline is superimposed.
- the back camera unit, configured by the cameras 2a and 2b and the camera signal input device 2c, captures the rear of the host vehicle and obtains a three-dimensional stereoscopic image of the captured subject.
- the screen composition processing unit 4 outputs the three-dimensional stereoscopic image obtained by the back camera unit to the video reproduction device 5 when it is detected that the host vehicle is moving backward, and the stereoscopic display monitor 6 A three-dimensional stereoscopic image reproduced by the video reproduction device 5 is input and three-dimensional stereoscopic display is performed.
- the screen composition processing unit 4 superimposes the guideline indicating the direction of movement of the host vehicle moving backward, as a three-dimensional solid line, on the three-dimensional stereoscopic image obtained by the back camera unit.
- the 3D display monitor 6 inputs the 3D stereoscopic image on which the guideline reproduced by the video reproduction device 5 is superimposed and displays the 3D stereoscopic image.
- the icon for operation input may be displayed in multiple stages in order to indicate that the three-dimensionally displayed icon has been operated.
- for example, the distance dz between the map display surface P of the planar map and the icon stereoscopic display surface Pb is shortened each time the button is pressed. This makes it easy to visually recognize that the three-dimensionally displayed icon has been operated.
- in the above, whether three-dimensional stereoscopic display is performed is determined by whether the host vehicle is traveling or stopped. However, the degree to which the three-dimensionally displayed image appears to rise in front of the screen may instead be changed according to the vehicle speed of the host vehicle (the moving speed of the moving body that holds the device). That is, the screen composition processing unit 4 changes the position of the apparent display surface for three-dimensional stereoscopic display of the image or video to be displayed according to the vehicle speed of the host vehicle, and the stereoscopic display monitor 6 displays the resulting right-eye and left-eye images or video for three-dimensional stereoscopic display.
- for example, in FIG. 5, the distance representing the degree of stereoscopic rise is set to z1 according to the height of the "XX building"; while the host vehicle is traveling, this distance z1 may be divided by a predetermined value D1.
- also, the distance representing the degree to which the icon that appears to rise most toward the viewer in stereoscopic view (for example, the icon of the tallest feature) rises may be set to a predetermined value D2.
- when the traveling speed of the host vehicle exceeds a predetermined threshold, the values D1 and D2 may be changed so that the degree of rise through stereoscopic viewing is reduced.
- in addition, the user may be allowed to freely set the distance representing the degree to which icons and landmarks on the map that are three-dimensionally displayed while the host vehicle is traveling appear to rise.
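The speed-dependent adjustment above (dividing z1 by D1 while moving, capping the largest rise at D2, and reducing both beyond a speed threshold) could be sketched as follows. D1, D2, and the threshold correspond to the predetermined values in the text; the concrete policy, numbers, and function are assumed for illustration:

```python
def popout_distance(z1, speed, d1=2.0, d2=15.0, threshold=60.0):
    """Scale the stereoscopic rise z1 of an icon by vehicle speed:
    full rise when stopped, z1 / D1 while moving, clamped to D2,
    and halved again above the speed threshold (one possible policy)."""
    if speed == 0.0:
        return z1                       # stopped: full 3D pop-out
    z = z1 / d1                         # travelling: reduce the rise
    z = min(z, d2)                      # tallest feature capped at D2
    if speed > threshold:
        z *= 0.5                        # high speed: reduce further
    return z

raised = popout_distance(20.0, speed=30.0)   # z1 halved while travelling
```

Keeping the rise a monotonically non-increasing function of speed matches the text's intent that pop-out is strongest when it is safest to look at.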
- in the above, three-dimensional stereoscopic display or planar display is switched depending on whether the host vehicle is traveling or stopped, but day and night may also be judged based on the illumination signal, and the degree of switching of the three-dimensional stereoscopic image or of icon pop-up may be changed accordingly.
- alternatively, a camera that captures the outside of the vehicle may be provided in the host vehicle, and an image recognition determination unit (moving body state detection unit) that judges the weather and day or night by recognizing the conditions outside the vehicle and the illumination signals from the camera's captured image may be provided in the main CPU 4a.
- the screen composition processing unit 4 then changes the degree of switching of the three-dimensional stereoscopic image and of the icons depending on the detection result of this determination unit.
- the planar map is displayed in a three-dimensional manner.
- in the above, display examples generally shown on an in-vehicle information system have been described, but the invention may also be applied to the AV system menu screen, vehicle information, safety information, and the like. For example, it may be used for displaying air-conditioner control icons, the dashboard meter panel, vehicle fuel consumption, preventive safety information, VICS (trademark) information, and so on.
- in Embodiments 1 to 5 described above, stereoscopic display viewed with the naked eye is shown, but a stereoscopic display system that obtains a stereoscopic image using polarized glasses may be used.
- the present invention is applicable to any display device having a stereoscopic display monitor as described above.
- the present invention can be applied not only to a vehicle-mounted navigation device but also to a display device of an RSE (Rear Seat Entertainment), a television receiver, a mobile phone terminal, or a personal digital assistant (PDA).
- the present invention may be applied to a display device such as a PND (Portable Navigation Device) that is carried and used by a person in a moving body such as a vehicle, a railway, a ship, or an aircraft.
- in this case, whether the moving body is moving is determined using detection information from an acceleration sensor built into the PND or the mobile phone terminal.
- when the three-dimensional stereoscopic display device of the present invention is applied to a mobile phone terminal, the three-dimensional stereoscopic display is changed depending on whether the user carrying it is walking.
- as described above, the three-dimensional stereoscopic display device according to the present invention improves the visibility of three-dimensional stereoscopic images and video, and is therefore suitable for use as the display device of an in-vehicle information system.
Description
Embodiment 1.
FIG. 1 is a block diagram showing a configuration example of a stereoscopic display system using the three-dimensional stereoscopic display device according to the present invention. FIG. 1(a) shows a stereoscopic display system 1A that displays stereoscopic video from the left and right video captured by binocular cameras. In FIG. 1(a), the stereoscopic display system 1A includes a left-eye camera 2a, a right-eye camera 2b, a recording/imaging device 3, a screen composition processing unit 4, a video reproduction device (reproduction processing unit) 5, and a stereoscopic display monitor (stereoscopic display monitor unit) 6.
The left-eye camera 2a and the right-eye camera 2b are arranged side by side at an interval that accounts for binocular parallax, and capture the scene A under the control of the recording/imaging device 3. The left and right video data of the scene A captured by the left-eye camera 2a and the right-eye camera 2b are recorded in the recording/imaging device 3. The screen composition processing unit 4 applies the three-dimensional stereoscopic video composition processing characteristic of the present invention to the left and right video data read from the recording/imaging device 3, and outputs the result to the video reproduction device 5.
The video reproduction device 5 reproduces the left and right video data processed by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6. The stereoscopic display monitor 6 displays the reproduced left and right video data stereoscopically as seen from the viewer.
The screen composition processing unit 4 applies the three-dimensional stereoscopic video composition processing characteristic of the present invention to the left and right video data of the stereoscopic video content received by the stereoscopic video content receiver 7, and outputs the result to the video reproduction device 5. As in FIG. 1(a), the stereoscopic display monitor 6 stereoscopically displays the left and right video data reproduced by the video reproduction device 5 as seen from the viewer.
Alternatively, so-called three-dimensional data (for example, three-dimensional map data) may be stored as stereoscopic display content, and the screen composition processing unit 4 may compute how the image represented by this three-dimensional data appears from the left and right viewpoints to generate the left and right video data.
The map DB 13a is a database in which map data used in navigation processing is registered. The map data also includes POI (Point Of Interest) information describing the locations of POIs on the map and detailed information related to them.
The icon data 13b is data indicating icons to be displayed on the screen of the stereoscopic display monitor 6, for example stereoscopic image icons representing three-dimensional images of features and buildings such as landmarks. The stereoscopic image icons also include icons indicating the host-vehicle position mark, the destination mark, waypoint marks, and route marks used in navigation processing.
The program 13d is an application program for in-vehicle information processing executed by the main CPU 4a, for example a map display application program including a program module that implements the functions of the screen composition processing unit 4.
The DTV receiver 15 is a receiver for digital television broadcasts, and, like the radio receiver 14, a channel is selected by operating a channel selection button (not shown). When the received digital television broadcast contains three-dimensional stereoscopic display video data, the DTV receiver 15 functions as the stereoscopic video content receiver 7 shown in FIG. 1(b), and the in-vehicle information system 1 functions as the stereoscopic display system 1B shown in FIG. 1(b).
When displaying a stereoscopic image on a planar image, the three-dimensional stereoscopic display device according to Embodiment 1 either performs three-dimensional stereoscopic display, in which the stereoscopic image appears to rise above the planar image through stereoscopic viewing, or displays a three-dimensional image for planar display, depending on the state of the moving body on which the device is mounted.
For example, in the map display of an in-vehicle navigation device, while the host vehicle is stopped (parking brake applied), three-dimensional stereoscopic display is performed in which stereoscopic images of buildings and the like appear to rise above the planar map; when the host vehicle is in a traveling state (traveling, or stopped without the parking brake), three-dimensional images of buildings and the like are displayed on the planar map.
FIG. 5 is a diagram for explaining the screen composition process for three-dimensional stereoscopic display, and FIG. 6 is a diagram showing the data flow in the screen composition process of FIG. 5. The details of the screen composition process are described below along FIG. 4, referring to FIGS. 5 and 6 as appropriate.
The main CPU 4a also reads, from the icon data 13b stored in the HDD 13, icon data indicating the stereoscopic image icon of a three-dimensional landmark to be stereoscopically displayed on the planar map indicated by the planar map data Pic_plane.
Let z1 be the distance representing the degree to which the "XX building" located at point p2(x, y) on the planar map indicated by the planar map data Pic_plane appears to rise in stereoscopic view, according to its height. The top surface of this building is then contained in the apparent stereoscopic display surface P1 and is represented by the point p2(x, y, Z0-z1) on the stereoscopic display surface P1.
Similarly, the left-eye image data Pic_L(x, y) of the planar map is represented by the set of points pl at which the straight line (vector Vl) connecting the point p2 on the apparent stereoscopic display surface P1 and the point Ol(xl, yl, 0), the position of the left eye, intersects the screen Q of the stereoscopic display monitor 6. In other words, the left-eye image data Pic_L(x, y) is the set of points pl obtained by mapping the point p2, at which the stereoscopic image icon of the "XX building" is located, onto the screen Q of the stereoscopic display monitor 6.
Next, the screen composition processing unit 4 determines whether the host vehicle is stopped based on the vehicle speed pulses input from the vehicle speed sensor 10 (step ST2). If the host vehicle is stopped (step ST2; YES), the screen composition processing unit 4 uses the parameters Z0 and d, which indicate the positional relationship between the driver and the screen Q of the stereoscopic display monitor 6 and the spacing of the driver's eyes, together with the planar map data Pic_plane and the icon data, to calculate the points pr and pl so that the distance between the apparent map display surface P and the driver's eye positions Or, Ol is Z0 and the distance between the stereoscopic display surface P1 and the driver's eye positions Or, Ol is (Z0-z1). That is, the screen composition processing unit 4 generates right-eye image data Pic_R(x, y) and left-eye image data Pic_L(x, y) in which the stereoscopic image icon of the "XX building" is stereoscopically displayed through stereoscopic viewing.
The video reproduction device 5 reproduces the right-eye image data Pic_R(x, y) and the left-eye image data Pic_L(x, y) generated by the screen composition processing unit 4 and outputs them to the stereoscopic display monitor 6. The stereoscopic display monitor 6 stereoscopically displays the left-eye image and the right-eye image using the reproduced right-eye image data Pic_R(x, y) and left-eye image data Pic_L(x, y) (step ST6).
At this time, if the host vehicle is stopped, the stereoscopic image icon of the "XX building" appears to the driver to rise above the planar map through stereoscopic viewing (it is perceived at the height z1). On the other hand, if the host vehicle is traveling, a three-dimensional image of the "XX building" is displayed on the planar map.
In the above description, whether the host vehicle is stopped is determined based on the vehicle speed pulses input from the vehicle speed sensor 10, but the determination is not limited to this. For example, although not shown, the vehicle may be judged to be stopped, and the three-dimensional stereoscopic image displayed, when the parking brake is applied and the vehicle speed is "0". Furthermore, since HMI (Human Machine Interface) safety regulations differ by country and state, whether a planar image or a three-dimensional stereoscopic image is displayed may be switched based on the relevant vehicle state.
In Embodiment 2, when the moving body carrying the device is stopped, a three-dimensional stereoscopic map display of the road on which the moving body travels (a so-called street view) is provided through stereoscopic viewing, and when the moving body is in a moving state, a three-dimensional image for planar display is displayed.
In the three-dimensional stereoscopic display device according to Embodiment 2, the screen composition processing unit generates a stereoscopic display screen that three-dimensionally displays the road on which the host vehicle travels while the host vehicle is stopped; its basic configuration is the same as in Embodiment 1. Accordingly, in the following description, the three-dimensional stereoscopic display device according to Embodiment 2 is applied to the in-vehicle information system, and FIG. 3 is referred to for its configuration.
In Embodiment 2, a three-dimensional stereoscopic image in which stereoscopic images of buildings and the like are three-dimensionally displayed on the map display surface of a stereoscopic map is composed and displayed.
FIG. 7 is a diagram for explaining the screen composition process for three-dimensionally displaying stereoscopic images of buildings and the like on the map display surface Pa of a stereoscopic map, and FIG. 8 is a diagram showing the data flow in the screen composition process of FIG. 7. In the example of FIG. 7, a road with building 1 and building 2 alongside it is displayed as a three-dimensional stereoscopic image.
The icon stereoscopic display surface Pb is the display surface on which the front face of the feature indicated by a stereoscopic image icon appears to be displayed through stereoscopic viewing, and is set parallel to and in front of the stereoscopic map display surface Pa. The icon stereoscopic display surface Pb is separated from the stereoscopic map display surface Pa by a distance corresponding to the degree to which it rises in front of that surface. In FIG. 7, building 1 appears to the driver to rise above the stereoscopic map display surface Pa by the distance z3.
Similarly, the left-eye image data Pic_L(x, y) of the three-dimensional map is represented by the set of points pl at which the extension (vector Vl) of the straight line connecting the point p2(x, y, Z0-z3) on the icon stereoscopic display surface Pb and the point Ol(xl, yl, 0), the position of the left eye, intersects the stereoscopic map display surface Pa.
FIG. 9 is a diagram for explaining the screen composition process for three-dimensionally displaying stereoscopic images of mountains and the like represented by contour lines on the map display surface P of a stereoscopic map, and FIG. 10 is a diagram showing the data flow in the screen composition process of FIG. 9. In FIG. 9, "△△ mountain" is three-dimensionally displayed on the planar map while the host vehicle is stopped.
The image for three-dimensional stereoscopic display may also be composed by the screen composition processing unit 4 based on this three-dimensional data. Furthermore, while the host vehicle is traveling, the three-dimensional building may be three-dimensionally rendered on the plane in the OpenGL format for planar display.
Embodiment 3 describes a case in which software buttons for operation input, such as icons, are stereoscopically displayed on another apparent display surface parallel to the map display surface of the planar map.
In the three-dimensional stereoscopic display device according to Embodiment 3, the screen composition processing unit generates a stereoscopic display screen that stereoscopically displays software keys for operation input, such as icons, on another apparent display surface parallel to the apparent map display surface; its basic configuration is the same as in Embodiment 1. Accordingly, in the following description, the three-dimensional stereoscopic display device according to Embodiment 3 is applied to the in-vehicle information system, and FIG. 3 is referred to for its configuration.
FIG. 11 is a diagram for explaining the screen composition process in which the apparent display surface of the icons is placed in front of the map display surface of the planar map, and FIG. 12 is a diagram showing the data flow in the screen composition process of FIG. 11. In the example of FIG. 11, the icons of the determination button and the return button are displayed on the icon stereoscopic display surface Pb in front of the apparent map display surface P of the planar map.
In Embodiments 1 to 3 above, the screen composition process is implemented as software processing by the main CPU 4a; Embodiment 4 describes a configuration in which the screen composition process is performed at high speed by hardware logic.
The stereoscopic image calculation circuit 41 is a circuit that receives the planar map data from the planar image memory plane 42, the stereoscopic image data from the three-dimensional stereoscopic image data plane 43, and the parameters Z0, d, and dz from the internal memory 11 and the like, and uses them to execute the same screen composition process as in Embodiments 1 to 3.
For example, only icons whose associated functions may be operated while traveling are displayed three-dimensionally. That is, when an operation on an icon image causes the function corresponding to that icon image to be executed, the screen composition processing unit 4 generates right-eye and left-eye images for three-dimensional stereoscopic display only of the icon images corresponding to functions permitted to be executed while the host vehicle is traveling, and the stereoscopic display monitor 6 three-dimensionally displays the right-eye and left-eye images of those icon images reproduced by the video reproduction device 5.
In this case, the display is easier to see if the icon height (the degree to which the icon rises in stereoscopic view) is kept constant.
Embodiment 5 includes two rear cameras, left and right, that capture the area behind the host vehicle and obtain a right-eye image and a left-eye image as a back monitor image; when the host vehicle moves backward, the back monitor image is automatically displayed as a three-dimensional stereoscopic image.
FIG. 15 is a block diagram showing the configuration of an in-vehicle information system using the three-dimensional stereoscopic display device according to Embodiment 5 of the present invention. In FIG. 15, the in-vehicle information system 1 includes a left-eye camera 2a, a right-eye camera 2b, and a camera signal input device 2c in addition to the configuration of FIG. 3 described in Embodiment 1, and, with respect to the display of images and video such as maps, functions as the stereoscopic display system 1A shown in FIG. 1. This in-vehicle information system 1 operates in substantially the same manner as the system described in Embodiment 1, but differs in that the back monitor image is automatically displayed as a three-dimensional stereoscopic image when the host vehicle moves backward.
The distance between the host vehicle and an obstacle can be obtained using a back sensor (not shown) that detects obstacles behind the host vehicle, or by image recognition on the video captured by the cameras 2a and 2b.
That is, the screen composition processing unit 4 generates right-eye and left-eye images or video for three-dimensional stereoscopic display in which the position of the apparent display surface on which the image or video to be displayed is three-dimensionally displayed is changed according to the vehicle speed of the host vehicle, and the stereoscopic display monitor 6 three-dimensionally displays those right-eye and left-eye images or video.
For example, in FIG. 5, the distance representing the degree to which the "XX building" rises in stereoscopic view is set to z1 according to its height; while the host vehicle is traveling, this distance z1 may be divided by a predetermined value D1. Also, among the icons stereoscopically displayed on the map, the distance representing the degree to which the icon that appears to rise most toward the viewer (for example, the icon of the tallest feature) rises may be set to a predetermined value D2.
Furthermore, when the traveling speed of the host vehicle exceeds a predetermined threshold, the values D1 and D2 may be changed so that the degree of rise through stereoscopic viewing is reduced.
In addition, the user may be allowed to freely set the distance representing the degree to which icons and landmarks on the map that are three-dimensionally displayed while the host vehicle is traveling appear to rise.
The present invention may also be applied to a display device such as a PND (Portable Navigation Device) that a person carries into and uses in a moving body such as a vehicle, railway, ship, or aircraft.
In this case, whether the moving body is moving is determined using the detection information of an acceleration sensor built into the PND or mobile phone terminal. When the three-dimensional stereoscopic display device of the present invention is applied to a mobile phone terminal, the three-dimensional stereoscopic display is changed depending on whether the user carrying it is walking.
Claims (18)
- 1. A three-dimensional stereoscopic display device mounted on or held in a moving body, comprising: a moving body state detection unit that detects the state of the moving body; a screen composition processing unit that generates right-eye and left-eye images or video for planar display or for three-dimensional stereoscopic display of an image or video to be displayed; a reproduction processing unit that reproduces the images or video generated by the screen composition processing unit; and a stereoscopic display monitor unit that receives the image or video to be displayed reproduced by the reproduction processing unit and performs three-dimensional stereoscopic display, wherein the screen composition processing unit switches, according to the state of the moving body detected by the moving body state detection unit, between outputting the right-eye and left-eye images or video for planar display to the reproduction processing unit and outputting the right-eye and left-eye images or video for three-dimensional stereoscopic display to the reproduction processing unit.
- 2. The three-dimensional stereoscopic display device according to claim 1, wherein the screen composition processing unit generates right-eye and left-eye images or video for planar display of the image or video to be displayed when the moving body state detection unit detects that the moving body is moving, and generates right-eye and left-eye images or video for three-dimensional stereoscopic display of the image or video to be displayed when it is detected that the moving body is stopped.
- 3. The three-dimensional stereoscopic display device according to claim 1, wherein the image or video to be displayed includes at least one of a map image or video, an image describing character information, and an icon image.
- 4. The three-dimensional stereoscopic display device according to claim 3, wherein, when generating right-eye and left-eye images for three-dimensional stereoscopic display of a plurality of icon images, the screen composition processing unit aligns the positions of all the apparent display surfaces on which the plurality of icon images are respectively three-dimensionally displayed.
- 5. The three-dimensional stereoscopic display device according to claim 3, wherein, when an operation on an icon image causes the function corresponding to that icon image to be executed, the screen composition processing unit generates right-eye and left-eye images for three-dimensional stereoscopic display only of the icon images corresponding to functions permitted to be executed while the moving body is moving.
- 6. The three-dimensional stereoscopic display device according to claim 1, wherein the screen composition processing unit generates, as the right-eye and left-eye images or video for planar display of the image or video to be displayed, images or video that are identical for the right eye and the left eye.
- 7. The three-dimensional stereoscopic display device according to claim 1, further comprising a back camera unit mounted on the moving body that captures the area behind the moving body and obtains a three-dimensional stereoscopic image of the captured subject, wherein the screen composition processing unit outputs the three-dimensional stereoscopic image obtained by the back camera unit to the reproduction processing unit when the moving body state detection unit detects that the moving body is moving backward, and the stereoscopic display monitor unit receives the three-dimensional stereoscopic image reproduced by the reproduction processing unit and performs three-dimensional stereoscopic display.
- 8. The three-dimensional stereoscopic display device according to claim 7, wherein the screen composition processing unit superimposes a guideline indicating the direction of movement of the moving body moving backward, as a three-dimensional solid line, on the three-dimensional stereoscopic image obtained by the back camera unit, and the stereoscopic display monitor unit receives the three-dimensional stereoscopic image with the superimposed guideline reproduced by the reproduction processing unit and performs three-dimensional stereoscopic display.
- 9. The three-dimensional stereoscopic display device according to claim 1, wherein the moving body state detection unit detects the state of the surrounding environment of the moving body, and the screen composition processing unit switches, according to the state of the surrounding environment detected by the moving body state detection unit, between outputting the right-eye and left-eye images or video for planar display to the reproduction processing unit and outputting the right-eye and left-eye images or video for three-dimensional stereoscopic display to the reproduction processing unit.
- 10. A three-dimensional stereoscopic display device mounted on or held in a moving body, comprising: a moving body state detection unit that detects the state of the moving body; a screen composition processing unit that generates right-eye and left-eye images or video for planar display or for three-dimensional stereoscopic display of an image or video to be displayed; a reproduction processing unit that reproduces the images or video generated by the screen composition processing unit; and a stereoscopic display monitor unit that receives the image or video to be displayed reproduced by the reproduction processing unit and performs three-dimensional stereoscopic display, wherein the screen composition processing unit generates, according to the state of the moving body detected by the moving body state detection unit, right-eye and left-eye images or video for three-dimensional stereoscopic display in which the position of the apparent display surface on which the image or video to be displayed is three-dimensionally displayed is changed.
- 11. The three-dimensional stereoscopic display device according to claim 10, wherein the screen composition processing unit generates right-eye and left-eye images or video for three-dimensional stereoscopic display in which the position of the apparent display surface on which the image or video to be displayed is three-dimensionally displayed is changed according to the speed of the moving body detected by the moving body state detection unit.
- 12. The three-dimensional stereoscopic display device according to claim 10, wherein the image or video to be displayed includes at least one of a map image or video, an image describing character information, and an icon image.
- 13. The three-dimensional stereoscopic display device according to claim 12, wherein, when generating right-eye and left-eye images for three-dimensional stereoscopic display of a plurality of icon images, the screen composition processing unit aligns the positions of all the apparent display surfaces on which the plurality of icon images are respectively three-dimensionally displayed.
- 14. The three-dimensional stereoscopic display device according to claim 12, wherein, when an operation on an icon image causes the function corresponding to that icon image to be executed, the screen composition processing unit generates right-eye and left-eye images for three-dimensional stereoscopic display only of the icon images corresponding to functions permitted to be executed while the moving body is moving.
- 15. The three-dimensional stereoscopic display device according to claim 10, wherein the screen composition processing unit generates, as the right-eye and left-eye images or video for planar display of the image or video to be displayed, images or video that are identical for the right eye and the left eye.
- 16. The three-dimensional stereoscopic display device according to claim 10, further comprising a back camera unit mounted on the moving body that captures the area behind the moving body and obtains a three-dimensional stereoscopic image of the captured subject, wherein the screen composition processing unit outputs the three-dimensional stereoscopic image obtained by the back camera unit to the reproduction processing unit when the moving body state detection unit detects that the moving body is moving backward, and the stereoscopic display monitor unit receives the three-dimensional stereoscopic image reproduced by the reproduction processing unit and performs three-dimensional stereoscopic display.
- 17. The three-dimensional stereoscopic display device according to claim 16, wherein, when the moving body state detection unit detects that the moving body is moving backward, the screen composition processing unit superimposes a guideline indicating the direction of movement of the moving body moving backward, as a three-dimensional solid line, on the three-dimensional stereoscopic image obtained by the back camera unit, and the stereoscopic display monitor unit receives the three-dimensional stereoscopic image with the superimposed guideline reproduced by the reproduction processing unit and performs three-dimensional stereoscopic display.
- 18. The three-dimensional stereoscopic display device according to claim 10, wherein the moving body state detection unit detects the state of the surrounding environment of the moving body, and the screen composition processing unit generates right-eye and left-eye images or video for three-dimensional stereoscopic display in which the position of the apparent display surface on which the image or video to be displayed is three-dimensionally displayed is changed according to the state of the surrounding environment detected by the moving body state detection unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012539476A JP5726201B2 (ja) | 2010-10-19 | 2010-10-19 | 3次元立体視表示装置、3次元立体視表示制御装置、およびlsi回路 |
PCT/JP2010/006186 WO2012053029A1 (ja) | 2010-10-19 | 2010-10-19 | 3次元立体表示装置 |
US13/702,332 US9179140B2 (en) | 2010-10-19 | 2010-10-19 | 3dimension stereoscopic display device |
DE112010005944T DE112010005944T5 (de) | 2010-10-19 | 2010-10-19 | Stereoskopische Dreidimensionen-Anzeigevorichtung |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/006186 WO2012053029A1 (ja) | 2010-10-19 | 2010-10-19 | 3次元立体表示装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012053029A1 true WO2012053029A1 (ja) | 2012-04-26 |
Family
ID=45974770
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/006186 WO2012053029A1 (ja) | 2010-10-19 | 2010-10-19 | 3次元立体表示装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9179140B2 (ja) |
JP (1) | JP5726201B2 (ja) |
DE (1) | DE112010005944T5 (ja) |
WO (1) | WO2012053029A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10795457B2 (en) | 2006-12-28 | 2020-10-06 | D3D Technologies, Inc. | Interactive 3D cursor |
JPWO2012127824A1 (ja) * | 2011-03-18 | 2014-07-24 | パナソニック株式会社 | 眼鏡、立体視映像処理装置、システム |
KR101985674B1 (ko) * | 2012-09-18 | 2019-06-04 | 삼성전자 주식회사 | 비접촉식 사용자 인터페이스 동작 인식 방법 및 그 장치 |
JP6318470B2 (ja) * | 2013-05-15 | 2018-05-09 | ソニー株式会社 | 表示制御装置、表示制御方法および記録媒体 |
JP6545108B2 (ja) * | 2016-01-14 | 2019-07-17 | アルパイン株式会社 | 駐車支援装置および駐車支援方法 |
WO2017180869A1 (en) * | 2016-04-14 | 2017-10-19 | Gentex Corporation | Vision correcting vehicle display |
KR102581359B1 (ko) * | 2016-09-02 | 2023-09-20 | 엘지전자 주식회사 | 차량용 사용자 인터페이스 장치 및 차량 |
DE112017005090T5 (de) * | 2016-10-06 | 2019-08-01 | Sony Corporation | Wiedergabevorrichtung, wiedergabevorrichtung; wiedergabeverfahren und programm |
TWI782384B (zh) * | 2021-01-06 | 2022-11-01 | 幻景啟動股份有限公司 | 浮空影像系統 |
CN114882813B (zh) * | 2021-01-19 | 2024-05-14 | 幻景启动股份有限公司 | 浮空图像系统 |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6257392A (ja) * | 1985-09-05 | 1987-03-13 | Clarion Co Ltd | 立体バツクアイカメラ |
JPH04255891A (ja) * | 1991-02-08 | 1992-09-10 | Omron Corp | 車両用ディスプレイ装置 |
JPH05147456A (ja) * | 1991-11-27 | 1993-06-15 | Nippon Seiki Co Ltd | 車両用表示装置 |
JPH07105484A (ja) * | 1993-10-04 | 1995-04-21 | Honda Motor Co Ltd | 車両用情報表示装置 |
JPH08124096A (ja) * | 1994-10-25 | 1996-05-17 | Honda Motor Co Ltd | 車両用現在位置表示装置 |
JPH1144545A (ja) * | 1997-07-29 | 1999-02-16 | Hitachi Ltd | 3次元景観図表示ナビゲーション装置 |
JPH11119147A (ja) * | 1997-10-14 | 1999-04-30 | Asahi Optical Co Ltd | ヘッドアップディスプレイ |
WO2004099718A1 (ja) * | 2003-05-07 | 2004-11-18 | Seijiro Tomita | カーナビゲーションシステムにおける画像の表示方法及び装置 |
WO2007069573A1 (ja) * | 2005-12-16 | 2007-06-21 | Matsushita Electric Industrial Co., Ltd. | 移動体用入力装置、及び方法 |
WO2007129382A1 (ja) * | 2006-04-28 | 2007-11-15 | Panasonic Corporation | ナビゲーション装置およびその方法 |
JP2008230560A (ja) * | 2007-03-23 | 2008-10-02 | Pioneer Electronic Corp | 駐車支援装置、駐車支援制御方法、及び、駐車支援プログラム |
JP2008538037A (ja) * | 2005-04-14 | 2008-10-02 | フオルクスヴアーゲン アクチエンゲゼルシヤフト | 交通手段における情報表示方法及び自動車用コンビネーションインストルメント |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5883739A (en) | 1993-10-04 | 1999-03-16 | Honda Giken Kogyo Kabushiki Kaisha | Information display device for vehicle |
US5826212A (en) | 1994-10-25 | 1998-10-20 | Honda Giken Kogyo Kabushiki Kaisha | Current-position map and three dimensional guiding objects displaying device for vehicle |
JP2003280812A (ja) | 2002-03-20 | 2003-10-02 | Hitachi Ltd | タッチパネル付きディスプレイ装置及び表示方法 |
JP2004280496A (ja) | 2003-03-17 | 2004-10-07 | Kyocera Mita Corp | 操作パネル装置 |
JP2005175566A (ja) | 2003-12-08 | 2005-06-30 | Shinichi Hirabayashi | 立体表示システム |
KR101695809B1 (ko) * | 2009-10-09 | 2017-01-13 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
2010
- 2010-10-19 DE DE112010005944T patent/DE112010005944T5/de not_active Withdrawn
- 2010-10-19 US US13/702,332 patent/US9179140B2/en active Active
- 2010-10-19 WO PCT/JP2010/006186 patent/WO2012053029A1/ja active Application Filing
- 2010-10-19 JP JP2012539476A patent/JP5726201B2/ja active Active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012252153A (ja) * | 2011-06-02 | 2012-12-20 | Alpine Electronics Inc | 画像表示装置 |
US9437047B2 (en) | 2014-01-15 | 2016-09-06 | Htc Corporation | Method, electronic apparatus, and computer-readable medium for retrieving map |
WO2017042923A1 (ja) * | 2015-09-10 | 2017-03-16 | 三菱電機株式会社 | 表示制御装置、表示装置及び表示制御方法 |
JPWO2017042923A1 (ja) * | 2015-09-10 | 2017-11-24 | 三菱電機株式会社 | 表示制御装置、表示装置及び表示制御方法 |
JP2019057869A (ja) * | 2017-09-22 | 2019-04-11 | トヨタ自動車株式会社 | 車両内コミュニケーション装置 |
Also Published As
Publication number | Publication date |
---|---|
US20130076876A1 (en) | 2013-03-28 |
US9179140B2 (en) | 2015-11-03 |
DE112010005944T5 (de) | 2013-08-14 |
JPWO2012053029A1 (ja) | 2014-02-24 |
JP5726201B2 (ja) | 2015-05-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5726201B2 (ja) | 3次元立体視表示装置、3次元立体視表示制御装置、およびlsi回路 | |
JP5709886B2 (ja) | 3次元立体表示装置および3次元立体表示信号生成装置 | |
JP5781080B2 (ja) | 3次元立体表示装置および3次元立体表示処理装置 | |
KR102566377B1 (ko) | 운전자 보조 시스템 및 방법 | |
CN109644256B (zh) | 车载视频系统 | |
US7554461B2 (en) | Recording medium, parking support apparatus and parking support screen | |
CN100403340C (zh) | 图象产生装置、图象产生方法和电子设备 | |
JP5412979B2 (ja) | 周辺表示装置 | |
JPH11108684A (ja) | カーナビゲーションシステム | |
JP6121131B2 (ja) | 多重表示装置 | |
JP5914114B2 (ja) | 駐車支援装置、及び駐車支援方法 | |
JP2010128951A (ja) | 画像処理装置、画像処理方法及びコンピュータプログラム | |
JP2010175329A (ja) | 車載情報装置 | |
JP6611918B2 (ja) | 駐車支援用表示制御装置および駐車支援用表示制御方法 | |
JP5465334B2 (ja) | 3次元立体表示装置 | |
WO2016072019A1 (ja) | 表示制御装置 | |
JP5618139B2 (ja) | 駐車支援装置 | |
JP5677168B2 (ja) | 画像表示システム、画像生成装置及び画像生成方法 | |
JP5955373B2 (ja) | 3次元立体表示装置および3次元立体表示信号生成装置 | |
JP2007323001A (ja) | 画像表示装置 | |
JP2004328216A (ja) | 車載立体ディスプレイシステム | |
JP6432312B2 (ja) | ナビゲーションシステム、ナビゲーション方法、及びナビゲーションプログラム | |
JP2022164079A (ja) | 運転支援装置、運転支援方法及びプログラム | |
TWM521172U (zh) | 行車導航裝置 | |
JP2012191479A (ja) | 情報処理システム、サーバ装置、および、車載装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10858588 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012539476 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13702332 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112010005944 Country of ref document: DE Ref document number: 1120100059446 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10858588 Country of ref document: EP Kind code of ref document: A1 |