WO2019049237A1 - Information display control device, information display device, and information display control method - Google Patents

Information display control device, information display device, and information display control method

Info

Publication number
WO2019049237A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
distance
depth
unit
display
Prior art date
Application number
PCT/JP2017/032106
Other languages
French (fr)
Japanese (ja)
Inventor
Yuki Sumiyoshi (悠希 住吉)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to PCT/JP2017/032106 priority Critical patent/WO2019049237A1/en
Publication of WO2019049237A1 publication Critical patent/WO2019049237A1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements

Definitions

  • the present invention relates to an information display control device for controlling a head-up display (hereinafter referred to as "HUD") device, an information display device, and an information display control method.
  • HUD: head-up display
  • AR: augmented reality
  • 3D: three-dimensional
  • the information display control device described in Patent Document 1 controls the parallax of a 3D image in accordance with at least one of the state of the vehicle, information on the road on which the vehicle travels, and the like.
  • this information display control device increases the parallax when the vehicle speed is high or the distance from the vehicle to the image is long, and reduces the parallax when the vehicle speed is low or the distance from the vehicle to the image is short.
  • according to Non-Patent Document 1, even when the distance from the driver to the image is as short as about 2 m, motion parallax becomes effective if the vehicle speed is 20 km/h or more, and the driver perceives the image as farther away than the actual 2 m image distance.
  • Non-Patent Document 1 does not mention how to display an image in situations where the vehicle speed is below 20 km/h and motion parallax is therefore ineffective.
  • the present invention has been made to solve the above problems, and its object is to minimize the display time of 3D images in order to prevent the occurrence of motion sickness and the like when a vehicle occupant views the display of the head-up display device.
  • the information display control device is an information display control device that controls a head-up display device that causes a display image to be viewed stereoscopically or planarly, and includes: a speed acquisition unit that acquires information indicating the speed of a vehicle; a distance acquisition unit that acquires information indicating the distance from the vehicle to an object present in the traveling direction of the vehicle; a depth of field estimation unit that estimates the depth of field of the object as viewed from an occupant of the vehicle based on the distance from the vehicle to the object; an image generation unit that generates a display image to be displayed by the head-up display device in a state of being attached to the object; and a control unit that controls the parallax amount of the display image generated by the image generation unit to a value greater than zero when the speed of the vehicle is equal to or lower than a predetermined speed and the maximum parallax amount distance, which is the distance viewed when a display image having the maximum parallax amount is displayed on the head-up display device, is within the depth of field.
  • according to this invention, when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance of the head-up display device is within the depth of field, the parallax amount of the display image is controlled to a value greater than zero, so the display time of 3D images can be minimized.
  • FIG. 1 is a block diagram showing a configuration example of the information display device according to Embodiment 1.
  • FIG. 2 is a diagram showing a configuration example of the HUD device controlled by the information display control device according to Embodiment 1.
  • FIG. 3 is a diagram showing an example of the depth of field table in Embodiment 1.
  • FIG. 4A and FIG. 4B are diagrams showing examples of display images in Embodiment 1.
  • FIG. 5A and FIG. 5B are diagrams for explaining an outline of the parallax amount control method by the control unit of Embodiment 1.
  • FIG. 6 is a flowchart showing an operation example of the information display control device according to Embodiment 1.
  • FIG. 7 is a flowchart showing an operation example of the information display control device according to Embodiment 2.
  • FIG. 8 is a block diagram showing a configuration example of the information display device according to Embodiment 3.
  • FIG. 9 is a diagram showing an example of the depth of field table in Embodiment 3.
  • FIG. 10 is a flowchart showing an operation example of the information display control device according to Embodiment 3.
  • FIG. 11 is a block diagram showing a configuration example of the information display device according to Embodiment 4.
  • FIG. 12 is a flowchart showing an operation example of the information display control device according to Embodiment 4.
  • FIG. 13A and FIG. 13B are diagrams showing an example of the hardware configuration of the information display device according to each embodiment.
  • FIG. 1 is a block diagram showing an example of the configuration of the information display apparatus according to the first embodiment.
  • the information display device according to the first embodiment controls the display of the HUD device 100 mounted on a vehicle.
  • the information display control device 10 includes a speed acquisition unit 11, a distance acquisition unit 12, a depth of field estimation unit 13, a control unit 15, and an image generation unit 16. Further, in the example of FIG. 1, the information display control device 10 is provided with a depth of field table 14.
  • FIG. 2 is a diagram showing a configuration example of the HUD device 100 controlled by the information display control device 10 according to the first embodiment.
  • the HUD device 100 shown in FIG. 2 includes a display unit 101, a drive unit 102, a reflecting mirror 103, and a lens 104.
  • the HUD device 100 is disposed, for example, inside a dashboard of a vehicle.
  • the display unit 101 includes a liquid crystal display or the like for displaying the display image generated by the image generation unit 16, and a parallax barrier, a lenticular lens, or the like disposed in front of it.
  • the drive unit 102 is a drive mechanism for changing the posture of the reflecting mirror 103.
  • the drive unit 102 will be described in detail in the fourth embodiment.
  • the reflecting mirror 103 reflects the display image displayed on the display unit 101.
  • the lens 104 enlarges or reduces the display image reflected by the reflecting mirror 103.
  • the optical members such as the reflecting mirror 103 and the lens 104 project the display image displayed on the display unit 101 onto the windshield 105 or a combiner (not shown).
  • the HUD device 100 is a 3D-HUD, and realizes an AR (Augmented Reality) display by displaying a 2D image or a 3D image superimposed on an actual landscape.
  • the display image projected on the windshield 105 or the like is visually recognized by the occupant of the vehicle in a state of being superimposed on the landscape ahead of the vehicle through the windshield 105.
  • in the following, the driver 120 is used as an example of the occupant.
  • when the parallax amount between the right-eye image and the left-eye image constituting the display image is zero, the display image is visually recognized as a 2D image at the position of the virtual image distance 121. That is, the driver 120 views the display image planarly.
  • when the parallax amount between the right-eye image and the left-eye image is greater than zero, the display image is visually recognized as a 3D image at a position farther from the virtual image distance 121 as the parallax amount increases. That is, the driver 120 views the display image stereoscopically.
  • here, the virtual image distance 121 at which the HUD device 100 displays the display image, that is, the distance at which the driver 120 perceives a 2D image with zero parallax amount, is assumed to be 3 m, and the distance at which the display image with the maximum parallax amount is visually recognized as a 3D image (hereinafter referred to as the "maximum parallax amount distance 122") is assumed to be 10 m.
  • the virtual image distance 121 and the maximum parallax amount distance 122 are distances that depend on the HUD device 100, and are not limited to 3 m and 10 m.
  • the virtual image distance 121 and the maximum parallax distance 122 are assumed to be given to the control unit 15 in advance.
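  • The publication does not spell out the relation between the on-screen parallax amount and the perceived distance; the following is a minimal sketch based on standard stereoscopic geometry, assuming an interpupillary distance of 0.065 m and the example values of 3 m and 10 m above. All names and constants are illustrative.

```python
# Standard stereoscopic geometry (not from the publication): an image
# rendered at virtual image distance D is perceived at distance d >= D
# when the left-eye and right-eye images are separated horizontally by
# s = e * (d - D) / d, where e is the viewer's interpupillary distance.

EYE_SEPARATION_M = 0.065          # assumed interpupillary distance
VIRTUAL_IMAGE_DISTANCE_M = 3.0    # virtual image distance 121 (example value)
MAX_PARALLAX_DISTANCE_M = 10.0    # maximum parallax amount distance 122

def required_separation(perceived_distance_m: float) -> float:
    """Left/right image separation (m) that makes the display image appear
    at perceived_distance_m; zero yields a 2D image at the virtual image."""
    d = max(perceived_distance_m, VIRTUAL_IMAGE_DISTANCE_M)
    return EYE_SEPARATION_M * (d - VIRTUAL_IMAGE_DISTANCE_M) / d

print(required_separation(3.0))    # 0.0 -> 2D image (zero parallax amount)
print(required_separation(10.0))   # ~0.0455 -> maximum parallax in this model
```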
  • the speed acquisition unit 11 illustrated in FIG. 1 acquires information indicating the speed of the vehicle from a CAN (Controller Area Network) 111 or the like, and outputs the information to the control unit 15.
  • the distance acquisition unit 12 acquires information indicating the distance between the vehicle and an object present in the traveling direction of the vehicle from at least one of the car navigation device (hereinafter referred to as the "car navigation system") 112 or the external sensor 113, and outputs it to the control unit 15.
  • the car navigation system 112 uses map information to detect, as an object, the nearest intersection or the like on the planned travel route of the vehicle, calculates the distance from the vehicle to the object, and outputs it to the distance acquisition unit 12.
  • the external sensor 113 uses a camera, a distance sensor, or the like to detect another vehicle, a pedestrian, or the like present in the traveling direction of the vehicle as an object, calculates the distance from the vehicle to the object, and outputs it to the distance acquisition unit 12.
  • the means for calculating the distance from the vehicle to the object is not limited to the car navigation system 112 and the external sensor 113.
  • the depth of field estimation unit 13 receives the information indicating the distance from the vehicle to the object acquired by the distance acquisition unit 12 via the control unit 15.
  • the depth of field estimation unit 13 estimates the depth of field of the object viewed from the driver based on the distance from the vehicle to the object, and outputs the estimated depth to the control unit 15.
  • the depth of field is the range of distances within which the driver's eyes are assumed to be in focus while the driver is looking at an object. In the example of FIG. 1, the depth of field estimation unit 13 estimates the depth of field with reference to the depth of field table 14 included in the information display control device 10.
  • FIG. 3 is a diagram showing an example of the depth of field table 14 according to the first embodiment.
  • the depth of field table 14 stores the distance from the vehicle to the object and the depth of field in association with each other. For example, when the distance from the vehicle to the object is 100 m, the depth of field estimation unit 13 refers to the depth of field table 14 and estimates the depth of field to be 10 to 100 m. That is, when looking at an object 100 m ahead, the driver can view anything that lies at a distance of 10 to 100 m ahead together with the object without discomfort.
  • the depth of field estimation method of the depth of field estimation unit 13 is not limited to the above method.
  • the depth of field estimation unit 13 may estimate the depth of field using a mathematical expression that defines the relationship between the distance from the vehicle to the object and the depth of field.
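  • As a concrete illustration of the table lookup performed by the depth of field estimation unit 13, here is a minimal sketch. Only the 100 m row (depth of field 10 to 100 m) appears in the text; the other rows and the nearest-row lookup policy are assumptions.

```python
# Sketch of the depth of field table 14 (Embodiment 1).  Only the row
# 100 m -> (10 m, 100 m) appears in the text; the other rows are
# illustrative placeholders.
DEPTH_OF_FIELD_TABLE = {
    # distance to object (m): (near limit (m), far limit (m))
    100.0: (10.0, 100.0),  # from the text
    50.0: (8.0, 50.0),     # placeholder
    20.0: (5.0, 20.0),     # placeholder
}

def estimate_depth_of_field(distance_to_object_m: float) -> tuple[float, float]:
    """Mimic the depth of field estimation unit 13: return the depth of
    field of the table row nearest to the given distance."""
    nearest = min(DEPTH_OF_FIELD_TABLE,
                  key=lambda row: abs(row - distance_to_object_m))
    return DEPTH_OF_FIELD_TABLE[nearest]

assert estimate_depth_of_field(100.0) == (10.0, 100.0)
```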
  • the control unit 15 receives the information indicating the speed of the vehicle from the speed acquisition unit 11, the information indicating the distance from the vehicle to the object from the distance acquisition unit 12, and the information indicating the depth of field from the depth of field estimation unit 13.
  • the control unit 15 controls the parallax amount of the display image displayed on the display unit 101 of the HUD device 100 based on at least one of the speed of the vehicle, the distance from the vehicle to the object, or the depth of field.
  • the control unit 15 outputs information on the amount of parallax to the image generation unit 16. Details of the control unit 15 will be described later.
  • the image generation unit 16 generates a display image to be displayed in a state of being added to the object, and outputs the display image to the display unit 101 of the HUD device 100 for display. At this time, the image generation unit 16 combines the right-eye image and the left-eye image based on the amount of parallax received from the control unit 15 to generate a display image.
  • FIG. 4A and FIG. 4B are diagrams showing examples of the display images 123 and 124 in the first embodiment.
  • when the object is an intersection as shown in FIG. 4A, the image generation unit 16 generates an arrow or the like indicating the traveling direction at the intersection as the display image 123.
  • the display image 123 displayed by the HUD device 100 is visually recognized in a state of being superimposed on the intersection through the windshield 105.
  • when the object is another vehicle, a pedestrian, or the like as shown in FIG. 4B, the image generation unit 16 generates, as the display image 124, a numerical value or the like indicating the distance from the vehicle to the other vehicle or the pedestrian.
  • the display image 124 displayed by the HUD device 100 is visually recognized in a state of being attached to the other vehicle, the pedestrian, or the like through the windshield 105.
  • the display images 123 and 124 are not limited to an arrow indicating the traveling direction, a numerical value indicating the distance to the object, and the like. As described later, the display images 123 and 124 may be 2D images or 3D images.
  • FIG. 5A and FIG. 5B are diagrams for explaining an outline of a control method of the parallax amount by the control unit 15 of the first embodiment.
  • the predetermined speed is a speed at which motion parallax becomes effective, for example 20 km/h, and is assumed to be given to the control unit 15 in advance.
  • (1) when the speed of the vehicle is greater than the predetermined speed, the control unit 15 sets the parallax amount of the display image to zero.
  • in this case, the HUD device 100 displays a 2D image without parallax at the virtual image distance 121. Even when the HUD device 100 displays a 2D image at a virtual image distance 121 close to the driver 120, motion parallax causes the driver 120 to perceive the 2D image as farther away than the virtual image distance 121. Therefore, even if the object is farther than the virtual image distance 121, the driver 120 does not feel a difference in distance between the 2D image and the object, and can easily view the 2D image.
  • (2A) when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122 is outside the depth of field as shown in FIG. 5A, the control unit 15 sets the parallax amount of the display image to zero.
  • in this case, the HUD device 100 causes the driver 120 to visually recognize a 2D image by displaying a display image without parallax at the virtual image distance 121.
  • since the driver cannot focus on both the object and a 3D image at the same time in this situation, the control unit 15 causes the HUD device 100 to display a 2D image, thereby preventing the occurrence of motion sickness and the like.
  • the information display control device 10 may supplement the content of the 2D image by voice guidance or the like.
  • (2B) when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122 is within the depth of field as shown in FIG. 5B, the control unit 15 sets the parallax amount of the display image to a value greater than zero.
  • in this case, the HUD device 100 causes the driver 120 to visually recognize a 3D image at a position farther than the virtual image distance 121 (for example, at the maximum parallax amount distance 122) by displaying a display image with a parallax amount greater than zero at the virtual image distance 121.
  • by displaying a 3D image only in case (2B), the information display control device 10 keeps the display time of 3D images to a minimum and prevents the occurrence of motion sickness and the like of the driver 120, while still presenting information in an easy-to-understand manner while the vehicle is traveling.
  • FIG. 6 is a flowchart showing an operation example of the information display control device 10 according to the first embodiment.
  • the information display control device 10 repeatedly performs the operation shown in the flowchart of FIG. 6 until the vehicle arrives at the destination.
  • in step ST101, the speed acquisition unit 11 acquires information indicating the speed of the vehicle from the CAN 111 and outputs it to the control unit 15.
  • in step ST102, the control unit 15 determines whether the speed of the vehicle is equal to or lower than a predetermined speed (for example, 20 km/h).
  • in step ST103, the distance acquisition unit 12 acquires information indicating the distance from the vehicle to the object from the car navigation system 112 or the external sensor 113, and outputs it to the depth of field estimation unit 13 and the control unit 15.
  • in step ST104, the control unit 15 determines whether the distance from the vehicle to the object is equal to or greater than the maximum parallax amount distance 122. If it is (step ST104 "YES"), the control unit 15 proceeds to step ST105; if the distance from the vehicle to the object is smaller than the maximum parallax amount distance 122 (step ST104 "NO"), it proceeds to step ST109.
  • in step ST105, the depth of field estimation unit 13 estimates the depth of field based on the distance from the vehicle to the object, and outputs it to the control unit 15.
  • in step ST106, the control unit 15 determines whether the maximum parallax amount distance 122 is within the depth of field. When the maximum parallax amount distance 122 is within the depth of field as shown in FIG. 5B (step ST106 "YES"), the control unit 15 proceeds to step ST107; when it is outside the depth of field as shown in FIG. 5A (step ST106 "NO"), it proceeds to step ST108.
  • in step ST107, the control unit 15 sets the parallax amount to the maximum parallax amount.
  • the maximum parallax amount is the value corresponding to the maximum parallax amount distance 122, and is given to the control unit 15 in advance.
  • the control unit 15 outputs the maximum parallax amount to the image generation unit 16.
  • in step ST108, the control unit 15 sets the parallax amount to zero and outputs the zero parallax amount to the image generation unit 16.
  • in step ST109, the control unit 15 sets the parallax amount to a value between zero and the maximum parallax amount according to the distance from the vehicle to the object. That is, as the vehicle approaches the object, the control unit 15 reduces the parallax amount so that the distance at which the display image generated by the image generation unit 16 is viewed becomes shorter. For example, in the case of the HUD device 100 whose virtual image distance 121 is 3 m and whose maximum parallax amount distance 122 is 10 m, if the distance from the vehicle to the object is 5 m, the control unit 15 sets the parallax amount to the value at which the display image is viewed at 5 m. If the distance from the vehicle to the object is equal to or less than the virtual image distance 121, for example 2 m, the control unit 15 sets the parallax amount to zero.
  • in step ST110, the image generation unit 16 generates a display image based on the parallax amount designated by the control unit 15.
  • in step ST111, the image generation unit 16 outputs the generated display image to the display unit 101 of the HUD device 100. The display unit 101 displays the display image received from the image generation unit 16.
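  • The branching of the flowchart of FIG. 6 can be summarized in code. The following is a minimal sketch under the values stated above (predetermined speed 20 km/h, virtual image distance 3 m, maximum parallax amount distance 10 m); the linear mapping in the final branch is one plausible reading of step ST109, and all names are illustrative.

```python
PREDETERMINED_SPEED_KMH = 20.0    # speed at which motion parallax is effective
VIRTUAL_IMAGE_DISTANCE_M = 3.0    # virtual image distance 121
MAX_PARALLAX_DISTANCE_M = 10.0    # maximum parallax amount distance 122

def control_parallax(speed_kmh: float,
                     distance_to_object_m: float,
                     depth_of_field: tuple[float, float],
                     max_parallax: float) -> float:
    """Parallax amount decided by the control unit 15 (steps ST102 to ST109)."""
    near, far = depth_of_field
    if speed_kmh > PREDETERMINED_SPEED_KMH:
        return 0.0                  # case (1): motion parallax works, show a 2D image
    if distance_to_object_m >= MAX_PARALLAX_DISTANCE_M:
        if near <= MAX_PARALLAX_DISTANCE_M <= far:
            return max_parallax     # case (2B): 3D image at the maximum parallax distance
        return 0.0                  # case (2A): outside the depth of field, show a 2D image
    # Step ST109: the object is closer than the maximum parallax amount
    # distance, so shrink the viewed distance toward the virtual image.
    if distance_to_object_m <= VIRTUAL_IMAGE_DISTANCE_M:
        return 0.0                  # e.g. object at 2 m -> zero parallax amount
    ratio = ((distance_to_object_m - VIRTUAL_IMAGE_DISTANCE_M)
             / (MAX_PARALLAX_DISTANCE_M - VIRTUAL_IMAGE_DISTANCE_M))
    return max_parallax * ratio     # one plausible linear interpolation
```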
  • the information display control device 10 includes the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the image generation unit 16, and the control unit 15.
  • the speed acquisition unit 11 acquires information indicating the speed of the vehicle.
  • the distance acquisition unit 12 acquires information indicating the distance from a vehicle to an object present in the traveling direction of the vehicle.
  • the depth of field estimation unit 13 estimates the depth of field of the object viewed from the occupant of the vehicle based on the distance from the vehicle to the object.
  • the image generation unit 16 generates a display image displayed in a state of being added to the object by the HUD device 100.
  • when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122 (the distance viewed when the display image having the maximum parallax amount is displayed on the HUD device 100) is within the depth of field, the control unit 15 controls the parallax amount of the display image generated by the image generation unit 16 to a value greater than zero.
  • since a 3D image is displayed only when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122 is within the depth of field (case (2B)), the display time of 3D images can be kept to a minimum. As a result, the occurrence of motion sickness and the like of the occupant can be prevented.
  • further, as the vehicle approaches the object, the control unit 15 of the first embodiment reduces the parallax amount so that the distance viewed when the display image is displayed on the HUD device 100 becomes shorter. This makes it possible to present information in an easy-to-understand manner.
  • further, the control unit 15 of the first embodiment sets the parallax amount of the display image generated by the image generation unit 16 to zero when the speed of the vehicle is greater than the predetermined speed, or when the speed of the vehicle is equal to or lower than the predetermined speed but the maximum parallax amount distance 122 is outside the depth of field. When the speed of the vehicle is greater than the predetermined speed (case (1)), motion parallax causes the driver to perceive the 2D image as farther away than the virtual image distance 121, so information can still be presented in an easy-to-understand manner.
  • when the maximum parallax amount distance 122 is outside the depth of field (case (2A)), the driver cannot focus on both the object and a 3D image at the same time and motion sickness or the like is likely to occur, so a 2D image is displayed instead. This prevents the occurrence of motion sickness and the like.
  • the configuration of the information display control device 10 according to the second embodiment is the same as the configuration shown in FIG. 1 for the first embodiment, so FIG. 1 is referred to in the following.
  • the control unit 15 of the second embodiment adjusts at least one of the display position or the size of the display image generated by the image generation unit 16 based on the distance from the vehicle to the object.
  • FIG. 7 is a flowchart showing an operation example of the information display control device 10 according to the second embodiment.
  • the operations of steps ST101 to ST109 and ST111 shown in the flowchart of FIG. 7 are the same as the operations of steps ST101 to ST109 and ST111 shown in the flowchart of FIG.
  • in step ST201, the control unit 15 determines the display position of the display image generated by the image generation unit 16 using the information indicating the distance from the vehicle to the object acquired from the car navigation system 112 or the external sensor 113. Specifically, the control unit 15 raises the display position of the display image displayed at the virtual image distance 121 as the distance from the vehicle to the object becomes longer, and lowers it as the distance becomes shorter. The control unit 15 outputs information on the determined display position to the image generation unit 16.
  • in step ST202, the control unit 15 determines the size of the display image generated by the image generation unit 16 using the information indicating the distance from the vehicle to the object acquired from the car navigation system 112 or the external sensor 113. Specifically, the control unit 15 reduces the display image displayed at the virtual image distance 121 as the distance from the vehicle to the object becomes longer, and enlarges it as the distance becomes shorter. The control unit 15 outputs information on the determined size of the display image to the image generation unit 16.
  • in step ST203, the image generation unit 16 generates a display image based on the parallax amount, display position, and size designated by the control unit 15.
  • as described above, the control unit 15 of the second embodiment lowers the display position of the display image generated by the image generation unit 16 as the vehicle approaches the object.
  • the position of the display image visually recognized by the driver can be adjusted to the position of the object, and information can be easily presented.
  • further, the control unit 15 of the second embodiment enlarges the display image generated by the image generation unit 16 as the vehicle approaches the object.
  • the position of the display image visually recognized by the driver can be adjusted to the position of the object, and information can be easily presented.
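  • Steps ST201 and ST202 adjust the display position and size monotonically with the distance from the vehicle to the object, but the exact mappings are not specified in the publication. The following minimal sketch uses a simple linear position mapping and a perspective-style size factor as one plausible interpretation; all constants are illustrative.

```python
# Sketch of steps ST201/ST202 (Embodiment 2): the farther the object,
# the higher and the smaller the display image at the virtual image
# distance; the closer the object, the lower and the larger.  The
# mappings and constants are illustrative, not from the publication.

REFERENCE_DISTANCE_M = 100.0   # distance at which the scale is 1.0 (assumed)

def display_position_y(distance_to_object_m: float) -> float:
    """Normalized vertical position in [0, 1]; larger means higher."""
    return min(distance_to_object_m / REFERENCE_DISTANCE_M, 1.0)

def display_scale(distance_to_object_m: float) -> float:
    """Perspective-style size factor that grows as the object approaches."""
    return REFERENCE_DISTANCE_M / max(distance_to_object_m, 1.0)
```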
  • FIG. 8 is a block diagram showing a configuration example of the information display device according to the third embodiment.
  • the information display control device 10 according to the third embodiment has a configuration in which a brightness acquisition unit 17 is added to the information display control device 10 of the first embodiment shown in FIG. 1. Further, the information display control device 10 is configured to use a luminance meter 114 mounted on the vehicle.
  • the brightness acquiring unit 17 acquires information indicating the brightness around the vehicle from the luminance meter 114 or the like, and outputs the information to the depth of field estimating unit 13.
  • the luminance meter 114 detects the luminance around the vehicle, and outputs the detected luminance to the brightness acquiring unit 17 as information indicating the brightness around the vehicle.
  • the brightness around the vehicle is not limited to the luminance, and may be illuminance or the like.
  • the depth of field estimation unit 13 receives, from the brightness acquisition unit 17, information indicating the brightness around the vehicle.
  • the depth of field estimation unit 13 estimates the depth of field of the object viewed from the occupant of the vehicle based on the distance from the vehicle to the object and the brightness around the vehicle, and outputs the depth of field to the control unit 15.
  • the depth of field estimation unit 13 estimates the depth of field with reference to the depth of field table 14a included in the information display control device 10.
  • FIG. 9 shows an example of the depth of field table 14a according to the third embodiment.
  • the depth of field table 14a stores the distance from the vehicle to the object in association with a depth of field for when the surroundings of the vehicle are bright and a depth of field for when they are dark.
  • the depth of field estimation unit 13 determines “bright” when the brightness around the vehicle is equal to or greater than a predetermined threshold, and determines “dark” when the brightness is less than the threshold.
  • for example, when the distance from the vehicle to the object is 100 m, the depth of field estimation unit 13 refers to the depth of field table 14a and estimates a depth of field such as 8 to 100 m according to the brightness determination.
  • the depth of field estimation method of the depth of field estimation unit 13 is not limited to the above method. For example, the depth of field estimation unit 13 may estimate the depth of field using a mathematical expression that defines the relationship among the distance from the vehicle to the object, the brightness around the vehicle, and the depth of field.
  • FIG. 10 is a flowchart showing an operation example of the information display control device 10 according to the third embodiment.
  • the operations of steps ST101 to ST104 and ST106 to ST111 shown in the flowchart of FIG. 10 are the same as the operations of steps ST101 to ST104 and ST106 to ST111 shown in the flowchart of FIG.
  • in step ST301, the brightness acquisition unit 17 acquires information indicating the brightness around the vehicle from the luminance meter 114 and outputs it to the depth of field estimation unit 13.
  • in step ST302, the depth of field estimation unit 13 estimates the depth of field based on the distance from the vehicle to the object and the brightness around the vehicle, and outputs the depth of field to the control unit 15.
  • the information display control device 10 includes the brightness acquisition unit 17 that acquires information indicating the brightness around the vehicle.
  • the depth of field estimation unit 13 estimates the depth of field of the object viewed from the occupant of the vehicle based on the distance from the vehicle to the object and the brightness around the vehicle. The estimation accuracy is improved by the depth of field estimation unit 13 estimating the depth of field in consideration of the brightness around the vehicle.
  • the control unit 15 can control the amount of parallax with high accuracy by using the highly accurate depth of field.
  • although the information display control device 10 according to the third embodiment has a configuration in which the brightness acquisition unit 17 is added to the information display control device 10 of the first embodiment, the brightness acquisition unit 17 may instead be added to the information display control device 10 of the second embodiment.
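  • The brightness-dependent lookup of the depth of field table 14a can be sketched as an extension of the Embodiment 1 table. The 8 to 100 m value comes from the text, but its assignment to the dark condition, the brightness threshold, and the remaining row values are assumptions.

```python
# Sketch of the depth of field table 14a (Embodiment 3), keyed by both
# the distance to the object and a bright/dark determination.
BRIGHTNESS_THRESHOLD = 50.0   # cd/m^2; illustrative threshold value

DEPTH_OF_FIELD_TABLE_A = {
    # distance (m): {condition: (near limit (m), far limit (m))}
    100.0: {"bright": (10.0, 100.0),   # placeholder value
            "dark": (8.0, 100.0)},     # 8-100 m is from the text; its
                                       # bright/dark assignment is assumed
}

def estimate_depth_of_field_a(distance_to_object_m: float,
                              luminance: float) -> tuple[float, float]:
    """Depth of field estimation unit 13 with the brightness input."""
    condition = "bright" if luminance >= BRIGHTNESS_THRESHOLD else "dark"
    nearest = min(DEPTH_OF_FIELD_TABLE_A,
                  key=lambda row: abs(row - distance_to_object_m))
    return DEPTH_OF_FIELD_TABLE_A[nearest][condition]
```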
  • FIG. 11 is a block diagram showing a configuration example of an information display device according to the fourth embodiment.
  • the information display control device 10 according to the fourth embodiment has a configuration in which an eye position acquisition unit 18 and a stereoscopic viewing area adjustment unit 19 are added to the information display control device 10 of the first embodiment shown in FIG. 1. Further, the information display control device 10 is configured to use an eye position detection device 115 mounted on the vehicle.
  • in FIG. 11, the parts that are the same as or correspond to those in FIG. 1 are given the same reference numerals, and descriptions thereof are omitted. FIG. 2 is also referred to in the fourth embodiment.
  • the eye position acquisition unit 18 acquires information indicating the positions of both eyes of an occupant such as a driver from the eye position detection device 115 or the like, and outputs the information to the stereoscopic viewing area adjustment unit 19.
  • the eye position detection device 115 is a DMS (Driver Monitoring System) or the like; it detects the positions of both eyes of the driver 120 in FIG. 2 and outputs them to the eye position acquisition unit 18.
  • the stereoscopic viewing range adjustment unit 19 determines the adjustment amount of at least one of the vertical direction and the lateral direction of the stereoscopic viewing range of the HUD device 100 based on the positions of both eyes of the occupant.
  • the stereoscopic viewing area adjustment unit 19 outputs information indicating the determined adjustment amount of the stereoscopic viewing area to the drive unit 102 of the HUD device 100.
  • the stereoscopic viewing area is an area where the driver can visually recognize a display image having a parallax amount larger than zero and displayed by the HUD device 100 at the position of the virtual image distance 121 as a 3D image.
  • the drive unit 102 receives information indicating the adjustment amount of the stereoscopic viewing area from the stereoscopic viewing area adjustment unit 19.
  • the drive unit 102 moves the position of the stereoscopic viewing area in at least one of the up and down direction or the left and right direction by changing the posture of the reflecting mirror 103 according to the designated adjustment amount.
  • in the example of FIG. 2, the drive unit 102 adjusts the stereoscopic viewing area by changing the attitude of the reflecting mirror 103, but the adjustment method is not limited to this; the stereoscopic viewing area may be adjusted by a method suited to the configuration of the HUD device 100.
  • FIG. 12 is a flowchart showing an operation example of the information display control device 10 according to the fourth embodiment.
  • the operations of steps ST101 to ST111 shown in the flowchart of FIG. 12 are the same as the operations of steps ST101 to ST111 shown in the flowchart of FIG.
  • in step ST401, the eye position acquisition unit 18 acquires information indicating the positions of the driver's eyes from the eye position detection device 115 and outputs it to the stereoscopic viewing area adjustment unit 19.
  • in step ST402, the stereoscopic viewing area adjustment unit 19 determines an adjustment amount for adjusting the stereoscopic viewing area of the HUD device 100 based on the positions of the driver's eyes.
  • in step ST403, the stereoscopic viewing area adjustment unit 19 outputs the determined adjustment amount to the drive unit 102 of the HUD device 100. The drive unit 102 adjusts the stereoscopic viewing area in accordance with the adjustment amount received from the stereoscopic viewing area adjustment unit 19.
  • the information display control device 10 includes the eye position acquisition unit 18 and the stereoscopic vision area adjustment unit 19.
  • the eye position acquisition unit 18 acquires information indicating the positions of both eyes of the occupant.
  • the stereoscopic viewing range adjustment unit 19 determines an adjustment amount for adjusting the stereoscopic viewing range of the HUD device 100 based on the positions of the occupant's eyes, and instructs the HUD device 100 on the determined adjustment amount.
  • the occurrence of 3D crosstalk and the like can be prevented, and information can be presented in an easy-to-understand manner.
  • although the information display control device 10 according to the fourth embodiment has a configuration in which the eye position acquisition unit 18 and the stereoscopic viewing area adjustment unit 19 are added to the information display control device 10 of the first embodiment, the eye position acquisition unit 18 and the stereoscopic viewing area adjustment unit 19 may instead be added to the information display control device 10 of the second or third embodiment.
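  • The publication states only that the stereoscopic viewing area adjustment unit 19 maps the positions of the occupant's eyes to an adjustment amount for the drive unit 102; the mapping itself is device dependent. The following minimal sketch assumes a simple proportional mapping from the offset of the eye midpoint to the mirror adjustment; all names and gains are illustrative.

```python
# Sketch of the stereoscopic viewing area adjustment unit 19 (Embodiment 4).
# The gains are illustrative; a real HUD would use a calibration specific
# to its mirror and optics (the drive unit 102 of FIG. 2).

GAIN_HORIZONTAL = 0.5   # adjustment units per meter of eye offset (assumed)
GAIN_VERTICAL = 0.5     # assumed

def stereoscopic_area_adjustment(eye_center_m: tuple[float, float],
                                 design_eye_point_m: tuple[float, float]
                                 ) -> tuple[float, float]:
    """Adjustment amount sent to the drive unit 102 so that the stereoscopic
    viewing area follows the midpoint of the occupant's eyes."""
    dx = eye_center_m[0] - design_eye_point_m[0]
    dy = eye_center_m[1] - design_eye_point_m[1]
    return (GAIN_HORIZONTAL * dx, GAIN_VERTICAL * dy)
```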
  • FIG. 13A and FIG. 13B are diagrams showing an example of a hardware configuration of the information display device according to each embodiment.
  • the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the depth of field tables 14 and 14a, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, and the eye position in the information display control device 10 Each function of the acquisition unit 18 and the stereoscopic vision area adjustment unit 19 is realized by a processing circuit. That is, the information display control device 10 includes a processing circuit for realizing the above functions.
  • the processing circuit may be the processing circuit 1 as dedicated hardware or may be the processor 2 that executes a program stored in the memory 3.
  • the processing circuit 1 or the processor 2 and the memory 3 are connected to the HUD device 100, the CAN 111, the car navigation system 112, the external sensor 113, the luminance meter 114, and the eye position detection device 115.
  • when the processing circuit is dedicated hardware, the processing circuit 1 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (application-specific integrated circuit), an FPGA (field-programmable gate array), or a combination thereof. The functions of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the depth of field tables 14 and 14a, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19 may be realized by a plurality of processing circuits 1, or the functions of the units may be realized collectively by a single processing circuit 1.
  • when the processing circuit is the processor 2, the functions of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19 are realized by software, firmware, or a combination of software and firmware.
  • the depth of field tables 14 and 14a are realized by the memory 3.
  • the software or firmware is written as a program and stored in the memory 3.
  • the processor 2 realizes the functions of the respective units by reading and executing the programs stored in the memory 3. That is, the information display control device 10 includes the memory 3 for storing programs which, when executed by the processor 2, result in the execution of the steps shown in the flowcharts described above.
  • it can also be said that these programs cause a computer to execute the procedures or methods of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19.
  • the processor 2 refers to a central processing unit (CPU), a processing device, an arithmetic device, a microprocessor, a microcomputer, or the like.
  • the memory 3 may be a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), or a flash memory, a magnetic disk such as a hard disk or a flexible disk, or an optical disc such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
  • the functions of the respective units described above may be partially realized by dedicated hardware and partially realized by software or firmware.
  • the processing circuit in the information display control device 10 can realize each of the functions described above by hardware, software, firmware, or a combination thereof.
  • since the information display control device according to the present invention prevents the occurrence of motion sickness and the like, it is suitable for use as an information display control device for mobile objects including vehicles, railways, ships, and aircraft, and is particularly suitable for on-vehicle use.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

In a case where the speed of a vehicle is equal to or lower than a predetermined speed, and where the maximum parallax amount distance (122), which is the distance at which a display image having the maximum parallax amount is visually recognized when displayed on the HUD device (100), falls within the depth of field, a control unit (15) controls the parallax amount of a display image generated by an image generation unit (16) to a value greater than zero.

Description

INFORMATION DISPLAY CONTROL DEVICE, INFORMATION DISPLAY DEVICE, AND INFORMATION DISPLAY CONTROL METHOD
 The present invention relates to an information display control device that controls a head-up display (hereinafter referred to as "HUD") device, an information display device, and an information display control method.
 Conventionally, AR (Augmented Reality) technology using a HUD or the like has been known. Also known is a three-dimensional (3D) HUD, which adjusts the position at which an image is perceived by controlling parallax.
 The information display control device described in Patent Document 1 controls the parallax of a 3D image in accordance with at least one of the state of the vehicle, information on the road on which the vehicle travels, and the like. This information display control device increases the parallax when the vehicle speed is high or the distance from the vehicle to the image is long, and reduces the parallax when the vehicle speed is low or the distance from the vehicle to the image is short.
 According to the 3D display technology described in Non-Patent Document 1, even when the distance from the driver to the image is as short as about 2 m, motion parallax becomes effective if the vehicle speed is 20 km/h or more, and the driver perceives the image as farther away than the actual 2 m image distance.
Patent Document 1: JP 2015-65614 A
 If the driver views a 3D image for a long time, motion sickness or the like may occur. In the information display control device described in Patent Document 1, the display time of the 3D image is expected to become long, so there is a problem that the driver may experience motion sickness or the like.
 Non-Patent Document 1 does not mention how to display an image in situations where the vehicle speed is below 20 km/h and motion parallax is therefore ineffective.
 The present invention has been made to solve the above problems, and its object is to minimize the display time of 3D images in order to prevent the occurrence of motion sickness and the like when a vehicle occupant views the display of a head-up display device.
 The information display control device according to the present invention is an information display control device that controls a head-up display device that causes a display image to be viewed stereoscopically or planarly, and includes: a speed acquisition unit that acquires information indicating the speed of a vehicle; a distance acquisition unit that acquires information indicating the distance from the vehicle to an object present in the traveling direction of the vehicle; a depth of field estimation unit that estimates the depth of field of the object as viewed from an occupant of the vehicle based on the distance from the vehicle to the object; an image generation unit that generates a display image to be displayed by the head-up display device in a state of being attached to the object; and a control unit that controls the parallax amount of the display image generated by the image generation unit to a value greater than zero when the speed of the vehicle is equal to or lower than a predetermined speed and the maximum parallax amount distance, which is the distance viewed when a display image having the maximum parallax amount is displayed on the head-up display device, is within the depth of field.
 According to the present invention, when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance of the head-up display device is within the depth of field, the parallax amount of the display image is controlled to a value greater than zero, so the display time of 3D images can be minimized.
 FIG. 1 is a block diagram showing a configuration example of the information display device according to Embodiment 1. FIG. 2 is a diagram showing a configuration example of the HUD device controlled by the information display control device according to Embodiment 1. FIG. 3 is a diagram showing an example of the depth of field table in Embodiment 1. FIG. 4A and FIG. 4B are diagrams showing examples of display images in Embodiment 1. FIG. 5A and FIG. 5B are diagrams for explaining an outline of the parallax amount control method by the control unit of Embodiment 1. FIG. 6 is a flowchart showing an operation example of the information display control device according to Embodiment 1. FIG. 7 is a flowchart showing an operation example of the information display control device according to Embodiment 2. FIG. 8 is a block diagram showing a configuration example of the information display device according to Embodiment 3. FIG. 9 is a diagram showing an example of the depth of field table in Embodiment 3. FIG. 10 is a flowchart showing an operation example of the information display control device according to Embodiment 3. FIG. 11 is a block diagram showing a configuration example of the information display device according to Embodiment 4. FIG. 12 is a flowchart showing an operation example of the information display control device according to Embodiment 4. FIG. 13A and FIG. 13B are diagrams showing an example of the hardware configuration of the information display device according to each embodiment.
 Hereinafter, in order to describe the present invention in more detail, embodiments for carrying out the present invention will be described with reference to the attached drawings.
Embodiment 1.
 FIG. 1 is a block diagram showing a configuration example of the information display device according to the first embodiment. As shown in FIG. 1, the information display device according to the first embodiment controls the display of the HUD device 100 mounted on a vehicle. The information display control device 10 includes a speed acquisition unit 11, a distance acquisition unit 12, a depth of field estimation unit 13, a control unit 15, and an image generation unit 16. In the example of FIG. 1, the information display control device 10 also includes a depth of field table 14.
 FIG. 2 is a diagram showing a configuration example of the HUD device 100 controlled by the information display control device 10 according to the first embodiment. The HUD device 100 shown in FIG. 2 includes a display unit 101, a drive unit 102, a reflecting mirror 103, and a lens 104. The HUD device 100 is disposed, for example, inside the dashboard of a vehicle. The display unit 101 includes a liquid crystal display or the like for displaying the display image generated by the image generation unit 16, and a parallax barrier, a lenticular lens, or the like disposed in front of it. The drive unit 102 is a drive mechanism for changing the attitude of the reflecting mirror 103, and will be described in detail in the fourth embodiment. The reflecting mirror 103 reflects the display image displayed on the display unit 101. The lens 104 enlarges or reduces the display image reflected by the reflecting mirror 103. Optical members such as the reflecting mirror 103 and the lens 104 project the display image displayed on the display unit 101 onto the windshield 105 or a combiner (not shown).
 The HUD device 100 is a 3D-HUD, and realizes an AR (Augmented Reality) display by displaying a 2D image or a 3D image superimposed on the actual landscape. The display image projected onto the windshield 105 or the like is visually recognized by the occupant of the vehicle in a state of being superimposed on the landscape ahead of the vehicle through the windshield 105. In the following, the driver 120 is used as an example of the occupant. When the parallax amount between the right-eye image and the left-eye image constituting the display image is zero, the display image is visually recognized as a 2D image at the position of the virtual image distance 121; that is, the driver 120 views the display image planarly. When the parallax amount between the right-eye image and the left-eye image is greater than zero, the display image is visually recognized as a 3D image at a position farther from the virtual image distance 121 as the parallax amount increases; that is, the driver 120 views the display image stereoscopically. Here, the virtual image distance 121 at which the HUD device 100 displays the display image, that is, the distance at which the driver 120 perceives a 2D image with zero parallax amount, is assumed to be 3 m, and the distance at which the display image with the maximum parallax amount is visually recognized as a 3D image (hereinafter referred to as the "maximum parallax amount distance 122") is assumed to be 10 m. The virtual image distance 121 and the maximum parallax amount distance 122 are distances that depend on the HUD device 100 and are not limited to 3 m and 10 m. The virtual image distance 121 and the maximum parallax amount distance 122 are assumed to be given to the control unit 15 in advance.
 The speed acquisition unit 11 shown in FIG. 1 acquires information indicating the speed of the vehicle from a CAN (Controller Area Network) 111 or the like and outputs it to the control unit 15.
 The distance acquisition unit 12 acquires information indicating the distance between the vehicle and an object present in the traveling direction of the vehicle from at least one of the car navigation device (hereinafter referred to as the "car navigation system") 112 or the external sensor 113, and outputs it to the control unit 15. The car navigation system 112 uses map information to detect, as an object, the nearest intersection or the like on the planned travel route of the vehicle, calculates the distance from the vehicle to the object, and outputs it to the distance acquisition unit 12. The external sensor 113 uses a camera, a distance sensor, or the like to detect another vehicle, a pedestrian, or the like present in the traveling direction of the vehicle as an object, calculates the distance from the vehicle to the object, and outputs it to the distance acquisition unit 12. The means for calculating the distance from the vehicle to the object is not limited to the car navigation system 112 and the external sensor 113.
 被写界深度推定部13は、距離取得部12が取得した車両から対象物までの距離を示す情報を、制御部15を経由して受け取る。被写界深度推定部13は、車両から対象物までの距離に基づいて、運転者から見た対象物の被写界深度を推定して制御部15へ出力する。被写界深度は、運転者が対象物を見ているときに運転者の眼のピントがあっていると想定される距離範囲である。図1の例では、被写界深度推定部13は、情報表示制御装置10が備える被写界深度テーブル14を参照して被写界深度を推定する。 The depth of field estimation unit 13 receives the information indicating the distance from the vehicle to the object acquired by the distance acquisition unit 12 via the control unit 15. The depth of field estimation unit 13 estimates the depth of field of the object viewed from the driver based on the distance from the vehicle to the object, and outputs the estimated depth to the control unit 15. The depth of field is a distance range in which it is assumed that the driver's eye is in focus when the driver is looking at an object. In the example of FIG. 1, the depth of field estimation unit 13 estimates the depth of field with reference to the depth of field table 14 included in the information display control device 10.
 図3は、実施の形態1における被写界深度テーブル14の一例を示す図である。被写界深度テーブル14は、車両から対象物までの距離と被写界深度とを対応付けて記憶している。例えば、車両から対象物までの距離が100mである場合、被写界深度推定部13は、被写界深度テーブル14を参照して被写界深度を10~100mと推定する。運転者は、100m先の対象物を見ている場合、前方10~100mまでの距離に存在するものであれば対象物と同時に違和感なく見ることができる。 FIG. 3 is a diagram showing an example of the depth of field table 14 according to the first embodiment. The depth of field table 14 stores the distance from the vehicle to the object and the depth of field in association with each other. For example, when the distance from the vehicle to the object is 100 m, the depth of field estimation unit 13 estimates the depth of field to be 10 to 100 m with reference to the depth of field table 14. When looking at an object 100 m ahead, the driver can view the object simultaneously with the object without discomfort as long as the object exists at a distance of 10 to 100 m ahead.
Note that the depth of field estimation method of the depth of field estimation unit 13 is not limited to the above method. For example, the depth of field estimation unit 13 may estimate the depth of field using a mathematical expression that defines the relationship between the distance from the vehicle to the object and the depth of field.
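For illustration, the table lookup described above might be sketched as follows. This is a minimal sketch, not part of the embodiment: only the 100 m entry corresponding to the 10 to 100 m range comes from the text, and the remaining rows, the step-wise lookup rule, and the function name are assumptions.

```python
# Minimal sketch of the table-based depth-of-field estimation.
# Only the 100 m -> (10 m, 100 m) row is taken from the text; the
# other rows and the step-wise lookup rule are assumptions.
DEPTH_OF_FIELD_TABLE = [
    # (object distance up to [m], (near limit [m], far limit [m]))
    (10.0, (2.0, 15.0)),
    (50.0, (5.0, 60.0)),
    (100.0, (10.0, 100.0)),
]

def estimate_depth_of_field(object_distance_m):
    """Return the (near, far) range in which the driver's eyes are
    assumed to be in focus while looking at the object."""
    for max_distance, dof_range in DEPTH_OF_FIELD_TABLE:
        if object_distance_m <= max_distance:
            return dof_range
    return DEPTH_OF_FIELD_TABLE[-1][1]  # beyond the table: widest entry
```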
The control unit 15 receives the information indicating the speed of the vehicle from the speed acquisition unit 11, the information indicating the distance from the vehicle to the object from the distance acquisition unit 12, and the information indicating the depth of field from the depth of field estimation unit 13. The control unit 15 controls the amount of parallax of the display image displayed on the display unit 101 of the HUD device 100 based on at least one of the speed of the vehicle, the distance from the vehicle to the object, and the depth of field, and outputs the amount of parallax to the image generation unit 16. Details of the control unit 15 will be described later.
The image generation unit 16 generates a display image to be displayed as if attached to the object, and outputs it to the display unit 101 of the HUD device 100 for display. At that time, the image generation unit 16 combines a right-eye image and a left-eye image into the display image based on the amount of parallax received from the control unit 15.
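As a rough illustration of one way the image generation unit 16 could combine the two images, the sketch below shifts a single source image horizontally by half the parallax in opposite directions. The embodiment does not specify the compositing method, so the pixel-shift approach, the numpy representation, and the edge handling are all simplifying assumptions.

```python
import numpy as np

def compose_stereo_pair(base_image, parallax_px):
    """Produce left- and right-eye images by shifting a monocular
    source image horizontally by half the parallax in opposite
    directions. With parallax_px == 0 the two images are identical,
    i.e. the display image is perceived as 2D."""
    half = parallax_px // 2
    # np.roll wraps pixels around the border; a real implementation
    # would pad instead, but this sketch ignores the edges.
    left_eye = np.roll(base_image, half, axis=1)
    right_eye = np.roll(base_image, -half, axis=1)
    return left_eye, right_eye
```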
FIGS. 4A and 4B are diagrams showing examples of the display images 123 and 124 in the first embodiment. When the object is an intersection as in FIG. 4A, the image generation unit 16 generates, as the display image 123, an arrow or the like indicating the traveling direction at the intersection. The display image 123 displayed by the HUD device 100 is visually recognized as superimposed on the intersection seen through the windshield 105. When the object is another vehicle, a pedestrian, or the like as in FIG. 4B, the image generation unit 16 generates, as the display image 124, a numerical value or the like indicating the distance from the vehicle to the other vehicle or pedestrian. The display image 124 displayed by the HUD device 100 is visually recognized as attached to the other vehicle or pedestrian seen through the windshield 105. The display images 123 and 124 are not limited to an arrow indicating the traveling direction, a numerical value indicating the distance to the object, and the like. As described later, the display images 123 and 124 may be 2D images or 3D images.
Next, an outline of how the control unit 15 controls the amount of parallax will be described with reference to FIGS. 2, 5A, and 5B.
FIGS. 5A and 5B are diagrams explaining the outline of the parallax amount control method of the control unit 15 of the first embodiment.
(1) When the speed of the vehicle is higher than a predetermined speed
The predetermined speed is the speed at which motion parallax becomes effective, for example 20 km/h, and is assumed to be given to the control unit 15. In case (1), the control unit 15 sets the amount of parallax of the display image to zero. The HUD device 100 displays a parallax-free 2D image at the virtual image distance 121. Even when the HUD device 100 displays the 2D image at the virtual image distance 121 close to the driver 120, motion parallax makes the driver 120 perceive the 2D image as being farther away than the virtual image distance 121. Therefore, even if the object is farther away than the virtual image distance 121, the driver 120 does not perceive a difference in distance between the 2D image and the object, and the 2D image is easy to view.
(2) When the speed of the vehicle is equal to or lower than the predetermined speed
The control unit 15 changes the amount of parallax between (2A) the case where the maximum parallax amount distance 122 is outside the depth of field of the object as in FIG. 5A, and (2B) the case where the maximum parallax amount distance 122 is within the depth of field of the object as in FIG. 5B.
(2A) When the maximum parallax amount distance 122 is outside the depth of field of the object
The control unit 15 sets the amount of parallax of the display image to zero. By displaying a parallax-free display image at the virtual image distance 121, the HUD device 100 causes the driver 120 to visually recognize a 2D image. If the HUD device 100 displayed a 3D image in case (2A), the driver 120 could not focus on the object and the 3D image at the same time, so the 3D image would be hard to see and motion sickness or the like could occur. Therefore, in case (2A), the control unit 15 causes the HUD device 100 to display a 2D image, thereby preventing motion sickness and the like. At that time, the information display control device 10 may supplement the content of the 2D image with voice guidance or the like.
(2B) When the maximum parallax amount distance 122 is within the depth of field of the object
The control unit 15 sets the amount of parallax of the display image to a value greater than zero. By displaying a display image whose amount of parallax is greater than zero at the virtual image distance 121, the HUD device 100 causes the driver 120 to visually recognize a 3D image at a position farther away than the virtual image distance 121 (for example, at the maximum parallax amount distance 122).
In this way, the information display control device 10 limits the display of 3D images to case (2B), minimizing the display time of 3D images and thereby preventing the driver 120 from experiencing motion sickness and the like. In addition, easy-to-understand information can be presented throughout travel.
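Cases (1), (2A), and (2B) can be condensed into a single decision function, sketched below. The predetermined speed, the maximum parallax amount distance 122, and the maximum amount of parallax are treated as configuration constants given in advance, as the text states; the normalized parallax unit and the function name are assumptions, and the case where the object is closer than the maximum parallax amount distance 122 (step ST109 of the operation example below) is omitted here.

```python
PREDETERMINED_SPEED_KMH = 20.0  # speed above which motion parallax is effective
MAX_PARALLAX_DISTANCE_M = 10.0  # maximum parallax amount distance 122
MAX_PARALLAX = 1.0              # maximum amount of parallax (normalized)

def decide_parallax(speed_kmh, dof_near_m, dof_far_m):
    """Return the parallax amount for the display image.

    Case (1):  speed above the threshold          -> zero parallax (2D).
    Case (2A): 10 m outside the depth of field    -> zero parallax (2D).
    Case (2B): 10 m within the depth of field     -> max parallax (3D).
    """
    if speed_kmh > PREDETERMINED_SPEED_KMH:
        return 0.0                                          # case (1)
    if dof_near_m <= MAX_PARALLAX_DISTANCE_M <= dof_far_m:
        return MAX_PARALLAX                                 # case (2B)
    return 0.0                                              # case (2A)
```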
Next, an operation example of the information display control device 10 will be described.
FIG. 6 is a flowchart showing an operation example of the information display control device 10 according to the first embodiment. The information display control device 10 repeats the operation shown in the flowchart of FIG. 6 until the vehicle arrives at its destination.
In step ST101, the speed acquisition unit 11 acquires information indicating the speed of the vehicle from the CAN 111 and outputs the information to the control unit 15.
In step ST102, the control unit 15 determines whether the speed of the vehicle is equal to or lower than the predetermined speed (for example, 20 km/h). If the speed of the vehicle is equal to or lower than the predetermined speed (step ST102 "YES"), the control unit 15 proceeds to step ST103; if the speed of the vehicle is higher than the predetermined speed (step ST102 "NO"), it proceeds to step ST108.
In step ST103, the distance acquisition unit 12 acquires information indicating the distance from the vehicle to the object from the car navigation system 112 or the vehicle exterior sensor 113, and outputs the information to the depth of field estimation unit 13 and the control unit 15.
In step ST104, the control unit 15 determines whether the distance from the vehicle to the object is equal to or greater than the maximum parallax amount distance 122. If the distance from the vehicle to the object is equal to or greater than the maximum parallax amount distance 122 (step ST104 "YES"), the control unit 15 proceeds to step ST105; if the distance from the vehicle to the object is smaller than the maximum parallax amount distance 122 (step ST104 "NO"), it proceeds to step ST109.
In step ST105, the depth of field estimation unit 13 estimates the depth of field based on the distance from the vehicle to the object, and outputs the estimate to the control unit 15.
In step ST106, the control unit 15 determines whether the maximum parallax amount distance 122 is within the range of the depth of field. If the maximum parallax amount distance 122 is within the range of the depth of field as in FIG. 5B (step ST106 "YES"), the control unit 15 proceeds to step ST107; if the maximum parallax amount distance 122 is outside the range of the depth of field as in FIG. 5A (step ST106 "NO"), it proceeds to step ST108.
In step ST107, the control unit 15 sets the amount of parallax to the maximum amount of parallax. The maximum amount of parallax is the value corresponding to the maximum parallax amount distance 122 and is assumed to be given to the control unit 15 in advance. The control unit 15 outputs the maximum amount of parallax to the image generation unit 16.
In step ST108, the control unit 15 sets the amount of parallax to zero and outputs the zero amount of parallax to the image generation unit 16.
In step ST109, the control unit 15 sets the amount of parallax to a value between zero and the maximum amount of parallax according to the distance from the vehicle to the object. That is, the control unit 15 reduces the amount of parallax as the vehicle approaches the object so that the distance at which the display image generated by the image generation unit 16 is visually recognized becomes shorter. For example, in the case of the HUD device 100 whose virtual image distance 121 is 3 m and whose maximum parallax amount distance 122 is 10 m, the control unit 15 sets the amount of parallax to the value corresponding to 5 m when the distance from the vehicle to the object is 5 m, and sets the amount of parallax to zero when the distance from the vehicle to the object is equal to or less than the virtual image distance 121, such as 2 m.
In step ST110, the image generation unit 16 generates a display image based on the amount of parallax designated by the control unit 15.
In step ST111, the image generation unit 16 outputs the generated display image to the display unit 101 of the HUD device 100. The display unit 101 displays the display image received from the image generation unit 16.
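The distance-dependent control of step ST109 can be sketched as follows, using the 3 m and 10 m example values. The linear interpolation between zero and the maximum amount of parallax is an assumption; the text only states that the parallax amount corresponds to the distance.

```python
VIRTUAL_IMAGE_DISTANCE_M = 3.0   # virtual image distance 121
MAX_PARALLAX_DISTANCE_M = 10.0   # maximum parallax amount distance 122
MAX_PARALLAX = 1.0               # maximum amount of parallax (normalized)

def parallax_for_object_distance(object_distance_m):
    """Map the vehicle-to-object distance to a parallax amount so that
    the perceived image distance tracks the object as it approaches."""
    if object_distance_m <= VIRTUAL_IMAGE_DISTANCE_M:
        return 0.0              # at or inside the virtual image: 2D (e.g. 2 m)
    if object_distance_m >= MAX_PARALLAX_DISTANCE_M:
        return MAX_PARALLAX     # clamped at the maximum (step ST107)
    span = MAX_PARALLAX_DISTANCE_M - VIRTUAL_IMAGE_DISTANCE_M
    ratio = (object_distance_m - VIRTUAL_IMAGE_DISTANCE_M) / span
    return MAX_PARALLAX * ratio  # assumed linear interpolation

# Example: an object 5 m ahead yields the parallax corresponding to 5 m
# (about 0.29 * MAX_PARALLAX under the linear assumption).
```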
As described above, the information display control device 10 according to the first embodiment includes the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the image generation unit 16, and the control unit 15. The speed acquisition unit 11 acquires information indicating the speed of the vehicle. The distance acquisition unit 12 acquires information indicating the distance from the vehicle to an object present in the traveling direction of the vehicle. The depth of field estimation unit 13 estimates the depth of field of the object as seen from an occupant of the vehicle based on the distance from the vehicle to the object. The image generation unit 16 generates a display image to be displayed by the HUD device 100 as if attached to the object. When the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122, which is the distance visually recognized when a display image with the maximum amount of parallax is displayed on the HUD device 100, is within the depth of field, the control unit 15 sets the amount of parallax of the display image generated by the image generation unit 16 to a value greater than zero. Since a 3D image is displayed only when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122 is within the depth of field (case (2B)), the display time of 3D images can be minimized, which prevents the occupant from experiencing motion sickness and the like.
In addition, when the speed of the vehicle is equal to or lower than the predetermined speed and the distance from the vehicle to the object is smaller than the maximum parallax amount distance 122, the control unit 15 of the first embodiment reduces the amount of parallax as the vehicle approaches the object so that the distance at which the display image is visually recognized when displayed on the HUD device 100 becomes shorter. This enables easy-to-understand information presentation.
Furthermore, when the speed of the vehicle is higher than the predetermined speed, or when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance 122 is outside the depth of field, the control unit 15 of the first embodiment sets the amount of parallax of the display image generated by the image generation unit 16 to zero. When the speed of the vehicle is higher than the predetermined speed (case (1)), motion parallax makes the driver perceive the 2D image as being farther away than the virtual image distance 121, which enables easy-to-understand information presentation. When the maximum parallax amount distance 122 is outside the depth of field (case (2A)), the driver cannot focus on the object and a 3D image at the same time and is prone to motion sickness and the like, so a 2D image is displayed instead, which prevents motion sickness and the like.
Second Embodiment.
The configuration of the information display control device 10 according to the second embodiment is identical in the drawings to the configuration shown in FIG. 1 of the first embodiment, so FIG. 1 is referred to below. The control unit 15 of the second embodiment adjusts at least one of the display position and the size of the display image generated by the image generation unit 16 based on the distance from the vehicle to the object.
FIG. 7 is a flowchart showing an operation example of the information display control device 10 according to the second embodiment. The operations of steps ST101 to ST109 and ST111 shown in the flowchart of FIG. 7 are the same as the operations of steps ST101 to ST109 and ST111 shown in the flowchart of FIG. 6, so their description is omitted.
In step ST201, the control unit 15 determines the position of the display image generated by the image generation unit 16 using the information indicating the distance from the vehicle to the object acquired from the car navigation system 112 or the vehicle exterior sensor 113. Specifically, the control unit 15 raises the display position of the display image displayed at the virtual image distance 121 as the distance from the vehicle to the object becomes longer, and lowers it as the distance becomes shorter. The control unit 15 outputs information on the determined display position to the image generation unit 16.
In step ST202, the control unit 15 determines the size of the display image generated by the image generation unit 16 using the information indicating the distance from the vehicle to the object acquired from the car navigation system 112 or the vehicle exterior sensor 113. Specifically, the control unit 15 makes the display image displayed at the virtual image distance 121 smaller as the distance from the vehicle to the object becomes longer, and larger as the distance becomes shorter. The control unit 15 outputs information on the determined size of the display image to the image generation unit 16.
In step ST203, the image generation unit 16 generates a display image based on the amount of parallax, the display position, and the size designated by the control unit 15.
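Steps ST201 and ST202 can be sketched together as follows. The linear relations, the clamping range of 3 to 100 m, and the assumed size range are illustrative assumptions; the embodiment only states the directions of the adjustments.

```python
def position_and_size(object_distance_m, min_d=3.0, max_d=100.0):
    """Return (vertical_position, scale), both normalized to [0, 1].
    A far object is drawn higher and smaller on the virtual image
    plane; as the vehicle approaches, the image moves down and grows."""
    d = max(min_d, min(object_distance_m, max_d))
    t = (d - min_d) / (max_d - min_d)  # 0 = nearest, 1 = farthest
    vertical_position = t              # higher when far (assumed linear)
    scale = 1.0 - 0.8 * t              # assumed size range: 0.2 to 1.0
    return vertical_position, scale
```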
As described above, the control unit 15 of the second embodiment lowers the display position of the display image generated by the image generation unit 16 as the vehicle approaches the object. As a result, the position of the display image visually recognized by the driver can be matched to the position of the object while the vehicle is traveling, enabling easy-to-understand information presentation.
In addition, the control unit 15 of the second embodiment enlarges the display image generated by the image generation unit 16 as the vehicle approaches the object. As a result, the apparent size of the display image visually recognized by the driver can be matched to the object while the vehicle is traveling, enabling easy-to-understand information presentation.
Third Embodiment.
FIG. 8 is a block diagram showing a configuration example of the information display device according to the third embodiment. The information display control device 10 according to the third embodiment has a configuration in which a brightness acquisition unit 17 is added to the information display control device 10 of the first embodiment shown in FIG. 1, and uses a luminance meter 114 mounted on the vehicle. In FIG. 8, parts identical or corresponding to those in FIG. 1 are given the same reference numerals, and their description is omitted.
The brightness acquisition unit 17 acquires information indicating the brightness around the vehicle from the luminance meter 114 or the like, and outputs the information to the depth of field estimation unit 13. The luminance meter 114 detects the luminance around the vehicle and outputs the detected luminance to the brightness acquisition unit 17 as information indicating the brightness around the vehicle. Note that the brightness around the vehicle is not limited to luminance and may be illuminance or the like.
The depth of field estimation unit 13 receives the information indicating the brightness around the vehicle from the brightness acquisition unit 17. Based on the distance from the vehicle to the object and the brightness around the vehicle, the depth of field estimation unit 13 estimates the depth of field of the object as seen from an occupant of the vehicle, and outputs the estimate to the control unit 15. In the example of FIG. 8, the depth of field estimation unit 13 estimates the depth of field by referring to the depth of field table 14a included in the information display control device 10.
FIG. 9 is a diagram showing an example of the depth of field table 14a in the third embodiment. The depth of field table 14a stores distances from the vehicle to the object in association with depths of field for the cases where the surroundings of the vehicle are relatively bright and relatively dark. In this case, the depth of field estimation unit 13 determines "bright" when the brightness around the vehicle is equal to or greater than a predetermined threshold, and "dark" when it is below the threshold. The brighter the surroundings of the vehicle, the narrower the range of the depth of field; the darker the surroundings, the wider the range. For example, when the distance from the vehicle to the object is 100 m and the surroundings of the vehicle are bright, the depth of field estimation unit 13 refers to the depth of field table 14a and estimates the depth of field to be 8 to 100 m.
Note that the depth of field estimation method of the depth of field estimation unit 13 is not limited to the above method. For example, the depth of field estimation unit 13 may estimate the depth of field using a mathematical expression that defines the relationship among the distance from the vehicle to the object, the brightness around the vehicle, and the depth of field.
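The brightness-dependent lookup of FIG. 9 might be sketched as follows. The luminance threshold, the "dark" ranges, and the first table row are assumptions; only the bright 100 m entry giving 8 to 100 m is taken from the text.

```python
BRIGHTNESS_THRESHOLD = 50.0  # assumed value of the predetermined threshold

# (object distance up to [m], bright (near, far) [m], dark (near, far) [m]);
# only the bright 100 m -> (8 m, 100 m) entry comes from the text.
DOF_TABLE_14A = [
    (10.0, (2.5, 12.0), (1.5, 18.0)),
    (100.0, (8.0, 100.0), (5.0, 100.0)),
]

def estimate_dof_with_brightness(object_distance_m, luminance):
    """Select the narrower 'bright' range or the wider 'dark' range."""
    is_bright = luminance >= BRIGHTNESS_THRESHOLD
    for max_distance, bright_range, dark_range in DOF_TABLE_14A:
        if object_distance_m <= max_distance:
            return bright_range if is_bright else dark_range
    last = DOF_TABLE_14A[-1]
    return last[1] if is_bright else last[2]
```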
FIG. 10 is a flowchart showing an operation example of the information display control device 10 according to the third embodiment. The operations of steps ST101 to ST104 and ST106 to ST111 shown in the flowchart of FIG. 10 are the same as the operations of steps ST101 to ST104 and ST106 to ST111 shown in the flowchart of FIG. 6, so their description is omitted.
In step ST301, the brightness acquisition unit 17 acquires information indicating the brightness around the vehicle from the luminance meter 114, and outputs the information to the depth of field estimation unit 13.
In step ST302, the depth of field estimation unit 13 estimates the depth of field based on the distance from the vehicle to the object and the brightness around the vehicle, and outputs the estimate to the control unit 15.
As described above, the information display control device 10 according to the third embodiment includes the brightness acquisition unit 17 that acquires information indicating the brightness around the vehicle. The depth of field estimation unit 13 estimates the depth of field of the object as seen from an occupant of the vehicle based on the distance from the vehicle to the object and the brightness around the vehicle. Since the depth of field estimation unit 13 takes the brightness around the vehicle into account when estimating the depth of field, the estimation accuracy is improved, and the control unit 15 can control the amount of parallax with high accuracy using the accurately estimated depth of field.
Note that the information display control device 10 according to the third embodiment has a configuration in which the brightness acquisition unit 17 is added to the information display control device 10 according to the first embodiment, but the brightness acquisition unit 17 may instead be added to the information display control device 10 according to the second embodiment.
Fourth Embodiment.
FIG. 11 is a block diagram showing a configuration example of the information display device according to the fourth embodiment. The information display control device 10 according to the fourth embodiment has a configuration in which an eye position acquisition unit 18 and a stereoscopic viewing area adjustment unit 19 are added to the information display control device 10 of the first embodiment shown in FIG. 1, and uses an eye position detection device 115 mounted on the vehicle. In FIG. 11, parts identical or corresponding to those in FIG. 1 are given the same reference numerals, and their description is omitted. FIG. 2 is also referred to in the fourth embodiment.
The eye position acquisition unit 18 acquires information indicating the positions of both eyes of an occupant such as the driver from the eye position detection device 115 or the like, and outputs the information to the stereoscopic viewing area adjustment unit 19. The eye position detection device 115 is a DMS (Driver Monitoring System) or the like, and detects the positions of both eyes of the driver 120 in FIG. 2 and outputs them to the eye position acquisition unit 18.
The stereoscopic viewing area adjustment unit 19 determines an adjustment amount for at least one of the vertical direction and the horizontal direction of the stereoscopic viewing area of the HUD device 100 based on the positions of both eyes of the occupant, and outputs information indicating the determined adjustment amount to the drive unit 102 of the HUD device 100. The stereoscopic viewing area is the area in which the driver can visually recognize, as a 3D image, a display image with an amount of parallax greater than zero displayed by the HUD device 100 at the position of the virtual image distance 121. If both eyes of the driver 120 move out of the stereoscopic viewing area, 3D crosstalk or the like occurs and the driver 120 can no longer correctly view the 3D image.
The drive unit 102 receives the information indicating the adjustment amount of the stereoscopic viewing area from the stereoscopic viewing area adjustment unit 19, and moves the position of the stereoscopic viewing area in at least one of the vertical direction and the horizontal direction by changing the attitude of the reflecting mirror 103 according to the designated adjustment amount. In the HUD device 100 configured as shown in FIG. 2, the drive unit 102 adjusts the stereoscopic viewing area by changing the attitude of the reflecting mirror 103, but the adjustment method is not limited to this; the stereoscopic viewing area may be adjusted by any method suited to the configuration of the HUD device 100.
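One possible sketch of the computation in the stereoscopic viewing area adjustment unit 19 is shown below. The proportional mapping from eye offset to mirror angle and the gain values are assumptions; the actual relation depends on the optical design of the HUD device 100.

```python
# Assumed proportional gains from eye-position offset [mm] to mirror
# adjustment angle [deg]; real values depend on the HUD optics.
GAIN_VERTICAL = 0.02
GAIN_HORIZONTAL = 0.02

def viewing_area_adjustment(eye_midpoint_mm, nominal_midpoint_mm):
    """Return (vertical, horizontal) mirror adjustment angles that move
    the stereoscopic viewing area toward the midpoint of the occupant's
    eyes, keeping both eyes inside the zone to avoid 3D crosstalk."""
    dx = eye_midpoint_mm[0] - nominal_midpoint_mm[0]
    dy = eye_midpoint_mm[1] - nominal_midpoint_mm[1]
    return dy * GAIN_VERTICAL, dx * GAIN_HORIZONTAL
```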
FIG. 12 is a flowchart showing an operation example of the information display control device 10 according to the fourth embodiment. The operations of steps ST101 to ST111 shown in the flowchart of FIG. 12 are the same as the operations of steps ST101 to ST111 shown in the flowchart of FIG. 6, so their description is omitted.
In step ST401, the eye position acquisition unit 18 acquires information indicating the positions of both eyes of the driver from the eye position detection device 115, and outputs the information to the stereoscopic viewing area adjustment unit 19.
In step ST402, the stereoscopic viewing area adjustment unit 19 determines an adjustment amount for adjusting the stereoscopic viewing area of the HUD device 100 based on the positions of both eyes of the driver.
In step ST403, the stereoscopic viewing area adjustment unit 19 outputs the determined adjustment amount to the drive unit 102 of the HUD device 100. The drive unit 102 adjusts the stereoscopic viewing area according to the adjustment amount received from the stereoscopic viewing area adjustment unit 19.
As described above, the information display control device 10 according to the fourth embodiment includes the eye position acquisition unit 18 and the stereoscopic viewing area adjustment unit 19. The eye position acquisition unit 18 acquires information indicating the positions of both eyes of the occupant. The stereoscopic viewing area adjustment unit 19 determines an adjustment amount for adjusting the stereoscopic viewing area of the HUD device 100 based on the positions of both eyes of the occupant, and instructs the HUD device 100 with the determined adjustment amount. This prevents the occurrence of 3D crosstalk and the like and enables easy-to-understand information presentation.
Note that the information display control device 10 according to the fourth embodiment has a configuration in which the eye position acquisition unit 18 and the stereoscopic viewing area adjustment unit 19 are added to the information display control device 10 according to the first embodiment, but they may instead be added to the information display control device 10 according to the second or third embodiment.
Finally, the hardware configuration of the information display device according to each embodiment will be described.
FIGS. 13A and 13B are diagrams showing examples of the hardware configuration of the information display device according to each embodiment. The functions of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the depth of field tables 14 and 14a, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19 in the information display control device 10 are realized by a processing circuit. That is, the information display control device 10 includes a processing circuit for realizing the above functions. The processing circuit may be a processing circuit 1 as dedicated hardware, or a processor 2 that executes a program stored in a memory 3. The processing circuit 1, or the processor 2 and the memory 3, are connected to the HUD device 100, the CAN 111, the car navigation system 112, the vehicle exterior sensor 113, the luminance meter 114, and the eye position detection device 115.
As shown in FIG. 13A, when the processing circuit is dedicated hardware, the processing circuit 1 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof. The functions of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the depth of field tables 14 and 14a, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19 may be realized by a plurality of processing circuits 1, or the functions of the respective units may be realized collectively by a single processing circuit 1.
As shown in FIG. 13B, when the processing circuit is the processor 2, the functions of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19 are realized by software, firmware, or a combination of software and firmware. The depth of field tables 14 and 14a are realized by the memory 3. The software or firmware is written as programs and stored in the memory 3. The processor 2 realizes the functions of the respective units by reading and executing the programs stored in the memory 3. That is, the information display control device 10 includes the memory 3 for storing programs that, when executed by the processor 2, result in the execution of the steps shown in the flowchart of FIG. 6 and the like. It can also be said that these programs cause a computer to execute the procedures or methods of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19.
Here, the processor 2 refers to a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, a microcomputer, or the like.
The memory 3 may be a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), or a flash memory; a magnetic disk such as a hard disk or a flexible disk; or an optical disc such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
Note that some of the functions of the speed acquisition unit 11, the distance acquisition unit 12, the depth of field estimation unit 13, the depth of field tables 14 and 14a, the control unit 15, the image generation unit 16, the brightness acquisition unit 17, the eye position acquisition unit 18, and the stereoscopic viewing area adjustment unit 19 may be realized by dedicated hardware and others by software or firmware. In this way, the processing circuit in the information display control device 10 can realize each of the above functions by hardware, software, firmware, or a combination thereof.
Within the scope of the present invention, the embodiments may be freely combined, any component of each embodiment may be modified, and any component of each embodiment may be omitted.
Since the information display control device according to the present invention prevents the occurrence of motion sickness and the like, it is suitable for use as an information display control device for moving bodies including vehicles, railways, ships, aircraft, and the like, and in particular as an in-vehicle information display control device.
Reference Signs List: 1 processing circuit, 2 processor, 3 memory, 10 information display control device, 11 speed acquisition unit, 12 distance acquisition unit, 13 depth of field estimation unit, 14, 14a depth of field table, 15 control unit, 16 image generation unit, 17 brightness acquisition unit, 18 eye position acquisition unit, 19 stereoscopic viewing area adjustment unit, 100 HUD device, 101 display unit, 102 drive unit, 103 reflecting mirror, 104 lens, 105 windshield, 111 CAN, 112 car navigation system, 113 vehicle exterior sensor, 114 luminance meter, 115 eye position detection device, 120 driver, 121 virtual image distance, 122 maximum parallax amount distance, 123, 124 display image.

Claims (9)

1.  An information display control device for controlling a head-up display device that causes a display image to be viewed stereoscopically and in plan view, the information display control device comprising:
     a speed acquisition unit to acquire information indicating a speed of a vehicle;
     a distance acquisition unit to acquire information indicating a distance from the vehicle to an object present in a traveling direction of the vehicle;
     a depth of field estimation unit to estimate a depth of field of the object as seen from an occupant of the vehicle on a basis of the distance from the vehicle to the object;
     an image generation unit to generate a display image to be displayed by the head-up display device as if attached to the object; and
     a control unit to control an amount of parallax of the display image generated by the image generation unit to a value greater than zero when the speed of the vehicle is equal to or lower than a predetermined speed and a maximum parallax amount distance, which is a distance visually recognized when a display image having a maximum amount of parallax is displayed on the head-up display device, is within the depth of field.
2.  The information display control device according to claim 1, wherein, when the speed of the vehicle is equal to or lower than the predetermined speed and the distance from the vehicle to the object is smaller than the maximum parallax amount distance, the control unit reduces the amount of parallax as the vehicle approaches the object so that a distance visually recognized when the display image is displayed on the head-up display device becomes shorter.
3.  The information display control device according to claim 1, wherein the control unit sets the amount of parallax of the display image generated by the image generation unit to zero when the speed of the vehicle is higher than the predetermined speed, or when the speed of the vehicle is equal to or lower than the predetermined speed and the maximum parallax amount distance is outside the depth of field.
4.  The information display control device according to claim 1, wherein the control unit lowers a display position of the display image generated by the image generation unit as the vehicle approaches the object.
5.  The information display control device according to claim 1, wherein the control unit enlarges the display image generated by the image generation unit as the vehicle approaches the object.
6.  The information display control device according to claim 1, further comprising a brightness acquisition unit to acquire information indicating brightness around the vehicle,
     wherein the depth of field estimation unit estimates the depth of field of the object as seen from the occupant of the vehicle on a basis of the distance from the vehicle to the object and the brightness around the vehicle.
7.  The information display control device according to claim 1, further comprising:
     an eye position acquisition unit to acquire information indicating positions of both eyes of the occupant; and
     a stereoscopic viewing area adjustment unit to determine an adjustment amount for adjusting a stereoscopic viewing area of the head-up display device on a basis of the positions of both eyes of the occupant, and to instruct the head-up display device with the determined adjustment amount.
8.  An information display device comprising:
     a head-up display device to cause a display image to be viewed stereoscopically and in plan view; and
     the information display control device according to claim 1.
9.  An information display control method for controlling a head-up display device that causes a display image to be viewed stereoscopically and in plan view, the method comprising the steps of:
     acquiring, by a speed acquisition unit, information indicating a speed of a vehicle;
     acquiring, by a distance acquisition unit, information indicating a distance from the vehicle to an object present in a traveling direction of the vehicle;
     estimating, by a depth of field estimation unit, a depth of field of the object as seen from an occupant of the vehicle on a basis of the distance from the vehicle to the object;
     generating, by an image generation unit, a display image to be displayed by the head-up display device as if attached to the object; and
     controlling, by a control unit, an amount of parallax of the display image generated by the image generation unit to a value greater than zero when the speed of the vehicle is equal to or lower than a predetermined speed and a maximum parallax amount distance, which is a distance visually recognized when a display image having a maximum amount of parallax is displayed on the head-up display device, is within the depth of field.

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09218376A (en) * 1996-02-09 1997-08-19 Sharp Corp Stereoscopic display device
JP2009008722A (en) * 2007-06-26 2009-01-15 Univ Of Tsukuba Three-dimensional head up display device
JP2011107382A (en) * 2009-11-17 2011-06-02 Nippon Seiki Co Ltd Display device for vehicle
JP2015048007A (en) * 2013-09-03 2015-03-16 株式会社デンソー Information display device
JP2015074391A (en) * 2013-10-10 2015-04-20 アイシン・エィ・ダブリュ株式会社 Head-up display device
JP2016102966A (en) * 2014-11-28 2016-06-02 アイシン・エィ・ダブリュ株式会社 Virtual image display device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09218376A (en) * 1996-02-09 1997-08-19 Sharp Corp Stereoscopic display device
JP2009008722A (en) * 2007-06-26 2009-01-15 Univ Of Tsukuba Three-dimensional head up display device
JP2011107382A (en) * 2009-11-17 2011-06-02 Nippon Seiki Co Ltd Display device for vehicle
JP2015048007A (en) * 2013-09-03 2015-03-16 株式会社デンソー Information display device
JP2015074391A (en) * 2013-10-10 2015-04-20 アイシン・エィ・ダブリュ株式会社 Head-up display device
JP2016102966A (en) * 2014-11-28 2016-06-02 アイシン・エィ・ダブリュ株式会社 Virtual image display device
