WO2015064371A1 - Vehicle information projection system and projection device - Google Patents

Vehicle information projection system and projection device

Info

Publication number
WO2015064371A1
WO2015064371A1 (PCT/JP2014/077562; JP2014077562W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
projection system
vehicle information
trigger signal
Prior art date
Application number
PCT/JP2014/077562
Other languages
English (en)
Japanese (ja)
Inventor
毅 笠原
Original Assignee
日本精機株式会社 (Nippon Seiki Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Publication of WO2015064371A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00: Details of colour television systems
    • H04N9/12: Picture reproducers
    • H04N9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179: Video signal processing therefor
    • H04N9/3185: Geometric adjustment, e.g. keystone or convergence
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B2027/011: Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • The present invention relates to a vehicle information projection system that projects a predetermined information image so that a vehicle occupant visually recognizes it as a virtual image in front of the vehicle, and to a projection device used in such a system.
  • As such a system, one using a head-up display (HUD) device, which is a projection device as disclosed in Patent Document 1, is known.
  • Such a HUD device projects an image onto the windshield of a vehicle so that a viewer (occupant) visually recognizes a virtual image indicating predetermined information together with the real scene outside the vehicle. When route guidance is displayed, for example, the occupant can confirm the route with little movement of the line of sight while viewing the actual scene.
  • Because the windshield onto which the HUD device projects an image generally has a curved surface, the projected virtual image would be distorted; to counter this, a warping process as disclosed in Patent Document 2 is performed.
  • In this warping process, a coordinate conversion table corresponding to the relative position between the windshield and the occupant's viewpoint is stored in memory in advance; the coordinate conversion table is read out according to that relative position, and the distortion generated in the virtual image projected onto the windshield is corrected by distorting the image in advance according to the table and displaying the distorted image on the display (warping process).
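  • The following is only a rough, self-contained sketch of such a warping process (the array names and the nearest-neighbour sampling are illustrative assumptions, not the patent's implementation): a per-pixel coordinate conversion table map_x/map_y is applied to predistort the image before it is shown on the display.

```python
import numpy as np

def warp_with_table(src: np.ndarray, map_x: np.ndarray, map_y: np.ndarray) -> np.ndarray:
    """Predistort `src` with a coordinate conversion table.

    For each output pixel (u, v), map_x[v, u] / map_y[v, u] give the source
    coordinates to sample (nearest-neighbour lookup for brevity).
    """
    xs = np.clip(np.rint(map_x).astype(int), 0, src.shape[1] - 1)
    ys = np.clip(np.rint(map_y).astype(int), 0, src.shape[0] - 1)
    return src[ys, xs]  # gather: out[v, u] = src[ys[v, u], xs[v, u]]

# An identity table leaves the image unchanged.
img = np.random.rand(480, 640)
map_x, map_y = np.meshgrid(np.arange(640, dtype=float), np.arange(480, dtype=float))
assert np.allclose(warp_with_table(img, map_x, map_y), img)
```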
  • However, the HUD device described in Patent Document 2 stores a coordinate conversion table for distortion correction only for a predetermined relative position between the windshield and the viewpoint, whereas the actual viewpoint position varies with the viewer's physique and posture. To correct distortion accurately, a coordinate conversion table would be required for each of these various viewpoint positions, and storing all of them poses the problem of an increased memory capacity.
  • The present invention has been made in view of the above problems, and an object thereof is to provide a vehicle information projection system and a projection device that allow an image of high display quality to be visually recognized at a position adapted to the outside scene even when the viewpoint position changes, while reducing the required memory capacity.
  • To achieve this object, a vehicle information projection system according to the present invention includes: a display that displays a display image; an optical member that projects display light, obtained by deforming image light representing the display image, onto a transmissive/reflective curved surface; a trigger output unit that outputs a trigger signal; a storage unit that stores reference warping parameters, each associated with one of a plurality of reference trigger signals among the possible trigger signals, for predistorting the display image displayed on the display; and a display controller that receives the trigger signal from the trigger output unit, interpolates, using the reference warping parameters, a warping parameter corresponding to a trigger signal other than the reference trigger signals, and displays the display image on the display after distorting it in advance.
  • A projection device according to the present invention is a projection device used in the above vehicle information projection system, and includes: a display that displays a display image; an optical member that projects display light, obtained by deforming image light representing the display image, onto the transmissive/reflective curved surface; and an actuator that rotates and/or moves the optical member, whereby the position of the virtual image visually recognized through the transmissive/reflective curved surface can be adjusted.
  • According to the present invention, an image with high display quality can be made visible at a position adapted to the outside scene even when the viewpoint position changes, while reducing the required memory capacity.
  • A vehicle information projection system 1 according to a first embodiment of the present invention will be described with reference to the figures.
  • In the following description, the axis along the left-right direction as viewed from the occupant 3 who views the virtual image V is the X-axis, the axis along the vertical direction is the Y-axis, and the axis along the line-of-sight direction of the occupant 3, perpendicular to both the X-axis and the Y-axis, is the Z-axis. The direction in which the arrow of each of the X, Y, and Z axes points is the + (plus) direction of that axis, and the opposite direction is referred to as the - (minus) direction as appropriate.
  • FIG. 1 shows a system configuration of a vehicle information projection system 1 according to the first embodiment.
  • The vehicle information projection system 1 includes: a head-up display device (hereinafter, HUD device) 100, which is a projection device that projects display light L (display light La, Lb, Lc in FIG. 2) representing the virtual image V (virtual images Va, Vb, Vc in FIG. 2) onto the windshield 2a of the host vehicle 2 so that the occupant (viewer) 3 of the host vehicle 2 visually recognizes the virtual image V; a viewpoint position detection unit 210 that captures the face of the occupant 3 with a camera 210a and performs image analysis; an operation input unit 220 that can be operated by the occupant 3; a vehicle ECU 300 provided in the host vehicle 2; and a display controller 400 that controls the display of the display 10. These components are connected by a bus 500 so that electric signals can be exchanged among them.
  • The HUD device (projection device) 100 is mounted on the host vehicle 2, for example, and, as shown in the figure, includes a display 10, a plane mirror 20, a free-form surface mirror 30, and an actuator 40; the display light L is projected onto the windshield 2a, thereby causing the occupant 3 to visually recognize the virtual image V.
  • The HUD device 100 adjusts the position where the display light L is projected based on the eye positions 3a, 3b, and 3c of the occupant 3 (that is, it adjusts the position of the virtual image V viewed by the occupant 3). Specifically, when the occupant 3 looks forward from inside the host vehicle 2, the projection position is adjusted so that the relative positional relationship between a specific target W outside the host vehicle 2 and the virtual image V visually recognized by the occupant 3 remains constant even if the eye position of the occupant 3 differs.
  • The HUD device 100 reflects the image light K representing the display image J displayed on the display 10 toward the free-form surface mirror 30 by the plane mirror 20; the free-form surface mirror 30 reflects the incident image light K toward the windshield 2a as the display light L, which is emitted to the outside of the HUD device 100 through the transmission part 50b. The display light L emitted from the HUD device 100 is projected onto the windshield 2a in front of the occupant 3, and the virtual image V corresponding to the display image J is visually recognized by the occupant 3.
  • the contents displayed by the HUD device 100 in this way are various vehicle information, navigation information, and the like.
  • The display 10 emits image light K representing a display image J that conveys predetermined information (various vehicle information, navigation information, etc.).
  • The display 10 is, for example, a transmissive liquid crystal display composed of a display element (not shown) such as a liquid crystal panel and a light source (not shown) for illuminating the display element. The display 10 is not limited to a transmissive liquid crystal display, and may instead be composed of a self-luminous organic EL display, a reflective DMD (Digital Micromirror Device), a reflective/transmissive LCOS (registered trademark: Liquid Crystal On Silicon), or the like.
  • The display 10 is connected by wiring (the bus 500) to the display controller 400 described later; the backlight light source emits the desired light, and the liquid crystal panel displays the desired display image J on its display surface, each in accordance with electrical signals from the bus 500 (display control means 406).
  • The display image J displayed by the display 10 is deformed in advance, taking into account the curved surface of the windshield 2a and the curved surface of the free-form surface mirror 30, so that the occupant 3 can visually recognize the virtual image V through the windshield 2a without distortion. A method of generating the display image J deformed in advance will be described in detail later.
  • the plane mirror 20 reflects the image light K emitted from the display 10 toward the free-form curved mirror 30.
  • the planar mirror 20 is a reflecting member having a plane that folds the image light K emitted from the display 10 to the free-form curved mirror 30, but the present invention is not limited thereto, and a curved mirror may be used.
  • another reflective optical member, a refractive optical member such as a lens, or the like may be appropriately added between the display 10 and the free-form surface mirror 30 as necessary.
  • the free-form surface mirror 30 is formed by forming a reflective film on the surface of a concave base material made of, for example, a synthetic resin material by means such as vapor deposition, and enlarges the display image J (image light K) reflected by the flat mirror 20. At the same time, the display image J (image light K) is deformed and emitted as display light L toward the windshield 2a.
  • the display light L emitted from the free-form curved mirror 30 passes through the transmission part 50b provided in the opening 50a of the housing 50 and travels toward the windshield 2a.
  • The display light L that reaches the windshield 2a and is reflected by it toward the occupant 3 is imaged on the retina of the occupant 3, so that the occupant 3 visually recognizes the virtual image V of the display image J, together with the outside scene, beyond the windshield 2a (in the positive direction of the Z-axis). The display image J (image light K), deformed in advance and displayed by the display 10, is deformed again as display light L by the curved surface of the free-form surface mirror 30 and further deformed by the curved surface of the windshield 2a, with the result that the occupant 3 visually recognizes it without distortion.
  • the free-form surface mirror 30 has a curved surface with a concave reflection surface, and this curved surface has a different curvature for each region.
  • Owing to the curvature of this curved surface, the free-form surface mirror 30 has a correction characteristic that deforms the image light K so as to reduce the amount of distortion of the virtual image V caused by the windshield 2a; this correction characteristic differs for each region and is set in consideration of the curved surface shape of the region of the windshield 2a onto which the virtual image V is projected.
  • The free-form surface mirror 30 is rotated about the rotation axis AX by power transmitted from the actuator 40 described later, and this rotation adjusts the projection position of the display light L on the windshield 2a. As the free-form surface mirror 30 rotates, the reflection region in which the image light K is deformed to generate the display light L changes; that is, the correction characteristic for deforming the image light K is changed by this rotation.
  • In other words, the free-form surface mirror 30 in the present embodiment corrects the distortion of the virtual image V caused by the curved surface of the windshield 2a, taking into account the curved surface shape of the region of the windshield 2a onto which the display light L is projected, and can project the virtual image V in an enlarged manner.
  • The actuator 40 includes a motor 41, a power transmission member 42 such as a gear that transmits the power of the motor 41 to the free-form surface mirror 30, and a support base (not shown) that supports the motor 41, the power transmission member 42, and the free-form surface mirror 30; it rotates the free-form surface mirror 30 around the rotation axis AX in accordance with an electric signal from the bus 500 (display control means 406).
  • The motor 41 generates the power for rotating the free-form surface mirror 30 around the rotation axis AX, and is, for example, a stepping motor.
  • FIG. 3 is a schematic cross-sectional view of the HUD device 100 in the YZ plane, and the display 10 and the like are omitted for easy understanding of the drawing.
  • FIG. 3A is a diagram showing a light path when the position of the eye of the occupant 3 is the position 3a.
  • The image light K incident from the plane mirror 20 is enlarged and deformed by the region 30a of the free-form surface mirror 30 and projected as display light La onto a predetermined position of the windshield 2a, and the virtual image Va is visually recognized by the occupant at the position 3a as overlapping the specific target W outside the host vehicle 2.
  • FIG. 3(b) is a diagram illustrating the light path when the eye position of the occupant 3 is a position 3c lower than the position 3a.
  • When the eye position of the occupant 3 is at the position 3c lower than the position 3a, the actuator 40 rotates the free-form surface mirror 30 counterclockwise (CCW: Counter Clock Wise) around the rotation axis AX based on a control signal from the display controller 400 described later.
  • Conversely, when the eye position of the occupant 3 is at a position 3b higher than the position 3a, the actuator 40 rotates the free-form surface mirror 30 clockwise (CW: Clock Wise) around the rotation axis AX based on a control signal from the display controller 400. The display light Lb emitted from the free-form surface mirror 30 is then projected to a high position on the windshield 2a, and the occupant 3b with a high viewpoint can visually recognize the virtual image Vb superimposed on the specific target W outside the host vehicle 2.
  • The above is the configuration of the HUD device 100 according to the present embodiment. The display light L emitted from the HUD device 100 is projected onto the windshield 2a of the host vehicle 2, so that, as shown in the figure, the virtual image V is visually recognized within a predetermined displayable area E on the windshield 2a.
  • the size and shape of the displayable area E are determined by the size and shape of the opening 50a of the housing 50.
  • the position at which the virtual image V is visually recognized can be moved in the vertical direction (Y-axis direction) within the displayable area E by driving the actuator 40 as described above.
  • a method for adjusting the position of the virtual image V and correcting the distortion of the virtual image V in the vehicle information projection system 1 of the present embodiment will be described below.
  • The viewpoint position detection unit (the trigger output means and imaging means described in the claims) 210 has a camera (imaging means) 210a that captures the face of the occupant 3 in the host vehicle 2; by analyzing the captured image data acquired by the camera 210a using a pattern matching method or the like, it detects the viewpoint position of the occupant 3 and outputs the viewpoint position data (trigger signal) of the occupant 3 to the display controller 400 described later.
  • This viewpoint position data is data indicating two-dimensional viewpoint coordinates (x, y) of the occupant 3 in the X-axis direction and the Y-axis direction.
  • The vehicle ECU 300 is an ECU that comprehensively controls the host vehicle 2; it determines the information image to be displayed by the HUD device 100 based on signals output from various sensors (not shown) mounted on the host vehicle 2, and outputs instruction data for that information image to the display controller 400 described later, whereby the display controller 400 causes the HUD device 100 to project a display image J (virtual image V) indicating the desired vehicle information.
  • The vehicle ECU 300 in the present embodiment is also connected to a navigation system (not shown); it reads map data in the vicinity of the current position from a storage unit based on position information from a GPS controller (not shown) and outputs information on the guidance route (navigation information) to the display controller 400, so that a guidance route image or the like is displayed by the HUD device 100, thereby performing route guidance to the destination set by the occupant 3.
  • the display controller 400 controls the operation of the display 10 and the actuator 40, and projects the display light L onto a predetermined position of the windshield 2a.
  • The display controller 400 is composed of a circuit including a CPU (Central Processing Unit), memory, and the like.
  • Specifically, the display controller 400 includes a bus interface unit 401, a memory control unit 402, a ROM 403, an image data generation unit 404, a RAM 405, and a display control unit 406, and exchanges signals with the HUD device 100, the viewpoint position detection unit 210, the operation input unit 220, and the vehicle ECU 300 via the bus 500, such as a CAN (Controller Area Network).
  • the bus interface unit 401 performs interface processing with respect to the bus 500. For example, the bus interface unit 401 transmits a request signal for requesting viewpoint position specifying data to the viewpoint position detecting unit 210 via the bus 500, and receives the viewpoint position specifying data from the viewpoint position detecting unit 210 via the bus 500. A process of receiving, a process of receiving an operation signal from the operation input unit 220, a process of receiving a vehicle information signal and a navigation information signal from the vehicle ECU 300, and the like are performed.
  • The memory control unit 402 reads from the ROM 403 the image data corresponding to the vehicle information signals and navigation information signals input via the bus interface unit 401 (from the vehicle ECU 300) and the image conversion table data corresponding to the viewpoint position specifying data input via the bus interface unit 401 (from the viewpoint position detection unit 210), outputs them to the image data generation unit 404, and stores the display image data generated by the image data generation unit 404 in the RAM 405.
  • The RAM 405 is composed of volatile memory such as VRAM (Video RAM), DRAM (Dynamic RAM), or SRAM (Static RAM) and/or rewritable nonvolatile memory such as flash memory, and has a plurality of frame buffers each capable of storing the display image data necessary for drawing one frame.
  • the memory control unit 402 further retrieves the display image data from the RAM 405 and outputs the display image data to the display control unit 406 upon receiving a command from the display control unit 406 described later.
  • The display control unit 406 stores the display image data read from the RAM 405 in a buffer memory (not shown) within the display control unit 406. Since the RAM 405 has a plurality of frame buffers, writing of display image data input from the image data generation unit 404 and reading of display image data to the display control means 406 can be performed in parallel. Writing to and reading from the frame buffers in the RAM 405 are switched at intervals counted by a timer (not shown), for example, every 20 msec.
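  • As a minimal sketch of the double-buffering scheme just described (the class and method names, and the way the 20 msec tick is handled, are illustrative assumptions rather than the patent's implementation), one buffer is written by the image-data-generation side while the other is read by the display-control side, and the roles are swapped on each timer tick:

```python
import numpy as np

class FrameBufferPair:
    """Two frame buffers: one being written, one being read, swapped periodically."""

    def __init__(self, height: int, width: int):
        self._buffers = [np.zeros((height, width), dtype=np.uint8) for _ in range(2)]
        self._write_index = 0  # the other buffer is the read side

    def write(self, frame: np.ndarray) -> None:
        # Image data generation side writes the next frame.
        self._buffers[self._write_index][...] = frame

    def read(self) -> np.ndarray:
        # Display control side reads the previously completed frame.
        return self._buffers[1 - self._write_index]

    def swap(self) -> None:
        """Called from a timer (e.g. every 20 msec) to exchange read/write roles."""
        self._write_index = 1 - self._write_index
```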
  • the image data generation unit 404 converts the image data read from the ROM 403 according to the image conversion table data read from the ROM 403.
  • The image conversion table data is data in which reference warping parameters are associated with reference viewpoint coordinates, each consisting of a reference viewpoint coordinate x and a reference viewpoint coordinate y, as shown in FIG. 5; it is data for predistorting the display image J to be displayed on the display 10.
  • The reference warping parameters are data for determining the display coordinates, on the display 10, of a plurality of reference grid points A in the display image J; the warping parameters of the present embodiment include data for determining the display coordinates of c reference grid points A1 to Ac on the display 10.
  • The display image J converted based on the image conversion table of FIG. 5 is displayed on the display surface of the display 10 as a predistorted image as shown, for example, in FIG. 6(a); as a result, the occupant 3 can visually recognize an undistorted virtual image V as shown in FIG. 6(b).
  • The image data generation unit 404 derives the display coordinates of pixels other than those associated with the reference grid points A by linear interpolation, polynomial interpolation, spline interpolation, or the like, based on a plurality of adjacent reference coordinate positions, as illustrated by the sketch below.
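  • As a concrete illustration of filling in the pixels between the reference grid points A1 to Ac (a sketch under an assumed data layout, not the patent's exact procedure), the dense per-pixel warp map can be obtained by bilinear interpolation of the sparse grid of reference grid-point coordinates:

```python
import numpy as np

def densify_warp(grid_pts: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Bilinearly interpolate a sparse (gh, gw, 2) grid of warped display
    coordinates (the reference grid points A) to a dense (out_h, out_w, 2) map."""
    gh, gw, _ = grid_pts.shape
    # Fractional position of every output pixel inside the reference grid.
    gy = np.linspace(0, gh - 1, out_h)
    gx = np.linspace(0, gw - 1, out_w)
    y0 = np.clip(np.floor(gy).astype(int), 0, gh - 2)
    x0 = np.clip(np.floor(gx).astype(int), 0, gw - 2)
    fy = (gy - y0)[:, None, None]
    fx = (gx - x0)[None, :, None]
    p00 = grid_pts[y0][:, x0]        # four surrounding grid points per pixel
    p01 = grid_pts[y0][:, x0 + 1]
    p10 = grid_pts[y0 + 1][:, x0]
    p11 = grid_pts[y0 + 1][:, x0 + 1]
    top = p00 * (1 - fx) + p01 * fx
    bot = p10 * (1 - fx) + p11 * fx
    return top * (1 - fy) + bot * fy
```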
  • The image data generation unit 404 reads an image conversion table (warping parameters) stored in advance in the ROM 403 in accordance with the viewpoint coordinates (x, y) of the occupant 3, and converts predetermined image data into display image data distorted in advance.
  • However, the ROM 403 does not store image conversion tables (warping parameters) for all possible viewpoint coordinates (x, y). Instead, the image data generation unit 404 can interpolate warping parameters suited to the actual viewpoint coordinates (x, y) by calculation, using the image conversion tables (reference viewpoint coordinates (reference triggers) and reference warping parameters) stored in advance in the ROM 403 (warping parameter interpolation processing).
  • the warping parameter interpolation method will be described below.
  • the image data generation unit 404 can interpolate the warping parameters (Pi, Qi) from the viewpoint coordinates (x, y) with a polynomial as shown in the following equation.
  • cij and dij are coefficients corresponding to the i-th reference grid point Ai in the warping parameter, and can be calculated by, for example, the method of least squares.
  • The ROM 403 stores preset coefficients cij and dij, but these may be made adjustable by operating the operation input unit 220 or the like. By making the coefficients adjustable in this way, the warping parameters can be calculated accurately while adapting to the influence of the shape and tolerances of the windshield 2a or the assembly tolerances of the HUD device 100.
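  • The polynomial itself is not reproduced in the text above, so the following is only a plausible sketch of the described approach: each grid-point coordinate Pi (and likewise Qi with the dij coefficients) is treated as a low-order polynomial in the viewpoint coordinates (x, y), whose coefficients cij are fitted to the stored reference data by least squares. The quadratic basis chosen here is an assumption for illustration.

```python
import numpy as np

def poly_basis(x: float, y: float) -> np.ndarray:
    # Assumed quadratic basis in the viewpoint coordinates (x, y).
    return np.array([1.0, x, y, x * x, x * y, y * y])

def fit_coeffs(ref_xy: np.ndarray, ref_p: np.ndarray) -> np.ndarray:
    """Least-squares fit of the coefficients c_ij for one grid point A_i.

    ref_xy: (n, 2) reference viewpoint coordinates; ref_p: (n,) reference P_i values.
    """
    design = np.stack([poly_basis(x, y) for x, y in ref_xy])  # (n, 6) design matrix
    coeffs, *_ = np.linalg.lstsq(design, ref_p, rcond=None)
    return coeffs

def interp_p(coeffs: np.ndarray, x: float, y: float) -> float:
    """Evaluate P_i for an arbitrary (non-reference) viewpoint (x, y)."""
    return float(poly_basis(x, y) @ coeffs)
```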
  • the image data generation unit 404 may interpolate the warping parameters (Pi, Qi) suitable for the viewpoint coordinates (x, y) by linear interpolation as in the following equations (4) and (5).
  • Here, the reference viewpoint coordinates 3a (xa, ya), 3b (xb, yb), and 3c (xc, yc) are selected from the reference viewpoint coordinates stored in advance in the ROM 403, based on the actual viewpoint coordinates 3 (x, y).
  • The coefficients u and v are then calculated using equations (6) and (7), by linear interpolation within the triangle formed by the reference viewpoint coordinates 3a, 3b, and 3c as shown in the figure.
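  • Equations (4) to (7) are not reproduced in this text, so the sketch below only illustrates the idea as described: barycentric weights u and v locate the actual viewpoint (x, y) inside the triangle of reference viewpoints 3a, 3b, 3c, and the three sets of reference warping parameters (Pi, Qi) are blended with those weights. The (c, 2) array layout is an assumption.

```python
import numpy as np

def barycentric_uv(p, a, b, c):
    """Weights (u, v) such that p = a + u * (b - a) + v * (c - a) inside triangle (a, b, c)."""
    m = np.column_stack((b - a, c - a))   # 2x2 matrix [b - a | c - a]
    u, v = np.linalg.solve(m, p - a)
    return u, v

def interp_warp_params(p, a, b, c, params_a, params_b, params_c):
    """Blend reference warping parameters (arrays of shape (c, 2), i.e. (Pi, Qi) per
    grid point) stored for viewpoints a, b, c to get parameters for viewpoint p."""
    u, v = barycentric_uv(np.asarray(p, float), np.asarray(a, float),
                          np.asarray(b, float), np.asarray(c, float))
    return (1.0 - u - v) * params_a + u * params_b + v * params_c
```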
  • the display control means 406 includes a driver that drives the display (display element, light source) 10 and the actuator 40, and performs display control of the display 10 and drive control of the actuator 40.
  • the display control means 406 reads the display image data generated by the image data generation unit 404 from the RAM 405, and displays the display image J that has been distorted in advance by controlling the display 10 based on the display image data. Further, the display control means 406 receives the viewpoint position specifying data output from the viewpoint position detecting unit 210, and drives the actuator 40 based on the viewpoint position specifying data, thereby displaying a virtual image displayed on the windshield 2a. Adjust the position of V.
  • the display control means 406 also adjusts the position of the virtual image V displayed on the windshield 2a by an operation signal from the operation input unit 220.
  • the “image correction process” in this embodiment will be described below based on the operation flow of FIG.
  • In the image correction process, the display controller 400 controls the display 10 and the actuator 40 of the HUD device 100 to adjust the position of the virtual image V visually recognized on the windshield 2a and to correct the distortion of the virtual image V caused by the curved surface shape of the windshield 2a.
  • In step S10, the display controller 400 first transmits a request signal requesting the viewpoint position specifying data (viewpoint coordinates), which indicate the viewpoint position of the occupant 3, to the viewpoint position detection unit 210 via the bus 500, and receives the viewpoint position specifying data (trigger signal) from the viewpoint position detection unit 210 via the bus 500.
  • In step S20, the display controller 400 determines, based on the received viewpoint position specifying data, whether it is necessary to adjust the position of the virtual image V (that is, whether the tilt angle θ of the free-form surface mirror 30 needs to be changed).
  • If adjustment is needed (YES in step S20), then in step S30 the image data generation unit 404 reads from the ROM 403 the reference warping parameters corresponding to the viewpoint position specifying data (viewpoint coordinates (x, y)) input from the viewpoint position detection unit 210, and generates a warping parameter suited to the actual viewpoint coordinates (x, y) by interpolation (warping parameter interpolation process).
  • In step S40, the display controller 400 adjusts the tilt angle θ of the free-form surface mirror 30 to the desired angle. If the display controller 400 determines in step S20 that the tilt angle θ of the free-form surface mirror 30 does not need to be changed (NO in step S20), it proceeds to step S50 without changing the warping parameters.
  • In step S50, the display controller 400 reads from the ROM 403 the image data based on the various vehicle information and navigation information input from the vehicle ECU 300, generates display image data for displaying the display image J distorted in advance using the warping parameters, and displays the predistorted display image J on the display surface of the display 10 based on the display image data (step S60).
  • the display image J (image light K) deformed and displayed in advance by the display 10 is deformed as display light L by the curved surface of the free-form curved mirror 30, and the display light L corresponds to the viewpoint position of the occupant 3. It is projected on a predetermined area of the windshield 2a, and is deformed by a curved surface of the predetermined area of the windshield 2a so that it can be visually recognized by the occupant 3 without distortion.
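  • Read as pseudocode, steps S10 to S60 above might be organised roughly as follows; every function and attribute name here is invented for illustration and does not come from the patent:

```python
def image_correction_process(controller):
    # S10: request and receive the viewpoint position specifying data (trigger signal).
    viewpoint = controller.request_viewpoint_position()

    # S20: does the virtual image position (mirror tilt angle) need adjusting?
    if controller.needs_mirror_adjustment(viewpoint):
        # S30: interpolate warping parameters for the actual viewpoint from the
        # reference viewpoints / reference warping parameters stored in ROM.
        warp = controller.interpolate_warping_parameters(viewpoint)
        # S40: drive the actuator to the desired tilt angle.
        controller.set_mirror_tilt(controller.tilt_for_viewpoint(viewpoint))
    else:
        warp = controller.current_warping_parameters  # unchanged

    # S50: build predistorted display image data from vehicle / navigation info.
    frame = controller.render_predistorted_image(warp)
    # S60: show the predistorted display image J on the display.
    controller.show(frame)
```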
  • As described above, in the vehicle information projection system 1 of the first embodiment, image conversion tables that associate reference viewpoint positions with reference warping parameters for converting the image data so as to display the predistorted display image J on the display 10 are stored in the ROM (storage unit) 403 in advance, and an image conversion table suited to the detected viewpoint position is interpolated using the reference viewpoint positions and the reference warping parameters. This makes it possible, while reducing the memory capacity of the ROM 403, to view an image with high display quality at a position adapted to the outside scene even when the viewpoint position changes.
  • FIG. 10 is a diagram explaining an image conversion table in which the tilt angle θ of the free-form surface mirror 30 (actuator 40) is associated with reference warping parameters, and FIG. 11 is a diagram explaining the operation flow of the image correction process in the second embodiment.
  • The image conversion table data in the second embodiment is data in which reference warping parameters are associated with reference tilt angles (reference drive amount information) θ, that is, data for predistorting the display image J displayed on the display 10 for each tilt angle θ of the free-form surface mirror 30 (actuator 40).
  • the image data generation unit 404 reads an image conversion table (warping parameter) stored in advance in the ROM 403 in accordance with the actual tilt angle ⁇ of the free-form curved mirror 30, and converts predetermined image data into display image data that has been distorted in advance. Convert.
  • The tilt angle θ of the free-form surface mirror 30 varies depending on the viewpoint position of the occupant 3 detected by the viewpoint position detection unit 210, as in the first embodiment, but the tilt angle θ of the free-form surface mirror 30 may also be changed based on an operation signal from the operation input unit 220.
  • The signal (trigger signal) indicating the tilt angle θ can be estimated from the number of steps commanded when the display control unit 406 performs step control of the actuator 40; however, the estimation method is not limited to this, and various known methods can be applied to estimate the tilt angle θ of the free-form surface mirror 30.
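  • A trivial sketch of the step-count-based estimate mentioned above (the step angle, gear ratio, and home angle are placeholder values, not taken from the patent):

```python
def estimate_tilt_angle(step_count: int,
                        deg_per_step: float = 0.9,   # placeholder stepping-motor step angle
                        gear_ratio: float = 50.0,    # placeholder reduction ratio to the mirror
                        theta_home_deg: float = 0.0) -> float:
    """Estimate the mirror tilt angle theta from the commanded number of motor steps."""
    return theta_home_deg + step_count * deg_per_step / gear_ratio
```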
  • the ROM 403 does not store in advance an image conversion table (warping parameter) corresponding to all tilt angles ⁇ .
  • the image data generation unit 404 can calculate and interpolate a warping parameter suitable for the tilt angle ⁇ using an image conversion table stored in advance in the ROM 403 (warping parameter interpolation process).
  • the warping parameter interpolation method will be described below.
  • the image data generation unit 404 can interpolate the warping parameters (Pi, Qi) from the actual tilt angle ⁇ using a polynomial such as the following equation.
  • the image data generation unit 404 may interpolate the warping parameters (Pi, Qi) suitable for the tilt angle ⁇ by linear interpolation such as the following equation.
  • Here, θa and θb are reference tilt angles selected based on the actual tilt angle θ, and equations (10) and (11) are used to calculate, by linear interpolation between the two reference tilt angles θa and θb, the image conversion table (warping parameters) A1 (P1, Q1) to Ac (Pc, Qc) corresponding to the actual tilt angle θ.
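  • Equations (10) and (11) are likewise not reproduced here; a minimal sketch of the described linear interpolation between the two neighbouring reference tilt angles θa and θb, under an assumed (c, 2) array layout for the warping parameters, might look like this:

```python
import numpy as np

def interp_by_tilt(theta, theta_a, theta_b, params_a, params_b):
    """Linearly interpolate warping parameters A1(P1, Q1) .. Ac(Pc, Qc), stored as
    (c, 2) arrays for the reference tilt angles theta_a and theta_b, to the
    actual tilt angle theta."""
    t = (theta - theta_a) / (theta_b - theta_a)   # 0 at theta_a, 1 at theta_b
    return (1.0 - t) * np.asarray(params_a) + t * np.asarray(params_b)
```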
  • the display control means 406 receives the viewpoint position specifying data output from the viewpoint position detecting unit 210, and drives the actuator 40 based on the viewpoint position specifying data, whereby the virtual image V displayed on the windshield 2a is displayed. Adjust the position.
  • The “image correction process” in the second embodiment will be described below based on the operation flow of FIG. 11.
  • In step S10a, the display controller 400 transmits a request signal requesting the viewpoint position specifying data (viewpoint coordinates), which indicate the viewpoint position of the occupant 3, to the viewpoint position detection unit 210 via the bus 500, and receives the viewpoint position specifying data from the viewpoint position detection unit 210 via the bus 500; alternatively, an operation signal is received from the operation input unit 220.
  • In step S20a, the display controller 400 determines, based on the received viewpoint position specifying data (or operation signal), whether or not the position of the virtual image V needs to be adjusted (that is, whether the tilt angle θ of the free-form surface mirror 30 needs to be changed).
  • If adjustment is needed (YES in step S20a), then in step S30a the image data generation unit 404 calculates, based on the viewpoint coordinates (or operation signal), the tilt angle θ to which the free-form surface mirror 30 is to be rotated, reads from the ROM 403 the reference warping parameters corresponding to that tilt angle θ, and generates a warping parameter suited to the calculated tilt angle θ by interpolation based on the calculated tilt angle θ and the reference warping parameters stored in advance in the ROM 403 (warping parameter interpolation process).
  • In step S40a, the display controller 400 adjusts the tilt angle θ of the free-form surface mirror 30 to the calculated desired angle. If the display controller 400 determines in step S20a that the tilt angle θ of the free-form surface mirror 30 does not need to be changed (NO in step S20a), it proceeds to step S50a without changing the warping parameters.
  • In step S50a, the display controller 400 reads from the RAM 405 the image data based on the various vehicle information and navigation information input from the vehicle ECU 300, generates display image data for displaying the display image J distorted in advance using the warping parameters, and displays the predistorted display image J on the display surface of the display 10 based on the display image data (step S60a).
  • the display image J (image light K) deformed and displayed in advance by the display 10 is deformed as display light L by the curved surface of the free-form curved mirror 30, and the display light L corresponds to the viewpoint position of the occupant 3. It is projected on a predetermined area of the windshield 2a, and is deformed by a curved surface of the predetermined area of the windshield 2a so that it can be visually recognized by the occupant 3 without distortion.
  • As described above, in the second embodiment, image conversion tables that associate reference tilt angles θ with reference warping parameters for converting the image data so as to display the predistorted display image J on the display 10 are stored in the ROM (storage unit) 403 in advance, and an image conversion table suited to the detected viewpoint position (tilt angle θ) is interpolated using the reference warping parameters. This makes it possible, while reducing the memory capacity of the ROM 403, to view an image with high display quality at a position adapted to the outside scene even when the viewpoint position changes.
  • In the above embodiments, the position at which the virtual image V is projected is adjusted by rotating the concave mirror (correcting optical system) 30, which is a reflective optical member; however, the position at which the virtual image V is projected may also be adjusted by rotating and/or moving a refractive correcting optical system, such as a lens, that deforms the display image J (image light K) emitted from the HUD device 100.
  • In that case, an image conversion table in which reference parameters relating to the rotation (movement) of the correcting optical system are associated with reference warping parameters is stored in the ROM 403 in advance, and the image data generation unit 404 interpolates the warping parameters according to the actual rotation (movement) parameters of the correcting optical system. With such a configuration as well, as in the above embodiments, a display image J with a distortion suited to the specific correspondence between the image correction property of the correcting optical system and the curved surface property of the windshield (projection target) 2a can be displayed on the display 10, so that the distortion of the virtual image V can be corrected with high accuracy.
  • It is desirable that the display controller 400 stop the display of the display 10 while the free-form surface mirror 30 is being driven (or reduce its visibility to the extent that the occupant 3 cannot visually recognize it). With such a configuration, the virtual image V can be prevented from being visually recognized by the occupant 3 in a distorted state while its display position is continuously moving. In addition, even after the free-form surface mirror 30 reaches the target position (angle), the display on the display 10 may be kept stopped until the image correction by the warping parameters is completed.
  • the transmission / reflection curved surface is not limited to the windshield 2a of the host vehicle 2, and may be a combiner constituted by a plate-shaped half mirror, a hologram element, or the like.
  • the vehicle information projection system of the present invention can be applied to, for example, a head-up display device that projects an image on a windshield of a vehicle and displays a virtual image.
  • Reference signs: 1 vehicle information projection system; 2a windshield (projection target); 100 HUD device (projection device); 10 display; 20 plane mirror; 30 free-form surface mirror (optical member); 40 actuator; 210 viewpoint position detection unit (trigger output means); 220 operation input unit (trigger output means); 400 display controller (control means); E displayable area; J display image; K image light; L display light; V virtual image

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Geometry (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Instrument Panels (AREA)

Abstract

The present invention makes it possible, with reduced memory usage, to make an image having high display quality visible at a position adapted to the outside scene even if the viewpoint position changes. Image conversion tables that associate reference viewpoint positions with reference warping parameters for converting image data, so as to display a predistorted display image (J) on a display (10), are stored in a ROM (storage unit) (403) in advance, and said reference viewpoint positions and reference warping parameters are used to interpolate an image conversion table suited to a detected viewpoint position.
PCT/JP2014/077562 2013-10-31 2014-10-16 Système de projection et dispositif de projection d'informations de véhicule WO2015064371A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013227139A JP2015087619A (ja) 2013-10-31 2013-10-31 車両情報投影システム及び投影装置
JP2013-227139 2013-10-31

Publications (1)

Publication Number Publication Date
WO2015064371A1 true WO2015064371A1 (fr) 2015-05-07

Family

ID=53003980

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/077562 WO2015064371A1 (fr) 2013-10-31 2014-10-16 Système de projection et dispositif de projection d'informations de véhicule

Country Status (2)

Country Link
JP (1) JP2015087619A (fr)
WO (1) WO2015064371A1 (fr)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6390032B2 (ja) * 2015-09-02 2018-09-19 カルソニックカンセイ株式会社 ヘッドアップディスプレイの歪補正方法とそれを用いたヘッドアップディスプレイの歪補正装置
JPWO2017056953A1 (ja) * 2015-10-02 2018-07-26 株式会社リコー 表示装置
WO2017061026A1 (fr) * 2015-10-09 2017-04-13 日立マクセル株式会社 Dispositif d'affichage d'image
JP6726674B2 (ja) * 2015-10-15 2020-07-22 マクセル株式会社 情報表示装置
KR101855940B1 (ko) 2015-10-27 2018-05-09 엘지전자 주식회사 차량용 증강현실 제공 장치 및 그 제어방법
WO2017141896A1 (fr) * 2016-02-19 2017-08-24 日本精機株式会社 Dispositif d'affichage tête haute
WO2017168953A1 (fr) * 2016-04-01 2017-10-05 株式会社デンソー Dispositif de véhicule, programme de véhicule et programme de conception de filtre
JP6493361B2 (ja) 2016-04-01 2019-04-03 株式会社デンソー 車両用装置、車両用プログラム、フィルタ設計プログラム
FR3049724B1 (fr) * 2016-04-05 2018-05-25 Valeo Comfort And Driving Assistance Systeme d'affichage tete-haute
EP3477291B1 (fr) 2016-06-23 2024-02-28 NGK Insulators, Ltd. Capteur de gaz et procédé de mesure des concentrations d'une pluralité de composants cibles dans un gaz devant être mesuré
JP2018077400A (ja) * 2016-11-10 2018-05-17 日本精機株式会社 ヘッドアップディスプレイ
FI129214B (en) * 2017-11-28 2021-09-30 Dispelix Oy Head-up screen
WO2019181926A1 (fr) * 2018-03-20 2019-09-26 日本精機株式会社 Dispositif d'affichage tête haute
CN113302661B (zh) * 2019-01-10 2024-06-14 三菱电机株式会社 信息显示控制装置及方法、以及记录介质
JP7467883B2 (ja) * 2019-04-29 2024-04-16 セイコーエプソン株式会社 回路装置、電子機器及び移動体
CN111861865B (zh) 2019-04-29 2023-06-06 精工爱普生株式会社 电路装置、电子设备以及移动体
JP7221161B2 (ja) * 2019-07-10 2023-02-13 マクセル株式会社 ヘッドアップディスプレイ及びそのキャリブレーション方法
JPWO2021124916A1 (fr) 2019-12-18 2021-06-24
US20230008648A1 (en) 2019-12-25 2023-01-12 Nippon Seiki Co., Ltd. Display control device and head-up display device
JPWO2021153616A1 (fr) 2020-01-31 2021-08-05
JP7456290B2 (ja) * 2020-05-28 2024-03-27 日本精機株式会社 ヘッドアップディスプレイ装置
JP7559606B2 (ja) 2021-02-24 2024-10-02 日本精機株式会社 表示制御装置、表示装置、及び画像表示制御方法
WO2022230995A1 (fr) * 2021-04-30 2022-11-03 日本精機株式会社 Dispositif de commande d'affichage, dispositif d'affichage tête haute et procédé de commande d'affichage
JP7564152B2 (ja) * 2022-06-14 2024-10-08 矢崎総業株式会社 車両用表示装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003090761A (ja) * 2001-07-12 2003-03-28 Minolta Co Ltd 分光特性測定装置および同装置の分光感度の波長シフト補正方法
JP2003287707A (ja) * 2002-03-27 2003-10-10 Denso Corp 画像変換方法、画像処理装置、ヘッドアップディスプレイ、プログラム
JP2008163889A (ja) * 2006-12-28 2008-07-17 Hitachi Ltd 内燃機関の燃料噴射量制御装置
JP2009150947A (ja) * 2007-12-19 2009-07-09 Hitachi Ltd 車両用ヘッドアップディスプレイ装置
JP2009246505A (ja) * 2008-03-28 2009-10-22 Toshiba Corp 画像表示装置及び画像表示装置
JP2013502148A (ja) * 2009-08-11 2013-01-17 エスアールエス・ラブス・インコーポレーテッド スピーカーの知覚されるラウドネスを増加させるためのシステム

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110001400A (zh) * 2017-12-06 2019-07-12 矢崎总业株式会社 车辆用显示装置
CN111971197A (zh) * 2018-03-30 2020-11-20 日本精机株式会社 显示控制装置、平视显示设备
JP2023029340A (ja) * 2018-03-30 2023-03-03 パナソニックIpマネジメント株式会社 映像表示システム、映像表示方法、及び、プログラム
CN111971197B (zh) * 2018-03-30 2023-11-14 日本精机株式会社 显示控制装置、平视显示设备
JP7515229B2 (ja) 2018-03-30 2024-07-12 パナソニックオートモーティブシステムズ株式会社 映像表示システム、映像表示方法、及び、プログラム
WO2019244326A1 (fr) * 2018-06-22 2019-12-26 三菱電機株式会社 Dispositif d'affichage vidéo
JPWO2019244326A1 (ja) * 2018-06-22 2021-01-07 三菱電機株式会社 映像表示装置
EP4265460A4 (fr) * 2020-12-16 2024-07-17 Sony Group Corp Appareil de commande d'affichage, procédé de commande d'affichage, support d'enregistrement, et système d'affichage

Also Published As

Publication number Publication date
JP2015087619A (ja) 2015-05-07

Similar Documents

Publication Publication Date Title
WO2015064371A1 (fr) Système de projection et dispositif de projection d'informations de véhicule
JP6444513B2 (ja) 車両、ヘッドアップディスプレイシステム及びその投影画像の高さの調節方法
US20090243963A1 (en) Image display apparatus and method for displaying an image
JP6650584B2 (ja) ヘッドアップディスプレイおよびヘッドアップディスプレイを搭載した移動体
US20180143431A1 (en) Head-up display
US20170169612A1 (en) Augmented reality alignment system and method
US10775621B2 (en) Method, device and computer-readable storage medium with instructions for setting a head-up display in a transportation vehicle
US6877864B1 (en) Projector and method of correcting image distortion
JP6724885B2 (ja) 虚像表示装置
JP2016101805A (ja) 表示装置、制御方法、プログラム、及び記憶媒体
JP2014199385A (ja) 表示装置及びその表示方法
JP6911868B2 (ja) ヘッドアップディスプレイ装置
WO2004086135A1 (fr) Afficheur video
JP6724886B2 (ja) 虚像表示装置
JP2014026244A (ja) 表示装置
US20190258057A1 (en) Head-up display
WO2019207965A1 (fr) Dispositif d'affichage tête haute
JP7221161B2 (ja) ヘッドアップディスプレイ及びそのキャリブレーション方法
WO2014049787A1 (fr) Dispositif d'affichage, procédé d'affichage, programme et support d'enregistrement
JP2018132685A (ja) ヘッドアップディスプレイ装置
JP6845988B2 (ja) ヘッドアップディスプレイ
JP7062038B2 (ja) 虚像表示装置
JP2005099680A (ja) 映像表示システム
WO2018105585A1 (fr) Dispositif d'affichage tête haute
JP2019026198A (ja) ヘッドアップディスプレイ装置とそのためのドライバ視点検出方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14857017

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14857017

Country of ref document: EP

Kind code of ref document: A1