WO2020016994A1 - Image distortion inspection device, image distortion inspection method, and program - Google Patents

Image distortion inspection device, image distortion inspection method, and program

Info

Publication number
WO2020016994A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
distortion
light beam
inspection
light
Prior art date
Application number
PCT/JP2018/027124
Other languages
French (fr)
Japanese (ja)
Inventor
長谷川 雄史
塚原 整
雅浩 虻川
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2018/027124
Publication of WO2020016994A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/346 - Image reproducers using prisms or semi-transparent mirrors
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/363 - Image reproducers using image projection screens
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 - Diagnosis, testing or measuring for television systems or their details
    • H04N17/04 - Diagnosis, testing or measuring for television systems or their details for receivers
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/74 - Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present invention relates to an image distortion inspection device, an image distortion inspection method, and a program.
  • in recent years, vehicles incorporating a head-up display (HUD) as an image display device have been sold on the general market. A HUD projects a virtual image in front of the driver and reduces the movement of the driver's line of sight, so that information can be displayed without burdening the driver.
  • a virtual image is displayed by projecting an image displayed on a HUD light source onto a concave half mirror called a combiner. Since light is projected from the HUD light source to the combiner through a plurality of lenses, an image displayed by the HUD light source is generally displayed as a virtual image as a distorted image.
  • an image distortion value which is a degree of distortion of a virtual image is inspected in advance, and a distortion correction value is calculated from the image distortion value.
  • a virtual image without distortion is displayed.
  • image distortion values are generally obtained either by calculating them from the optical design values of the HUD, or by capturing, with a camera, a distortion inspection image projected onto the combiner and calculating the degree of distortion of the captured image. Since optical members such as the lenses constituting the HUD are often manufactured with optical characteristics that deviate from the optical design values, the latter method, which calculates the image distortion value from a camera image of the distortion inspection image, is preferable.
  • Patent Literature 1 describes a technique in which a camera installed inside a vehicle is moved, a distortion inspection image is captured at several representative driver viewpoint positions, and an image distortion value and a distortion correction value are calculated.
  • an object of one or more aspects of the present invention is to enable an imaging unit outside a vehicle to capture an image and inspect the degree of distortion of the image.
  • the image distortion inspection device according to one aspect includes: an imaging unit that captures, from outside a vehicle, an image formed by light output from an image display device installed inside the vehicle in order to project a virtual image based on an original image, in such a manner that a light ray position, which is the position where a light ray is input, and a light ray direction, which is the direction from which the light ray is input, can be identified; a detection unit that detects, from the captured image, the light ray position and the light ray direction of a light ray indicating an inspection point included in the original image; a display position specifying unit that specifies, from the detected light ray position and the detected light ray direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and a distortion calculation unit that calculates, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a distortion degree indicating the degree of distortion of the original image in the virtual image.
  • the image distortion inspection method according to one aspect captures, from outside a vehicle, an image formed by light output from an image display device installed inside the vehicle in order to project a virtual image based on an original image, in such a manner that the light ray position and the light ray direction can be identified; detects, from the captured image, the light ray position and the light ray direction of a light ray indicating an inspection point included in the original image; specifies, from the detected light ray position and the detected light ray direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and calculates, from the pixel position of the inspection point in the original image and the specified virtual image display position, a distortion degree indicating the degree of distortion of the original image in the virtual image.
  • the program according to one aspect causes a computer to function as: a detection unit that detects, from an image captured from outside a vehicle of the light output from an image display device installed inside the vehicle in order to project a virtual image based on an original image, the light ray position, which is the position where a light ray is input, and the light ray direction, which is the direction from which the light ray is input, of a light ray indicating an inspection point included in the original image; a display position specifying unit that specifies, from the detected light ray position and the detected light ray direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and a distortion calculation unit that calculates, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a distortion degree indicating the degree of distortion of the original image in the virtual image.
  • an image can be captured by an imaging unit outside the vehicle, and the degree of distortion of the image can be inspected.
  • FIG. 1 is a schematic diagram showing a usage state of an image distortion inspection system including the image distortion inspection device according to Embodiments 1 to 3.
  • FIGS. 2(a) and 2(b) are schematic diagrams for explaining an image distortion value. FIG. 3 is a schematic diagram showing the relationship between a virtual image and a virtual image projection image. FIG. 4 is a schematic diagram for explaining image distortion correction processing.
  • FIGS. 5(a) and 5(b) are schematic diagrams for explaining a distortion correction value.
  • FIG. 6 is a block diagram illustrating a hardware configuration example of the image distortion inspection device and the HUD according to Embodiments 1 and 2. FIG. 7 is a schematic diagram for explaining a light ray space image.
  • FIGS. 8(a) and 8(b) are schematic diagrams for explaining an example of a light ray space camera using a microlens array.
  • FIG. 10 is a block diagram schematically showing a functional configuration of the image distortion inspection device according to Embodiment 1. FIG. 11 is a schematic diagram showing an example of the optical model of the HUD. FIG. 12 is a schematic diagram showing a first example of the distortion inspection image.
  • FIG. 13 is a schematic diagram illustrating the relationship between the microlens array and the imaging surface. FIG. 14 is a schematic diagram for explaining the relationship between a microlens and the imaging surface. FIG. 15 is a schematic diagram for explaining a virtual image display position.
  • FIG. 21 is a block diagram schematically showing a functional configuration of the image distortion inspection device according to Embodiment 2.
  • FIG. 25 is a block diagram illustrating a hardware configuration example of the image distortion inspection device according to Embodiment 3.
  • FIG. 26 is a block diagram schematically showing a functional configuration of the image distortion inspection device according to Embodiment 3.
  • FIG. 27 is a flowchart illustrating an inspection method in the image distortion inspection device according to Embodiment 3.
  • FIG. 1 is a schematic diagram showing a usage state of an image distortion inspection system 100 including an image distortion inspection device 110 according to the first embodiment.
  • the image distortion inspection device 110 is arranged outside the vehicle 101, and inspects the degree of image distortion of an image to be displayed using an image display device (not shown) installed inside the vehicle 101.
  • here, a HUD is used as the image display device, and the image display device is therefore referred to below as the HUD.
  • the light projected from the HUD light source 102 of the HUD is divided into reflected light reflected by the combiner 103 and a small amount of transmitted light transmitted through the combiner 103.
  • the combiner 103 functions as a reflector that reflects part of the light projected from the HUD light source 102.
  • the transmitted light passes through the combiner 103 from the HUD light source 102 inside the vehicle 101, travels outside the vehicle 101, and reaches the imaging unit 130 of the image distortion inspection device 110.
  • the reflected light reaches the driver's viewpoint position, and the driver visually recognizes the virtual image 171 displayed at the virtual image display position in front of the vehicle 101 by the reflected light. Since the light from the HUD light source 102 to the combiner 103 is projected through a plurality of lenses, the direction of the light beam (traveling direction) changes depending on the lens.
  • Optical members such as lenses constituting the HUD are often manufactured as optical members having optical characteristics deviating from optical design values. Therefore, the direction of the light beam from the HUD light source 102 changes in a direction different from the assumed optical design value. As a result, both the transmitted light and the reflected light of the combiner 103 travel in directions different from the optical design values, and the driver who observes the reflected light visually recognizes the distorted image as the virtual image 171.
  • the image distortion inspection device 110 captures an image of the transmitted light of the combiner 103 with the imaging unit 130, and detects the light ray position where each light ray of the transmitted light is input and the light ray direction from which it is input, thereby calculating an image distortion value indicating the degree of HUD image distortion.
  • FIGS. 2A and 2B are schematic diagrams for explaining image distortion values.
  • a display image 170 shown in FIG. 2A is an image displayed on a liquid crystal panel (not shown) of the HUD light source 102.
  • the display resolution of the display image 170 is determined by the liquid crystal panel. As an example, when using a Full-HD (High Definition) liquid crystal panel, the horizontal resolution is 1920 pixels and the vertical resolution is 1080 pixels.
  • a virtual image projection image 172 shown in FIG. 2(a) is an image generated by projecting the virtual image 171 (see FIG. 1) in three-dimensional space, displayed at the virtual image display position, onto an arbitrary two-dimensional plane space.
  • the virtual image projection image 172 is equivalent to an image obtained when a virtual image in a three-dimensional space is captured by a virtual camera.
  • in the following, the forward traveling direction of the vehicle 101 is taken as the Z axis, the height direction as the Y axis, and the lateral direction as the X axis.
  • the arbitrary two-dimensional plane space onto which the virtual image is projected is a plane space (X axis-Y axis) perpendicular to the Z axis.
  • the horizontal direction of the liquid crystal panel is defined as the U axis, and the vertical direction as the V axis.
  • FIG. 3 is a schematic diagram showing the relationship between the virtual image 171 and the virtual image projection image 172.
  • the arbitrary position shown in FIG. 3 is a position translated in the Z-axis direction, toward the driver's seat, from the point 171a formed at the virtual image display position by the center pixel of the input image input to the HUD. This arbitrary position may be another position, but is preferably near the driver's viewpoint.
  • the virtual image projection image 172 is obtained by assuming a plane perpendicular to the Z-axis direction (an X axis-Y axis plane) between the virtual image display position and the arbitrary position, and calculating the position at which a straight line from the virtual image display position to the arbitrary position intersects that plane. If there is no HUD image distortion, the virtual image projection image 172 matches the display image 170 (see FIG. 2(a)).
  • the image distortion value indicates a pixel of the virtual image projection image 172 corresponding to each pixel of the display image 170.
  • the image distortion value is a matrix representing the positional relationship between each pixel of the display image 170 and each pixel of the virtual image projection image 172. For example, let the pixel at the upper left of the display image 170 be pixel (1,1), and let the image distortion value corresponding to pixel (1,1) be (U1-1, V1-1). Similarly, let the image distortion value corresponding to pixel (1,2) of the display image 170 be (U1-2, V1-2).
  • U1-1 represents the pixel position, in the U-axis direction on the virtual image projection image, corresponding to the virtual image display position formed by pixel (1,1) of the display image 170, and V1-1 represents the pixel position in the V-axis direction.
  • for a Full-HD panel, the image distortion values are 1920 × 1080 matrices: two matrices representing the U-axis values and the V-axis values, respectively.
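  • as a concrete illustration, the image distortion value can be held as two arrays of the panel resolution. The following Python sketch shows only this data layout; the identity initialization (no distortion) is an illustrative assumption, not a value from the patent:

    import numpy as np

    # Two matrices of the panel resolution (1080 rows x 1920 columns):
    # for display-image pixel (row, col), U[row, col] and V[row, col]
    # hold the U-axis and V-axis positions of the corresponding point
    # on the virtual image projection image.
    H, W = 1080, 1920
    row_idx, col_idx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    U = col_idx.astype(np.float32)  # no distortion: each pixel maps to itself
    V = row_idx.astype(np.float32)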
  • the HUD calculates a distortion correction value from the image distortion value calculated by the image distortion inspection device 110, and performs image distortion correction processing by deforming the display image 170 displayed on the HUD light source 102 in accordance with the distortion correction value.
  • FIG. 4 is a schematic diagram for explaining the image distortion correction processing.
  • An input image 173 shown in FIG. 4 is an image of a content to be displayed via the combiner 103.
  • the HUD deforms the input image 173 with reference to the distortion correction value to generate a distortion corrected image 174.
  • the HUD displays the generated distortion corrected image 174 on the liquid crystal panel of the HUD light source 102 and projects it onto the combiner 103. Even though the display image on the liquid crystal panel is distorted by the optical members, the distortion and the image deformation cancel each other out, and an image 175 without distortion is displayed as the virtual image.
  • FIGS. 5(a) and 5(b) are schematic diagrams for explaining the distortion correction value.
  • the distortion correction value is a matrix representing the positional relationship between each pixel of the distortion corrected image 174 and each pixel of the input image 173: for each pixel of the distortion corrected image 174, it indicates the corresponding pixel of the input image 173.
  • as shown in FIG. 5(a), let the pixel at the upper left of the distortion corrected image 174 be pixel (1,1); the distortion correction value corresponding to pixel (1,1) is (HU1-1, HV1-1). Similarly, let the distortion correction value corresponding to pixel (1,2) of the distortion corrected image 174 be (HU1-2, HV1-2).
  • the HUD generates the distortion corrected image 174 by substituting the pixel value at pixel position (HU1-1, HV1-1) of the input image 173 for the pixel value at pixel position (1,1) of the distortion corrected image 174, and so on for every pixel (see the sketch below).
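  • a minimal Python sketch of this correction step is shown below, assuming the correction values HU and HV are stored as arrays of the corrected image's resolution and that nearest-neighbour sampling suffices (the patent does not specify the interpolation):

    import numpy as np

    def apply_distortion_correction(input_image, HU, HV):
        # For each pixel (i, j) of the corrected image, copy the pixel value
        # at position (HU[i, j], HV[i, j]) of the input image, as described
        # above; coordinates are rounded and clipped to the image bounds.
        u = np.clip(np.rint(HU).astype(int), 0, input_image.shape[1] - 1)
        v = np.clip(np.rint(HV).astype(int), 0, input_image.shape[0] - 1)
        return input_image[v, u]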
  • FIG. 6 is a block diagram illustrating a hardware configuration example of the image distortion inspection device 110 and the HUD.
  • the image distortion inspection apparatus 110 includes an image processing device 111, an imaging device 117, and a wireless communication device 118.
  • the image processing device 111 is a device that calculates an image distortion value from an image captured by the imaging device 117.
  • the image processing device 111 comprises a CPU (Central Processing Unit) 112 functioning as a processor, a RAM (Random Access Memory) 113 and a ROM (Read Only Memory) 114 functioning as memories, a sensor IO (Input-Output) 115 that is an interface for connecting the imaging device 117, and a wireless communication IO 116 that is an interface for connecting the wireless communication device 118.
  • the functions executed by the image processing device 111 may be realized by the CPU 112 reading a program stored in the ROM 114 into the RAM 113 and executing the read program. Such a program may be provided through a network, or may be provided by being recorded on a recording medium. Such a program may be provided as a program product, for example.
  • the image processing device 111 may be provided as a computer. As the image processing device 111, a general computer may be used, or a device in which the above devices are integrated into an embedded device or an FPGA board may be used.
  • the imaging device 117 is a light space camera capable of acquiring a light space image.
  • the light ray space image is an image showing the positions and directions of the light rays existing in an arbitrary three-dimensional space. As shown in FIG. 7, if the position and direction of each light ray reaching a two-dimensional plane can be obtained, the light rays in the three-dimensional space can be calculated from them.
  • a normal camera records only the position at which a light ray arrives and cannot record its direction; a light ray space camera can record the direction of a light ray as well.
  • FIGS. 8A and 8B are schematic diagrams for explaining an example of a light beam spatial camera using a microlens-array.
  • FIG. 8A shows an optical model of a normal camera
  • FIG. 8B shows an optical model of a ray space camera using a microlens-array.
  • as shown in FIG. 8(a), in a normal camera, light from a subject is focused on an imaging surface 117b by a lens 117a to capture an image of the subject.
  • for example, in FIG. 8(a), light from the subject 1 is condensed on the imaging surface 117b by the lens 117a, and an image in which the subject 1 is in focus is obtained, whereas the light from the subject 2 is not converged on the imaging surface 117b and produces an out-of-focus image.
  • a microlens-array in which a plurality of microlenses 117c are arranged is provided between a lens 117a and an imaging surface 117b.
  • the imaging surface 117b is arranged at a position where the light beam transmitted through the micro lens 117c is focused.
  • the lens is a device that converts the direction of a light beam into the position of an image
  • the direction of the light beam that has reached the microlens 117c can be calculated from the position of the light beam that has reached the imaging surface 117b.
  • FIG. 8B shows an example in which a microlens-array is added between the lens 117a and the imaging surface 117b.
  • a light ray space camera can also be configured by arranging a plurality of cameras 117d vertically and horizontally.
  • the wireless communication device 118 is a communication device for performing wireless communication with the HUD 150, and functions as a communication unit for performing communication with the HUD 150.
  • for example, a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark) can be used for the wireless communication.
  • the HUD 150 includes an image processing device 151, an image display device 157, and a wireless communication device 158.
  • the image processing device 151 is a device that outputs a display image to the image display device 157.
  • the image processing device 151 comprises a CPU 152 functioning as a processor, a RAM 153 and a ROM 154 functioning as memories, a display IO 155 that is an interface for connecting the image display device 157, and a wireless communication IO 156 that is an interface for connecting the wireless communication device 158.
  • the image processing device 151 deforms the input image with reference to the distortion correction value calculated from the image distortion value inspected by the image distortion inspection device 110, and outputs the deformed input image to the image display device 157 as a distortion corrected image.
  • as the image processing device 151, a general computer may be used, or a device in which the above components are integrated into an embedded device or an FPGA board may be used.
  • the image display device 157 is an image projection device including the HUD light source 102 and the combiner 103.
  • the image display device 157 projects the distortion corrected image acquired from the image processing device 151 from the HUD light source 102 to the combiner 103 to display a virtual image.
  • the wireless communication device 158 is a communication device that performs wireless communication with the image distortion inspection device 110, and functions as a communication unit that communicates with the image distortion inspection device 110.
  • wireless communication is used for communication between the image distortion inspection device 110 and the HUD 150.
  • wired communication may be performed using a wired cable such as a LAN cable or a CAN cable.
  • the image distortion inspection apparatus 110 and the HUD 150 may each include a wired communication device for performing wired communication.
  • alternatively, the image distortion value inspected by the image distortion inspection device 110 may be stored on an arbitrary recording medium and the recording medium read by the HUD 150, so that a hardware configuration without a wireless communication device or a wired communication device may be adopted.
  • the image distortion inspection apparatus 110 may include a writing device for writing data on a recording medium
  • the HUD 150 may include a reading device for reading data from the recording medium.
  • FIG. 10 is a block diagram schematically showing a functional configuration of the image distortion inspection device 110.
  • the image distortion inspection device 110 includes an imaging unit 130, a light ray detection unit 131, a display position identification unit 132, a display position correspondence information storage unit 133, and an image distortion value calculation unit 134.
  • the hardware that implements the function of the imaging unit 130 is the imaging device 117 in FIG.
  • the hardware that realizes the functions of the light ray detection unit 131, the display position identification unit 132, the display position correspondence information storage unit 133, and the image distortion value calculation unit 134 is the image processing device 111 in FIG.
  • the imaging unit 130 captures, from outside the vehicle 101, an image formed by the light output from the HUD 150 inside the vehicle 101 in order to project a virtual image, in such a manner that the light ray position and the light ray direction can be identified. Specifically, when the HUD light source 102 displays a distortion inspection image as the original image, the imaging unit 130 captures a light ray space image of the light output from the HUD light source 102 and transmitted through the combiner 103.
  • FIG. 11 is a schematic diagram illustrating an example of an optical model of the HUD 150 that is the image display device 157.
  • the light output from the HUD light source 102 reaches the combiner 103 and is split into reflected light and transmitted light.
  • the light beam of the transmitted light is input to the imaging unit 130.
  • FIG. 12 is a schematic diagram illustrating an example of a distortion inspection image.
  • the distortion inspection image 176 shown in FIG. 12 is an image in which vertical and horizontal white lines are superimposed on a black background.
  • the degree of distortion of the image is inspected from the degree of distortion of the white line.
  • the pixels constituting the white line are used as inspection points, and the degree of image distortion is inspected using the inspection points.
  • the inspection point may be a pixel where white lines cross each other.
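  • for illustration, a hypothetical Python helper for locating such crossing pixels in an ideal inspection image (one-pixel-wide white grid lines on a black background; the threshold value is an assumption) might look as follows:

    import numpy as np

    def find_crossing_points(gray, thresh=128):
        # At a crossing of a horizontal and a vertical line, the left/right
        # neighbours AND the up/down neighbours of a white pixel are all
        # white; on a plain line segment only one of the two pairs is white.
        w = gray >= thresh
        c = np.zeros_like(w)
        c[1:-1, 1:-1] = (w[1:-1, 1:-1]
                         & w[1:-1, :-2] & w[1:-1, 2:]   # left and right
                         & w[:-2, 1:-1] & w[2:, 1:-1])  # up and down
        return np.argwhere(c)  # (row, col) pairs = inspection points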
  • the light beam detection unit 131 is a detection unit that detects the light beam position and the light beam direction of the light beam indicating the inspection point from the image captured by the imaging unit 130.
  • the following describes how the light ray position and the light ray direction are detected when a light ray space camera using a microlens array is used as the imaging device 117.
  • FIG. 13 is a schematic diagram showing the relationship between the microlens-array and the imaging surface.
  • the light beam that has reached the microlens 117c from the lens 117a is focused on the imaging surface 117b.
  • FIG. 14 is a schematic diagram for explaining the relationship between the micro lens 117c and the imaging surface 117b.
  • one microlens 117c#1 included in the microlens array is assigned a pixel area 117b#1 consisting of a plurality of pixels on the imaging surface 117b.
  • the pixel region 117b # 1 includes pixels in a range where light that has reached the microlenses 117c # 1 from the lens 117a is focused on the imaging surface 117b.
  • the ray position is detected from the position of the microlens-array that the ray has reached.
  • the method of detecting the light ray position is thus almost the same as with a normal camera, with one microlens corresponding to one pixel on the imaging surface of a normal camera.
  • because the light that reaches a microlens is refocused onto the imaging surface, even a light ray that is not in focus at the installation position of the microlens array can be acquired.
  • in this way, the light ray position of the light ray indicating the inspection point is detected.
  • since a lens is a device that converts the direction of a light ray into the position of an image, the position at which a light ray reaches the imaging surface 117b changes depending on the direction of the light ray that reached the microlens 117c.
  • therefore, the light ray direction can be detected from the position at which the light ray from one microlens 117c#1 is focused within the corresponding pixel area 117b#1.
  • in this way, the light ray direction of the light ray indicating the inspection point is detected: once the light ray position of the light ray indicating the inspection point is specified, the light ray direction of the light ray input to that position is detected.
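  • the following Python sketch illustrates this detection principle for one microlens; the microlens pitch in sensor pixels and the microlens-to-sensor distance are hypothetical parameters, and a real plenoptic decoder would also need the calibrated lens grid:

    import numpy as np

    def decode_ray(raw, lens_row, lens_col, pitch_px, f_ml_px):
        # Ray position = which microlens the ray reached (as with a normal
        # camera, one microlens plays the role of one pixel).
        r0, c0 = lens_row * pitch_px, lens_col * pitch_px
        patch = raw[r0:r0 + pitch_px, c0:c0 + pitch_px].astype(float)

        # Ray direction: where the light lands inside the assigned pixel
        # area (117b#1) encodes the incidence angle at microlens 117c#1.
        rows, cols = np.meshgrid(np.arange(pitch_px), np.arange(pitch_px),
                                 indexing="ij")
        total = patch.sum() + 1e-12
        dy = (rows * patch).sum() / total - (pitch_px - 1) / 2.0
        dx = (cols * patch).sum() / total - (pitch_px - 1) / 2.0
        theta = np.arctan2(dx, f_ml_px)  # horizontal incidence angle
        phi = np.arctan2(dy, f_ml_px)    # vertical incidence angle
        return (lens_row, lens_col), (theta, phi)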
  • the display position specifying unit 132 specifies the virtual image display position of the HUD 150 from the light ray position and the light ray direction detected by the light ray detection unit 131. Specifically, from the light ray position and light ray direction, detected by the light ray detection unit 131, of the light ray corresponding to the inspection point, the display position specifying unit 132 specifies the virtual image display position, which is the position at which the point corresponding to the inspection point is displayed in the virtual image projected by the HUD 150.
  • the display position specifying unit 132 specifies the virtual image display position corresponding to the light beam position and the light beam direction detected by the light beam detection unit 131 by referring to the display position correspondence information.
  • the display position correspondence information storage unit 133 is an information storage unit that stores display position correspondence information.
  • the display position correspondence information is derived from the optical design values of the HUD 150.
  • FIG. 15 is a schematic diagram for explaining a virtual image display position.
  • the driver visually recognizes the virtual image at the virtual image display position by the reflected light of the combiner 103.
  • the reflecting surface of the combiner 103 has a concave shape, and the light from the HUD light source 102 installed inside the vehicle 101 is reflected by the concave mirror.
  • the reflected light from a concave mirror behaves in the same way as the transmitted light from a convex lens, so the concave mirror can be replaced by an equivalent convex lens.
  • FIG. 16 is a schematic diagram showing an example in which the concave mirror of the combiner 103 is replaced with a convex lens 103#1 and a virtual HUD light source 102#1.
  • when the focal length of the convex lens 103#1 shown in FIG. 16 is f, the distance between the virtual HUD light source 102#1 and the convex lens 103#1 is a, and the distance in the Z-axis direction from the convex lens 103#1 to the virtual image display position is b, these satisfy the thin-lens relational expression (1): 1/a + 1/b = 1/f.
  • the distance a between the virtual HUD light source 102#1 and the convex lens 103#1 is the same as the distance from the HUD light source 102 to the combiner 103 and is known; therefore, the distance b in the Z-axis direction from the convex lens 103#1 to the virtual image display position can be calculated by expression (1).
  • a virtual image display position (XV1, YV1, b) can also be calculated.
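  • a small numerical sketch, assuming expression (1) is the standard Gaussian thin-lens formula; the numbers below are illustrative, not design values from the patent:

    def virtual_image_distance(f, a):
        # Solve 1/a + 1/b = 1/f for b, the Z-axis distance from the
        # equivalent convex lens 103#1 to the virtual image display
        # position. With the source inside the focal length (a < f),
        # b is negative, i.e. the image is virtual.
        return 1.0 / (1.0 / f - 1.0 / a)

    b = virtual_image_distance(f=0.30, a=0.25)  # -> -1.5 (metres)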
  • as described above, the virtual image display position can be calculated from the optical design values of the HUD 150. Further, the light ray position and light ray direction of the light emitted from the HUD light source 102 and reflected by the combiner 103 can also be specified based on the installation position of the HUD light source 102 and the optical characteristics of the combiner 103. This makes it possible to create information indicating the correspondence between the virtual image display position and the light ray position and light ray direction of the reflected light.
  • FIG. 17 is a schematic diagram illustrating an example of information indicating a correspondence relationship between a virtual image display position and a light ray position and a light ray direction of reflected light.
  • the virtual image display position is represented by one point in a three-dimensional space.
  • (XV1, YV1, ZV1) in FIG. 17 indicates the position of one point with X-axis value XV1, Y-axis value YV1, and Z-axis value ZV1.
  • the light ray position and light ray direction of the reflected light are represented by one point in the three-dimensional space and a vector direction.
  • in FIG. 17, (XR1, YR1, ZR1, θR1, φR1) indicates the position of one point with X-axis value XR1, Y-axis value YR1, and Z-axis value ZR1, together with the vector direction given by the angle θR1 in the X axis-Z axis plane and the angle φR1 between the Y axis and the Z-axis direction.
  • the reflected light with light ray position and direction (XR1, YR1, ZR1, θR1, φR1) forms the virtual image display position (XV1, YV1, ZV1), and the reflected light with light ray position and direction (XR2, YR2, ZR2, θR2, φR2) forms the virtual image display position (XV2, YV2, ZV2).
  • in FIG. 17, the light ray position and light ray direction of the reflected light are represented by one point in three-dimensional space and a vector direction; however, they may instead be expressed by the position of intersection with a two-dimensional plane and the vector direction.
  • for example, the light ray position and direction (XR1, YR1, ZR1, θR1, φR1) of FIG. 17 can then be represented by the intersection position (XR1, YR1) and the vector direction (θR1, φR1).
  • the driver visually recognizes the virtual image by the reflected light of the combiner 103, but the light beam detector 131 detects the transmitted light of the combiner 103. For this reason, information indicating the correspondence between the reflected light and the transmitted light of the combiner 103 is calculated in advance. For example, the correspondence between the ray position and the ray direction of the reflected light and the ray position and the ray direction of the transmitted light may be specified with reference to the optical characteristics of the combiner 103.
  • FIG. 18 is a schematic diagram illustrating an example of information indicating the correspondence between the reflected light and the transmitted light of the combiner 103.
  • the light ray position and the light ray direction of the reflected light and the transmitted light of the combiner 103 are represented by one point on the three-dimensional space and the vector direction, respectively.
  • (XR1, YR1, ZR1, θR1, φR1) shown in FIG. 18 represents the light ray position and light ray direction of the reflected light, and (XT1, YT1, ZT1, θT1, φT1) represents the light ray position and light ray direction of the corresponding transmitted light.
  • the display position correspondence information, which indicates the correspondence between the virtual image display position and the light ray position and light ray direction of the transmitted light, is created from the information representing the correspondence between the virtual image display position and the reflected light and the information representing the correspondence between the reflected light and the transmitted light.
  • FIG. 19 is a schematic diagram illustrating an example of the display position correspondence information. The display position correspondence information shown in FIG. 19 is created based on the information shown in FIG. 17 and the information shown in FIG.
  • the display position correspondence information can be derived from the optical design values of the HUD 150.
  • the display position correspondence information storage unit 133 shown in FIG. 10 stores such display position correspondence information, and the display position specifying unit 132 can specify the virtual image display position from the light ray position and light ray direction of the transmitted light detected by the light ray detection unit 131, by referring to the display position correspondence information stored in the display position correspondence information storage unit 133.
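  • a minimal sketch of how the two tables can be chained into the display position correspondence information; the dictionary keys and all numeric values below are illustrative assumptions, and a real implementation would discretise or interpolate the ray coordinates:

    # FIG. 17 data: reflected ray (x, y, z, theta, phi) -> virtual image point.
    reflected_to_virtual = {
        (0.00, 1.20, 0.50, 0.10, -0.05): (0.0, 1.4, 3.0),
    }
    # FIG. 18 data: reflected ray -> corresponding transmitted ray.
    reflected_to_transmitted = {
        (0.00, 1.20, 0.50, 0.10, -0.05): (0.0, 1.3, 0.6, 0.12, -0.04),
    }
    # FIG. 19: transmitted ray -> virtual image display position, obtained
    # by composing the two correspondences above.
    display_position_info = {
        reflected_to_transmitted[ray]: pos
        for ray, pos in reflected_to_virtual.items()
        if ray in reflected_to_transmitted
    }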
  • in the above, the display position correspondence information is derived from the optical design values of the HUD 150.
  • alternatively, the display position correspondence information may be created in advance by capturing images with the imaging unit 130 using both the transmitted light and the reflected light of the combiner 103 and calculating their light ray positions and light ray directions.
  • the image distortion value calculation unit 134 is a distortion calculation unit that calculates an image distortion value indicating the degree of image distortion from the virtual image display position specified by the display position specifying unit 132. Specifically, from the pixel position, which is the position of the pixel of the inspection point in the distortion inspection image serving as the original image, and the virtual image display position corresponding to the inspection point, the image distortion value calculation unit 134 calculates an image distortion value indicating the degree of distortion of the distortion inspection image in the virtual image projected by the HUD 150. Note that, for pixels other than the inspection points, values may be interpolated from the image distortion values calculated at the inspection points.
  • a virtual camera is arranged at an arbitrary position shown in FIG. 3, and a pixel position of an image obtained when a virtual image is captured by the virtual camera is calculated.
  • an example of the calculation formula is expression (3), the standard pinhole projection: U = fC · XV1 / ZV1 and V = fC · YV1 / ZV1, where fC represents the focal length of the virtual camera, (XV1, YV1, ZV1) represents the virtual image display position in the three-dimensional coordinate system of the virtual camera, and (U, V) represents the pixel position on the image captured by the virtual camera; this value is the image distortion value.
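  • a Python sketch of this projection, assuming expression (3) is the standard pinhole model with the virtual camera placed at the arbitrary position and looking along the +Z axis (the camera pose handling is an assumption):

    def image_distortion_value(f_c, virtual_pos, camera_pos):
        # Perspective projection of the virtual image display point
        # (X_V1, Y_V1, Z_V1) onto the virtual camera's image plane.
        x = virtual_pos[0] - camera_pos[0]
        y = virtual_pos[1] - camera_pos[1]
        z = virtual_pos[2] - camera_pos[2]
        u = f_c * x / z  # U-axis pixel position
        v = f_c * y / z  # V-axis pixel position
        return u, v      # (U, V) is the image distortion value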
  • as described above, it is possible to provide an image distortion inspection device capable of inspecting the degree of distortion of an image by capturing the distortion inspection image with the imaging unit 130 installed outside the vehicle 101 and measuring the light ray positions and light ray directions of the light from the HUD light source that have been changed by the optical members of the HUD.
  • FIG. 20 is a flowchart illustrating an inspection method in image distortion inspection apparatus 110 according to Embodiment 1.
  • the distortion inspection image 176 is displayed on the HUD light source 102, and the imaging unit 130 captures an image using the transmitted light transmitted through the combiner 103 (S10).
  • the light ray detection unit 131 calculates the light ray positions and light ray directions of the transmitted light from the image captured in step S10 (S11). For example, when a light ray space camera is used as the imaging unit 130, the light ray detection unit 131 calculates each light ray position and light ray direction from the position at which the transmitted light reaches the imaging surface of the light ray space camera.
  • the display position specifying unit 132 specifies the virtual image display positions from the light ray positions and light ray directions calculated in step S11 (S12). For example, the display position specifying unit 132 specifies the virtual image display position corresponding to each light ray position and light ray direction by referring to the display position correspondence information stored in the display position correspondence information storage unit 133.
  • the image distortion value calculation unit 134 calculates the image distortion values from the virtual image display positions specified in step S12 (S13). For example, the image distortion value calculation unit 134 places a virtual camera at the arbitrary position shown in FIG. 3 and calculates, as the image distortion value, the pixel position of the image obtained when the virtual image is captured by the virtual camera. The overall flow is sketched below.
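  • put together, steps S10 to S13 form the pipeline below; the four callables are hypothetical stand-ins for the functional blocks of FIG. 10:

    def inspect_image_distortion(capture, detect_rays, lookup_position,
                                 compute_distortion):
        image = capture()                               # S10: light ray space image
        rays = detect_rays(image)                       # S11: ray positions/directions
        positions = [lookup_position(r) for r in rays]  # S12: virtual image positions
        return compute_distortion(positions)            # S13: image distortion values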
  • as described above, it is possible to provide an image distortion inspection method capable of inspecting the degree of image distortion by capturing the distortion inspection image 176 with the imaging unit 130, a light ray space camera installed outside the vehicle 101, and measuring the light ray positions and light ray directions of the light from the HUD light source 102 that have been changed by the optical members of the HUD 150.
  • Embodiment 2.
  • the image distortion inspection system 200 according to the second embodiment includes an image distortion inspection device 210.
  • the hardware configuration of the image distortion inspection device 210 according to the second embodiment is the same as the hardware configuration of the image distortion inspection device 110 according to the first embodiment. Also in the second embodiment, the same HUD 150 as in the first embodiment is used.
  • in the second embodiment, an image distortion inspection device 210 and an image distortion inspection method are provided that improve the inspection accuracy of the image distortion value by changing the distortion inspection image over time.
  • FIG. 21 is a block diagram schematically showing a functional configuration of the image distortion inspection device 210 according to the second embodiment.
  • the image distortion inspection device 210 includes an imaging unit 130, a light ray detection unit 131, a display position identification unit 132, a display position correspondence information storage unit 133, an image distortion value calculation unit 134, an inspection image storage unit 235, A communication control unit 236 and a communication unit 237 are provided.
  • the imaging unit 130, the light ray detection unit 131, the display position specifying unit 132, the display position correspondence information storage unit 133, and the image distortion value calculation unit 134 in the second embodiment are the same as those of the first embodiment.
  • the hardware that implements the functions of the inspection image storage unit 235 and the communication control unit 236 is the image processing device 111 in FIG.
  • the hardware that implements the function of the communication unit 237 is the wireless communication device 118 in FIG.
  • the inspection image storage unit 235 stores a plurality of distortion inspection images. It is assumed that the plurality of distortion inspection images are different from each other.
  • in the second embodiment, the distortion inspection image displayed on the HUD light source 102 is changed as time elapses. For example, the displayed image may be switched over time among the distortion inspection images 276A to 276C shown in FIGS. 22(a) to 22(c), and the distortion of the boundary line between the black image region and the white image region may be inspected. In other words, the image distortion may be detected by using the pixels forming the boundary between the black image region and the white image region as inspection points.
  • the image distortion may be detected using the white point as the inspection point.
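  • as one illustration, a sweeping black/white boundary sequence could be generated as follows (the panel size matches the Full-HD example above; the step width is an assumption):

    import numpy as np

    def make_boundary_images(h=1080, w=1920, step=64):
        # Yield inspection images whose black/white boundary moves across
        # the panel over time, so that different columns serve as boundary
        # inspection points in successive captures.
        for edge in range(step, w, step):
            img = np.zeros((h, w), dtype=np.uint8)
            img[:, :edge] = 255  # white region grows from the left
            yield img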
  • the communication control unit 236 controls the communication performed with the HUD 150. For example, the communication control unit 236 sequentially selects, over time, one distortion inspection image from the plurality of distortion inspection images stored in the inspection image storage unit 235 and causes the communication unit 237 to transmit the selected distortion inspection image to the HUD 150, for example every time a predetermined time elapses.
  • the communication unit 237 performs communication with the HUD 150.
  • the communication unit 237 performs wireless communication.
  • the HUD 150 displays the distortion inspection image received by the wireless communication device 158 on the liquid crystal panel of the HUD light source 102 by the image processing device 151.
  • when the distortion inspection image is displayed, the image processing device 151 of the HUD 150 causes the wireless communication device 158 to transmit an image display notification signal to the image distortion inspection device 210.
  • the communication unit 237 of the image distortion inspection device 210 receives the image display notification signal.
  • the communication control unit 236 notifies the imaging unit 130 that the signal has been received, and after receiving this notification the imaging unit 130 performs the process of capturing the distortion inspection image.
  • the image distortion value of the HUD 150 can be inspected using the distortion inspection image that changes with time.
  • by changing the distortion inspection image, it is possible to detect the light ray position and light ray direction corresponding to each pixel of the liquid crystal panel of the HUD light source 102, and thus to provide an image distortion inspection device 210 with improved image distortion value inspection accuracy.
  • FIG. 24 is a flowchart illustrating an inspection method in the image distortion inspection device 210 according to the second embodiment.
  • the communication control unit 236 selects one distortion inspection image from the plurality of distortion inspection images stored in the inspection image storage unit 235, and reads the selected distortion inspection image (S20).
  • the inspection image storage unit 235 stores a plurality of distortion inspection images so that the selection order can be recognized.
  • the communication control unit 236 selects one distortion inspection image in accordance with the order and reads the selected distortion inspection image.
  • the communication control unit 236 causes the communication unit 237 to transmit the distortion inspection image read in step S20 to the HUD 150 in order to display the image on the liquid crystal panel of the HUD light source 102 (S21).
  • the communication control unit 236 confirms, by the communication unit 237 receiving the image display notification signal transmitted from the HUD 150, that the distortion inspection image transmitted in step S21 is displayed on the liquid crystal panel of the HUD light source 102 (S22). The process then proceeds to step S23.
  • the processing in steps S23 to S26 in FIG. 24 is the same as the processing in steps S10 to S13 in FIG. 20 of the first embodiment. After step S26, the process proceeds to step S27.
  • in step S27, the communication control unit 236 determines whether all of the plurality of distortion inspection images stored in the inspection image storage unit 235 have been displayed. If all the distortion inspection images have been displayed (Yes in S27), the process ends; if there are distortion inspection images that have not been displayed (No in S27), the process returns to step S20. The loop is sketched below.
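  • the S20-S27 loop can be summarised as follows; the callables are hypothetical stand-ins for the communication control unit, the communication unit, and the Embodiment 1 steps S23-S26:

    def run_inspection_sequence(images, send_to_hud, wait_display_ack,
                                inspect_once):
        results = []
        for img in images:                  # S20: read the next inspection image
            send_to_hud(img)                # S21: transmit it to the HUD
            wait_display_ack()              # S22: wait for the display notification
            results.append(inspect_once())  # S23-S26: capture and inspect
        return results                      # S27: all images displayed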
  • as described above, it is possible to provide an image distortion inspection method capable of inspecting the degree of image distortion by capturing, with the imaging unit 130 installed outside the vehicle 101, distortion inspection images that change over time, and measuring the light ray positions and light ray directions of the light from the HUD light source 102 that have been changed by the optical members of the HUD 150.
  • Embodiment 3.
  • the image distortion inspection system 300 according to the third embodiment includes an image distortion inspection device 310.
  • in the third embodiment, even when the relative position between the image distortion inspection device 310 and the vehicle 101 is not constant, the image distortion value of the HUD 150 can be inspected by acquiring the relative position between the image distortion inspection device 310 and the vehicle 101 and moving the image distortion inspection device 310.
  • the HUD 150 according to the third embodiment is the same as the HUD 150 according to the first embodiment.
  • FIG. 25 is a block diagram illustrating a hardware configuration example of the image distortion inspection device 310 according to the third embodiment.
  • the image distortion inspection device 310 includes an image processing device 311, an imaging device 117, a wireless communication device 118, a position measurement device 320, and a moving mechanism device 321.
  • the imaging device 117 and the wireless communication device 118 according to the third embodiment are the same as the imaging device 117 and the wireless communication device 118 according to the first embodiment.
  • the image processing device 311 includes a CPU 112, a RAM 113, a ROM 114, a sensor IO 315, a wireless communication IO 116, and a movement control IO 319.
  • the CPU 112, the RAM 113, the ROM 114, and the wireless communication IO 116 according to the third embodiment are the same as the CPU 112, the RAM 113, the ROM 114, and the wireless communication IO 116 according to the first embodiment.
  • the sensor IO 315 is an interface for connecting the imaging device 117 and the position measurement device 320.
  • the movement control IO 319 is an interface for controlling the movement mechanism device 321 in order to change the position of the imaging device 117.
  • the CPU 112 controls the direction and rotation of the wheels of the movement mechanism device 321 via the movement control IO 319.
  • the position measuring device 320 measures the position of the vehicle 101.
  • the position measurement device 320 measures the position of the vehicle 101 using a laser sensor such as LiDAR (Light Detection and Ranging).
  • the position measuring device 320 measures the position of the vehicle 101 by irradiating the vehicle 101 with a laser from the position measuring device 320 and receiving the laser reflected from the vehicle 101.
  • the moving mechanism device 321 moves the image distortion inspection device 310 to change the position of the imaging device 117.
  • the moving mechanism device 321 includes four wheels (not shown) installed below the image distortion inspection apparatus 310, and a motor (not shown) for moving the wheels.
  • the CPU 112 moves the image distortion inspection device 310 by controlling the direction and rotation of the wheels via the movement control IO 319.
  • FIG. 26 is a block diagram schematically showing a functional configuration of the image distortion inspection device 310 according to the third embodiment.
  • the image distortion inspection device 310 includes an imaging unit 130, a light ray detection unit 131, a display position specifying unit 132, a display position correspondence information storage unit 133, an image distortion value calculation unit 134, a position detection unit 338, a relative position acquisition unit 339, a movement control unit 340, and an imaging position changing unit 341.
  • the imaging unit 130, the light ray detection unit 131, the display position specifying unit 132, the display position correspondence information storage unit 133, and the image distortion value calculation unit 134 according to the third embodiment are the same as those of the first embodiment.
  • as in the second embodiment, an inspection image storage unit 235, a communication control unit 236, and a communication unit 237 may be provided so that a plurality of stored distortion inspection images are transmitted to the HUD 150 in order.
  • the hardware that implements the function of the position detection unit 338 is the position measurement device 320 in FIG.
  • Hardware that implements the functions of the relative position acquisition unit 339 and the movement control unit 340 is the image processing device 311 in FIG.
  • Hardware that implements the function of the imaging position changing unit 341 is the moving mechanism device 321 in FIG.
  • the position detection unit 338 detects a vehicle position that is the position of the vehicle 101. Specifically, the position detection unit 338 detects the vehicle position of the vehicle 101 by measuring the outer shape of the vehicle 101 using LiDAR.
  • the relative position acquisition unit 339 acquires a relative position, which is a relative position between the image distortion inspection device 310 and the vehicle 101, from the vehicle position detected by the position detection unit 338. It is assumed that the positional relationship between the vehicle 101 and the HUD 150 installed inside the vehicle 101 is known from design values when the HUD 150 is mounted. Thus, the relative position between the image distortion inspection device 310 and the vehicle 101 can be obtained.
  • the position detection unit 338 may be realized by the imaging device 117, for example.
  • in that case, the relative position acquisition unit 339 stores image feature points of the vehicle 101 and their positions in advance, and may calculate the relative position between the vehicle 101 and the imaging device 117 by detecting those image feature points in the image captured by the imaging device 117.
  • an in-vehicle sensor (not shown) installed in the vehicle 101 may be used for position measurement.
  • in that case, a calibration board or the like is attached to the image distortion inspection device 310, and the position of the image distortion inspection device 310 is measured by the in-vehicle sensor.
  • the in-vehicle sensor transmits the measurement result to the image distortion inspection device 310 wirelessly, for example.
  • the relative position acquisition unit 339 can acquire the measurement result via the wireless communication device 118 and acquire the relative position between the vehicle 101 and the image distortion inspection device 310.
  • the movement control unit 340 calculates the movement amount from the relative position acquired by the relative position acquisition unit 339 to a position predetermined as the inspection position, and instructs the imaging position changing unit 341 to move the image distortion inspection device 310 by that amount.
  • the imaging position changing unit 341 is a moving unit that, in accordance with the instruction from the movement control unit 340, moves the image distortion inspection device 310 by the calculated movement amount, thereby changing the installation position of the imaging device 117, in other words, the imaging position of the imaging device 117.
  • for example, a relative position between the image distortion inspection device 310 and the vehicle 101 suitable for the image distortion value inspection processing is determined in advance, and the movement amount may be calculated as the difference between that relative position and the relative position acquired by the relative position acquisition unit 339, as in the sketch below.
  • in this way, the image distortion inspection device 310 automatically moves to a relative position suitable for the image distortion value inspection processing.
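  • a minimal sketch of this movement amount calculation, assuming the relative positions are expressed as (x, y) coordinates in the inspection floor plane:

    def movement_amount(acquired_rel_pos, suitable_rel_pos):
        # Difference between the pre-stored relative position suitable for
        # inspection and the relative position acquired by the relative
        # position acquisition unit; the result is passed to the imaging
        # position changing unit.
        dx = suitable_rel_pos[0] - acquired_rel_pos[0]
        dy = suitable_rel_pos[1] - acquired_rel_pos[1]
        return dx, dy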
  • FIG. 27 is a flowchart illustrating an inspection method in the image distortion inspection device 310 according to the third embodiment.
  • the position detection unit 338 detects the position of the vehicle 101 by measuring the external shape of the vehicle 101 equipped with the HUD 150 for inspecting the image distortion value (S30).
  • the relative position acquisition unit 339 acquires the relative position between the vehicle 101 and the image distortion inspection device 310 from the position of the vehicle 101 detected in step S30, and the movement control unit 340 obtains the difference between the acquired relative position and the previously stored relative position, thereby calculating the movement amount of the image distortion inspection device 310 (S31).
The imaging position changing unit 341 then changes the imaging position by moving the image distortion inspection device 310 according to the movement amount calculated in step S31 (S32).
The processing in steps S33 to S36 in FIG. 27 is the same as the processing in steps S10 to S13 in FIG. 20 of the first embodiment.
As described above, by acquiring the relative position between the image distortion inspection device 310 and the vehicle 101 and moving the image distortion inspection device 310 accordingly, an image distortion inspection device capable of inspecting the image distortion of the HUD 150 can be provided.
Although the first to third embodiments described above are examples in which the degree of image distortion of the HUD 150 serving as the image display device is inspected, the first to third embodiments are not limited to the above and can be modified as appropriate without departing from the spirit of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)

Abstract

The present invention is provided with: an imaging unit (130) that captures, from outside a vehicle (101), an image formed by light output by a HUD installed within the vehicle (101) so as to project a virtual image on the basis of an original image, in such a manner as to allow identification of a light ray position, which is the position where the light is input, and a light ray direction, which is the direction from which the light is input; a detection unit that detects, from the captured image, the light ray position and light ray direction of light indicating an inspection point contained in the original image; a display position specifying unit that specifies, from the detected light ray position and light ray direction, the virtual image display position at which the point corresponding to the inspection point is displayed in the virtual image; and a distortion computation unit that computes, from the pixel position of the pixel corresponding to the inspection point and the specified virtual image display position, a distortion degree indicating the degree of distortion of the original image in the virtual image.

Description

Image distortion inspection apparatus, image distortion inspection method, and program
The present invention relates to an image distortion inspection device, an image distortion inspection method, and a program.
In recent years, vehicles equipped with a head-up display (HUD) as an image display device have been sold on the general market. The HUD can display information without imposing a burden on the driver by projecting a virtual image in front of the driver and reducing the movement of the driver's line of sight.
In an example of a vehicle-mounted HUD, a virtual image is displayed by projecting an image displayed on a HUD light source onto a concave half mirror called a combiner. Since light is projected from the HUD light source to the combiner through a plurality of lenses, the image displayed by the HUD light source is generally displayed as a distorted virtual image.
Therefore, in order to display an undistorted image as the virtual image, it is necessary to perform image distortion correction. In the image distortion correction, an image distortion value, which is the degree of distortion of the virtual image, is inspected in advance, and a distortion correction value is calculated from the image distortion value. By deforming the display image of the HUD light source according to the calculated distortion correction value, a virtual image without distortion is displayed.
The image distortion value is generally obtained in one of two ways: by calculation from the optical design values of the HUD, or by capturing the distortion inspection image projected onto the combiner with a camera and calculating the value from the degree of distortion of the captured image. Since optical members such as the lenses constituting the HUD are often manufactured with optical characteristics deviating from the optical design values, the method of calculating the image distortion value from a camera image of the distortion inspection image is desirable.
Normally, the distortion inspection image is captured by a camera installed at the driver's seat, near the driver's viewpoint inside the vehicle. For example, Patent Literature 1 describes a technique in which a camera installed inside a vehicle is moved, distortion inspection images are captured at several representative driver viewpoint positions, and an image distortion value and a distortion correction value are calculated.
JP 2017-47794 A
 従来技術では、画像歪値及び歪補正値を算出するために、カメラを車両内部へ設置して画像歪み検査用画像を撮像し、画像歪み度合いを検査した後にカメラを車両内部から取り外す作業が必要である。このため、画像歪み検査時の作業負荷が高いという問題があった。 In the prior art, in order to calculate the image distortion value and the distortion correction value, it is necessary to install a camera inside the vehicle, take an image for image distortion inspection, inspect the degree of image distortion, and then remove the camera from the vehicle interior. It is. For this reason, there is a problem that the work load at the time of the image distortion inspection is high.
Accordingly, an object of one or more aspects of the present invention is to enable an imaging unit outside a vehicle to capture an image and inspect the degree of distortion of the image.
An image distortion inspection device according to one aspect of the present invention includes: an imaging unit that captures, from outside a vehicle, an image formed by light output from an image display device installed inside the vehicle to project a virtual image based on an original image, in such a manner that a light ray position, which is the position where a light ray is input, and a light ray direction, which is the direction from which the light ray is input, can be identified; a detection unit that detects, from the captured image, the light ray position and light ray direction of a light ray indicating an inspection point included in the original image; a display position specifying unit that specifies, from the detected light ray position and light ray direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and a distortion calculation unit that calculates, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a distortion degree indicating the degree to which the original image is distorted in the virtual image.
An image distortion inspection method according to one aspect of the present invention includes: capturing, from outside a vehicle, an image formed by light output from an image display device installed inside the vehicle to project a virtual image based on an original image, in such a manner that a light ray position, which is the position where a light ray is input, and a light ray direction, which is the direction from which the light ray is input, can be identified; detecting, from the captured image, the light ray position and light ray direction of a light ray indicating an inspection point included in the original image; specifying, from the detected light ray position and light ray direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and calculating, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a distortion degree indicating the degree to which the original image is distorted in the virtual image.
A program according to one aspect of the present invention causes a computer to function as: a detection unit that detects, from an image that receives the light output from an image display device installed inside a vehicle to project a virtual image based on an original image and that is captured from outside the vehicle in such a manner that a light ray position, which is the position where a light ray is input, and a light ray direction, which is the direction from which the light ray is input, can be identified, the light ray position and light ray direction of a light ray indicating an inspection point included in the original image; a display position specifying unit that specifies, from the detected light ray position and light ray direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and a distortion calculation unit that calculates, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a distortion degree indicating the degree to which the original image is distorted in the virtual image.
According to one or more aspects of the present invention, an image can be captured by an imaging unit outside a vehicle, and the degree of distortion of the image can be inspected.
FIG. 1 is a schematic diagram showing a usage state of an image distortion inspection system including the image distortion inspection devices according to Embodiments 1 to 3.
FIGS. 2(a) and 2(b) are schematic diagrams for explaining image distortion values.
FIG. 3 is a schematic diagram showing the relationship between a virtual image and a virtual image projection image.
FIG. 4 is a schematic diagram for explaining image distortion correction processing.
FIGS. 5(a) and 5(b) are schematic diagrams for explaining distortion correction values.
FIG. 6 is a block diagram showing a hardware configuration example of the image distortion inspection device and the HUD according to Embodiments 1 and 2.
FIG. 7 is a schematic diagram for explaining a light ray space image.
FIGS. 8(a) and 8(b) are schematic diagrams for explaining an example of a light ray space camera using a microlens array.
FIG. 9 is a schematic diagram for explaining an example in which a light ray space camera is configured by arranging a plurality of ordinary cameras side by side vertically and horizontally.
FIG. 10 is a block diagram schematically showing the functional configuration of the image distortion inspection device according to Embodiment 1.
FIG. 11 is a schematic diagram showing an example of an optical model of the HUD.
FIG. 12 is a schematic diagram showing a first example of a distortion inspection image.
FIG. 13 is a schematic diagram showing the relationship between a microlens array and an imaging surface.
FIG. 14 is a schematic diagram for explaining the relationship between a microlens and the imaging surface.
FIG. 15 is a schematic diagram for explaining a virtual image display position.
FIG. 16 is a schematic diagram showing an example in which the concave mirror of the combiner is replaced with a convex lens using a virtual HUD combiner.
FIG. 17 is a schematic diagram showing an example of information representing the correspondence between virtual image display positions and the ray positions and ray directions of reflected light.
FIG. 18 is a schematic diagram showing an example of information representing the correspondence between the reflected light and the transmitted light of the combiner.
FIG. 19 is a schematic diagram showing an example of display position correspondence information.
FIG. 20 is a flowchart showing an inspection method in the image distortion inspection device according to Embodiment 1.
FIG. 21 is a block diagram schematically showing the functional configuration of the image distortion inspection device according to Embodiment 2.
FIGS. 22(a) to 22(c) are schematic diagrams showing a first example of a plurality of distortion inspection images.
FIGS. 23(a) to 23(c) are schematic diagrams showing a second example of a plurality of distortion inspection images.
FIG. 24 is a flowchart showing an inspection method in the image distortion inspection device according to Embodiment 2.
FIG. 25 is a block diagram showing a hardware configuration example of the image distortion inspection device according to Embodiment 3.
FIG. 26 is a block diagram schematically showing the functional configuration of the image distortion inspection device according to Embodiment 3.
FIG. 27 is a flowchart showing an inspection method in the image distortion inspection device according to Embodiment 3.
Embodiment 1.
FIG. 1 is a schematic diagram showing a usage state of an image distortion inspection system 100 including an image distortion inspection device 110 according to the first embodiment.
The image distortion inspection device 110 is arranged outside the vehicle 101 and inspects the degree of image distortion of an image displayed using an image display device (not shown) installed inside the vehicle 101. Here, for example, a HUD is used as the image display device; hereinafter, the image display device is referred to as the HUD.
For example, the light projected from the HUD light source 102 of the HUD is divided into reflected light reflected by the combiner 103 and a small amount of transmitted light passing through the combiner 103. Here, the combiner 103 functions as a reflection unit that reflects part of the light projected from the HUD light source 102.
The transmitted light travels from the HUD light source 102 inside the vehicle 101 through the combiner 103 to the outside of the vehicle 101 and reaches the imaging unit 130 of the image distortion inspection device 110.
The reflected light reaches the driver's viewpoint position, and by this reflected light the driver visually recognizes the virtual image 171 displayed at the virtual image display position in front of the vehicle 101.
Since the light from the HUD light source 102 to the combiner 103 is projected through a plurality of lenses, the direction (traveling direction) of the light rays changes depending on the lenses.
Optical members such as the lenses constituting the HUD are often manufactured with optical characteristics deviating from the optical design values. Therefore, the direction of a light ray from the HUD light source 102 changes in a direction different from that assumed by the optical design values. As a result, both the transmitted light and the reflected light of the combiner 103 travel in directions different from the optical design values, and the driver who observes the reflected light visually recognizes a distorted image as the virtual image 171.
The image distortion inspection device 110 captures an image formed by the transmitted light of the combiner 103 with the imaging unit 130 and detects the light ray position, which is the position where a light ray of the transmitted light is input, and the light ray direction, which is the direction from which that light ray is input, thereby calculating an image distortion value representing the degree of image distortion of the HUD.
FIGS. 2(a) and 2(b) are schematic diagrams for explaining image distortion values.
The display image 170 shown in FIG. 2(a) is an image displayed on a liquid crystal panel (not shown) of the HUD light source 102. The display resolution of the display image 170 is determined by the liquid crystal panel. As an example, when a Full-HD (High Definition) liquid crystal panel is used, the horizontal resolution is 1920 pixels and the vertical resolution is 1080 pixels.
The virtual image projection image 172 shown in FIG. 2(a) is the image generated when the virtual image 171 (see FIG. 1) in three-dimensional space, displayed at the virtual image display position, is projected onto an arbitrary two-dimensional plane space. The virtual image projection image 172 is equivalent to the image obtained when the virtual image in three-dimensional space is captured by a virtual camera.
Here, in FIG. 1, the traveling direction in front of the vehicle 101 is the Z axis, the height direction is the Y axis, and the lateral direction is the X axis.
The arbitrary two-dimensional plane space onto which the virtual image is projected is a plane space (X axis-Y axis) perpendicular to the Z axis. In FIG. 2, the horizontal direction of the liquid crystal panel is the U axis and the vertical direction is the V axis.
FIG. 3 is a schematic diagram showing the relationship between the virtual image 171 and the virtual image projection image 172.
The arbitrary position shown in FIG. 3 is the position translated in the Z-axis direction toward the driver's seat from the point 171a at the virtual image display position formed by the center pixel of the input image input to the HUD. This arbitrary position may be another position, but is preferably near the driver's viewpoint.
The virtual image projection image 172 is obtained by assuming a plane perpendicular to the Z-axis direction (an X axis-Y axis plane) between the virtual image display position and the arbitrary position, and calculating the position where the straight line from the virtual image display position to the arbitrary position intersects that plane. If the HUD has no image distortion, the virtual image projection image 172 matches the display image 170 (see FIG. 2(a)).
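A minimal Python sketch of this line-plane intersection; the function and point names are illustrative assumptions:

def project_to_plane(virtual_point, eye_point, z_plane):
    """Intersect the straight line from a virtual-image point to the
    arbitrary position with the plane z = z_plane perpendicular to the Z axis."""
    px, py, pz = virtual_point
    ex, ey, ez = eye_point
    t = (z_plane - pz) / (ez - pz)          # parameter along the line
    return px + t * (ex - px), py + t * (ey - py)

# Example: virtual-image point 10 m ahead, arbitrary position near the driver.
print(project_to_plane((0.5, 1.0, 10.0), (0.0, 1.2, 0.0), 5.0))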
As shown in FIG. 2(a), the image distortion value indicates the pixel of the virtual image projection image 172 corresponding to each pixel of the display image 170. Specifically, as shown in FIG. 2(b), the image distortion values form a matrix representing the positional relationship between each pixel of the display image 170 and each pixel of the virtual image projection image 172. For example, let the pixel at the upper left of the display image 170 be pixel (1,1), and let the image distortion value corresponding to pixel (1,1) be (U1-1, V1-1). Similarly, let the image distortion value corresponding to pixel (1,2) of the display image 170 be (U1-2, V1-2).
Here, U1-1 represents the pixel position in the U-axis direction on the virtual image projection image corresponding to the virtual image display position formed by pixel (1,1) of the display image 170, and V1-1 represents the pixel position in the V-axis direction.
When the image resolution of the display image 170 is Full-HD, the image distortion values form a 1920 × 1080 matrix, namely two matrices representing the U-axis values and the V-axis values, respectively.
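For concreteness, the distortion values can be held as two such arrays; a minimal NumPy sketch in which the array names are illustrative:

import numpy as np

H, W = 1080, 1920                      # Full-HD display image resolution
distortion_U = np.zeros((H, W))        # U-axis positions on the projection image
distortion_V = np.zeros((H, W))        # V-axis positions on the projection image

# With no distortion, each display pixel maps to itself:
distortion_V[:], distortion_U[:] = np.indices((H, W))

# The value corresponding to display pixel (1,1) (0-based index (0,0)):
u11, v11 = distortion_U[0, 0], distortion_V[0, 0]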
The HUD calculates a distortion correction value with reference to the image distortion value calculated by the image distortion inspection device 110, and performs image distortion correction processing by deforming the display image 170 displayed on the HUD light source 102 according to that distortion correction value.
FIG. 4 is a schematic diagram for explaining the image distortion correction processing.
The input image 173 shown in FIG. 4 is an image of the content to be displayed via the combiner 103. In the image distortion correction processing, the HUD deforms the input image 173 with reference to the distortion correction values to generate a distortion corrected image 174. When the HUD displays the generated distortion corrected image 174 on the liquid crystal panel of the HUD light source 102 and projects it onto the combiner 103, even though the image displayed on the liquid crystal panel is distorted by the optical members, the distortion is cancelled out by the deformation of the distortion corrected image 174, and an image 175 without distortion is displayed as the virtual image.
FIGS. 5(a) and 5(b) are schematic diagrams for explaining distortion correction values.
As shown in FIG. 5(a), the distortion correction value corresponding to pixel position (1,1) of the distortion corrected image 174 is the value (HU1-1, HV1-1), which refers into the input image 173.
As shown in FIG. 5(a), the distortion correction value indicates, for each pixel of the distortion corrected image 174, the corresponding pixel of the input image 173. Specifically, as shown in FIG. 5(b), the distortion correction values form a matrix representing the positional relationship between each pixel of the input image 173 and each pixel of the distortion corrected image 174. For example, let the pixel at the upper left of the distortion corrected image 174 be pixel (1,1), and let the distortion correction value corresponding to pixel (1,1) be (HU1-1, HV1-1). Similarly, let the distortion correction value corresponding to pixel (1,2) of the distortion corrected image 174 be (HU1-2, HV1-2).
In the image distortion correction processing, the HUD generates the distortion corrected image 174 by substituting the pixel value at pixel position (HU1-1, HV1-1) of the input image 173 into the pixel value at pixel position (1,1) of the distortion corrected image 174.
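A minimal NumPy sketch of this per-pixel substitution, assuming nearest-neighbour sampling (the function name is illustrative):

import numpy as np

def apply_correction(input_image, HU, HV):
    """corrected[v, u] = input_image[HV[v, u], HU[v, u]];
    HU and HV are the per-pixel distortion correction matrices."""
    hu = np.rint(HU).astype(int)    # round fractional positions to pixels
    hv = np.rint(HV).astype(int)
    return input_image[hv, hu]      # NumPy fancy indexing performs the remap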
FIG. 6 is a block diagram showing a hardware configuration example of the image distortion inspection device 110 and the HUD.
The image distortion inspection device 110 includes an image processing device 111, an imaging device 117, and a wireless communication device 118.
The image processing device 111 is a device that calculates the image distortion value from an image captured by the imaging device 117. The image processing device 111 includes a CPU (Central Processing Unit) 112 functioning as a processor, a RAM (Random Access Memory) 113 and a ROM (Read Only Memory) 114 functioning as memories, a sensor IO (Input-Output) 115, which is an interface for connecting the imaging device 117, and a wireless communication IO 116, which is an interface for connecting the wireless communication device 118.
The functions executed by the image processing device 111 may be realized by the CPU 112 reading a program stored in the ROM 114 into the RAM 113 and executing the read program. Such a program may be provided through a network or may be recorded on a recording medium; it may, for example, be provided as a program product. The image processing device 111 may be provided as a computer.
As the image processing device 111, a general-purpose computer may be used, or the above devices may be integrated into an embedded device, an FPGA board, or the like.
The imaging device 117 is a light ray space camera capable of acquiring a light ray space image.
A light ray space image is an image showing the positions and directions of the light rays existing in an arbitrary three-dimensional space. As shown in FIG. 7, if the positions and directions of the light rays reaching a two-dimensional plane can be acquired, the light ray space image of the three-dimensional space can be calculated from that light ray space image.
An ordinary camera records only the position of a light ray and cannot record its direction; by using a light ray space camera, the direction of the light ray can also be recorded.
FIGS. 8(a) and 8(b) are schematic diagrams for explaining an example of a light ray space camera using a microlens array. FIG. 8(a) shows the optical model of an ordinary camera, and FIG. 8(b) shows the optical model of a light ray space camera using a microlens array.
As shown in FIG. 8(a), in an ordinary camera, the light from a subject is focused by the lens 117a onto the imaging surface 117b to capture an image of the subject. For example, in FIG. 8(a), the light from subject 1 is condensed on the imaging surface 117b by the lens 117a, and an image in which subject 1 is in focus can be obtained. The light from subject 2 is not condensed on the imaging surface 117b and produces an out-of-focus image.
As shown in FIG. 8(b), in the light ray space camera, a microlens array in which a plurality of microlenses 117c are arranged is installed between the lens 117a and the imaging surface 117b.
The imaging surface 117b is arranged at the position where the light rays transmitted through the microlenses 117c are focused.
Since a lens is a device that converts the direction of a light ray into the position of an image, the direction of a light ray that has reached a microlens 117c can be calculated from the position at which that light ray reaches the imaging surface 117b.
FIG. 8(b) shows an example in which a microlens array is added between the lens 117a and the imaging surface 117b; however, as shown in FIG. 9, a light ray space camera can also be configured by arranging a plurality of ordinary cameras 117d side by side vertically and horizontally.
Returning to FIG. 6, the wireless communication device 118 is a communication device for performing wireless communication with the HUD 150 and functions as a communication unit that communicates with the HUD 150. As the wireless communication, for example, a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark) may be used.
The HUD 150 includes an image processing device 151, an image display device 157, and a wireless communication device 158.
The image processing device 151 is a device that outputs a display image to the image display device 157. The image processing device 151 includes a CPU 152 functioning as a processor, a RAM 153 and a ROM 154 functioning as memories, a display IO 155, which is an interface for connecting the image display device 157, and a wireless communication IO 156, which is an interface for connecting the wireless communication device 158.
At the time of the image distortion correction processing, the image processing device 151 refers to the distortion correction value calculated from the image distortion value inspected by the image distortion inspection device 110, deforms the input image, and outputs the deformed input image, that is, the distortion corrected image, to the image display device 157.
As the image processing device 151, a general-purpose computer may be used, or the above devices may be integrated into an embedded device, an FPGA board, or the like.
The image display device 157 is an image projection device including the HUD light source 102 and the combiner 103. For example, the image display device 157 projects the distortion corrected image acquired from the image processing device 151 from the HUD light source 102 onto the combiner 103 to display a virtual image.
The wireless communication device 158 is a communication device that performs wireless communication with the image distortion inspection device 110 and functions as a communication unit that communicates with the image distortion inspection device 110.
In FIG. 6, wireless communication is used for the communication between the image distortion inspection device 110 and the HUD 150; however, wired communication may be performed using a cable such as a LAN cable or a CAN cable. In this case, although not shown, the image distortion inspection device 110 and the HUD 150 may each include a wired communication device for performing wired communication.
Alternatively, the image distortion value inspected by the image distortion inspection device 110 may be stored on an arbitrary recording medium and the recording medium may be read on the HUD 150 side, giving a hardware configuration without wireless or wired communication devices. In this case, although not shown, the image distortion inspection device 110 may include a writing device for writing data to the recording medium, and the HUD 150 may include a reading device for reading data from the recording medium.
FIG. 10 is a block diagram schematically showing the functional configuration of the image distortion inspection device 110.
The image distortion inspection device 110 includes an imaging unit 130, a light ray detection unit 131, a display position specifying unit 132, a display position correspondence information storage unit 133, and an image distortion value calculation unit 134.
The hardware that implements the function of the imaging unit 130 is the imaging device 117 in FIG. 6. The hardware that implements the functions of the light ray detection unit 131, the display position specifying unit 132, the display position correspondence information storage unit 133, and the image distortion value calculation unit 134 is the image processing device 111 in FIG. 6.
The imaging unit 130 captures, from outside the vehicle 101, an image formed by the light output from the HUD 150 for projecting a virtual image, in such a manner that the light ray position, which is the position where a light ray is input, and the light ray direction, which is the direction from which the light ray is input, can be identified. Specifically, when the HUD light source 102 displays a distortion inspection image as the original image, the imaging unit 130 captures a light ray space image formed by the transmitted light that is output from the HUD light source 102 and passes through the combiner 103.
FIG. 11 is a schematic diagram showing an example of an optical model of the HUD 150, which is the image display device 157.
The light output from the HUD light source 102 reaches the combiner 103 and is split into reflected light and transmitted light. The light rays of this transmitted light are input to the imaging unit 130.
FIG. 12 is a schematic diagram showing an example of a distortion inspection image.
The distortion inspection image 176 shown in FIG. 12 is an image in which vertical and horizontal white lines are superimposed on a black background. The degree of distortion of the image is inspected from the degree of distortion of these white lines. In other words, the pixels constituting the white lines serve as inspection points, and the degree of image distortion is inspected using these inspection points. The inspection points may also be the pixels where the white lines cross each other.
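A minimal NumPy sketch for generating such a grid-pattern inspection image; the line pitch of 120 pixels is an illustrative assumption, not specified in the patent:

import numpy as np

def make_inspection_image(h=1080, w=1920, pitch=120):
    img = np.zeros((h, w), dtype=np.uint8)   # black background
    img[::pitch, :] = 255                    # horizontal white lines
    img[:, ::pitch] = 255                    # vertical white lines
    return img                               # line pixels / crossings serve as inspection points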
Returning to FIG. 10, the light ray detection unit 131 is a detection unit that detects, from the image captured by the imaging unit 130, the light ray position and light ray direction of a light ray indicating an inspection point. The following describes an example of detecting the light ray position and light ray direction when a light ray space camera using a microlens array is used as the imaging device 117.
FIG. 13 is a schematic diagram showing the relationship between the microlens array and the imaging surface.
A light ray that has reached a microlens 117c from the lens 117a is focused on the imaging surface 117b.
FIG. 14 is a schematic diagram for explaining the relationship between the microlenses 117c and the imaging surface 117b.
As shown in FIG. 14, one microlens 117c#1 included in the microlens array is assigned a pixel region 117b#1 consisting of a plurality of pixels included in the imaging surface 117b. The pixel region 117b#1 contains the pixels in the range where the light that has reached the microlens 117c#1 from the lens 117a is focused on the imaging surface 117b.
The light ray position is detected from the position of the microlens in the array that the light ray has reached. The method of detecting the light ray position is almost the same as in an ordinary camera: in the light ray space camera, one microlens corresponds to one pixel of the imaging surface of an ordinary camera. The difference is that, in the light ray space camera, the light rays that reach a microlens are focused on the imaging surface, so light rays that are not in focus at the installation position of the microlens array can also be acquired. Here, the light ray position of the light ray indicating an inspection point is detected.
Since a lens is a device that converts the direction of a light ray into the position of an image, the position at which a light ray reaches the imaging surface 117b changes depending on the direction of the light ray that has reached the microlens 117c. The light ray direction can therefore be detected from the position at which the light ray reaches the imaging surface 117b. Specifically, the direction of a light ray from one microlens 117c#1 can be detected from the position at which that light ray is focused in the corresponding pixel region 117b#1. Here, the light ray direction of the light ray indicating an inspection point is detected; for example, when the light ray position of a light ray indicating an inspection point has been identified, the light ray direction of the light ray input at that position is detected.
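A minimal sketch of this detection step under the geometry described above; the lens pitch and the microlens-to-sensor distance d_sensor are assumed parameters, and the names are illustrative:

import math

def ray_from_lenslet(ix, iy, dx, dy, lens_pitch, d_sensor):
    """ix, iy: indices of the microlens the ray reached (gives the ray position);
    dx, dy: offset of the focused spot from the centre of that lens's pixel
    region on the imaging surface; d_sensor: microlens-to-sensor distance."""
    x, y = ix * lens_pitch, iy * lens_pitch   # ray position on the array
    theta = math.atan2(dx, d_sensor)          # angle in the X-Z plane
    phi = math.atan2(dy, d_sensor)            # angle in the Y-Z direction
    return (x, y), (theta, phi)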
Returning to FIG. 10, the display position specifying unit 132 specifies the virtual image display position of the HUD 150 from the light ray position and light ray direction detected by the light ray detection unit 131. Specifically, from the light ray position and light ray direction of the light ray corresponding to an inspection point, the display position specifying unit 132 specifies the virtual image display position, that is, the position at which the point corresponding to the inspection point is displayed in the virtual image projected by the HUD 150.
To specify the virtual image display position, information representing the correspondence between virtual image display positions and light ray positions and light ray directions is created in advance and stored in the display position correspondence information storage unit 133 as display position correspondence information. The display position specifying unit 132 then specifies the virtual image display position corresponding to the light ray position and light ray direction detected by the light ray detection unit 131 by referring to the display position correspondence information.
The display position correspondence information storage unit 133 is an information storage unit that stores the display position correspondence information. The display position correspondence information is derived from the optical design values of the HUD 150.
FIG. 15 is a schematic diagram for explaining the virtual image display position.
The driver visually recognizes the virtual image at the virtual image display position by the reflected light of the combiner 103.
The reflecting surface of the combiner 103 has a concave shape, and the light from the HUD light source 102 installed inside the vehicle 101 is reflected by this concave mirror. The reflected light of a concave mirror behaves in the same way as the transmitted light of a convex lens, so by assuming a virtual HUD light source 102#1 obtained by virtually mirroring the HUD light source 102 to the outside of the vehicle 101, the concave mirror can be replaced with a convex lens.
FIG. 16 is a schematic diagram showing an example in which the concave mirror of the combiner 103 is replaced with a convex lens 103#1 using the virtual HUD light source 102#1.
Let f be the focal length of the convex lens 103#1 shown in FIG. 16, and let a be the distance between the virtual HUD light source 102#1 and the convex lens 103#1. The distance b in the Z-axis direction from the convex lens 103#1 to the virtual image display position then satisfies the following relation (1):

1/a - 1/b = 1/f   (1)
The focal length f of the convex lens 103#1, an optical design value of the HUD 150, is the same as the focal length of the concave mirror of the combiner 103 and is known. The distance a between the virtual HUD light source 102#1 and the convex lens 103#1 is the same as the distance from the HUD light source 102 to the combiner 103 and is also known. Therefore, the distance b in the Z-axis direction from the convex lens 103#1 to the virtual image display position can be calculated by relation (1).
Further, from the calculated distance b and the display position (XS1, YS1, a) of each pixel of the virtual HUD light source 102#1, the virtual image display position (XV1, YV1, b) corresponding to that pixel can also be calculated by the following relation (2):

XV1 = (b/a)·XS1,   YV1 = (b/a)·YS1   (2)
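A minimal sketch combining relations (1) and (2); the sign convention (a < f, with b measured on the virtual-image side of the lens) is an assumption consistent with the surrounding text, and the numeric values are illustrative:

def virtual_image_point(f, a, xs, ys):
    """f: focal length of the equivalent convex lens; a: source-to-lens
    distance (a < f for a HUD); (xs, ys): source pixel position on the
    virtual HUD light source."""
    b = 1.0 / (1.0 / a - 1.0 / f)    # relation (1): 1/a - 1/b = 1/f
    m = b / a                        # lateral magnification
    return m * xs, m * ys, b         # relation (2): the point (XV1, YV1, b)

print(virtual_image_point(f=0.5, a=0.4, xs=0.01, ys=0.005))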
As described above, the virtual image display position can be calculated from the optical design values of the HUD 150.
In addition, the light ray positions and light ray directions of the reflected light projected from the HUD light source 102 and reflected by the combiner 103 are determined by the installation position of the HUD light source 102 and the optical characteristics of the combiner 103. This makes it possible to create information representing the correspondence between virtual image display positions and the light ray positions and light ray directions of the reflected light.
FIG. 17 is a schematic diagram showing an example of information representing the correspondence between virtual image display positions and the light ray positions and light ray directions of the reflected light.
A virtual image display position is represented by one point in three-dimensional space. In FIG. 17, (XV1, YV1, ZV1) represents the position of a point whose X-axis value is XV1, Y-axis value is YV1, and Z-axis value is ZV1.
The light ray position and light ray direction of the reflected light are represented by one point in three-dimensional space and a vector direction. In FIG. 17, (XR1, YR1, ZR1, θR1, φR1) represents the position of a point whose X-axis value is XR1, Y-axis value is YR1, and Z-axis value is ZR1, together with a vector direction whose angle in the X axis-Z axis plane is θR1 and whose angle in the Y axis-Z axis direction is φR1.
In the information shown in FIG. 17, the virtual image display position formed by the reflected light whose ray position and ray direction are (XR1, YR1, ZR1, θR1, φR1) is (XV1, YV1, ZV1), and the virtual image display position formed by the reflected light whose ray position and ray direction are (XR2, YR2, ZR2, θR2, φR2) is (XV2, YV2, ZV2).
In FIG. 17, the light ray position and light ray direction of the reflected light are represented by one point in three-dimensional space and a vector direction; however, as shown in FIG. 7, an arbitrary two-dimensional plane may be assumed, and they may instead be represented by the crossing position of the light ray with that plane and the vector direction. In this case, for example, the light ray position and light ray direction (XR1, YR1, ZR1, θR1, φR1) in FIG. 17 can be represented by the crossing position (XR1, YR1) and the vector direction (θR1, φR1).
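A small sketch of this conversion between the two representations, with illustrative names:

import math

def ray_plane_crossing(x, y, z, theta, phi, z0):
    """Rewrite the point-plus-direction form (X, Y, Z, theta, phi) as the
    crossing position with the plane z = z0; theta is the angle in the
    X-Z plane and phi the angle in the Y-Z direction."""
    dz = z0 - z
    return x + dz * math.tan(theta), y + dz * math.tan(phi)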
The driver visually recognizes the virtual image by the reflected light of the combiner 103, whereas the light ray detection unit 131 detects the transmitted light of the combiner 103. For this reason, information representing the correspondence between the reflected light and the transmitted light of the combiner 103 is calculated in advance. For example, the correspondence between the ray position and ray direction of the reflected light and the ray position and ray direction of the transmitted light may be specified with reference to the optical characteristics of the combiner 103.
FIG. 18 is a schematic diagram showing an example of information representing the correspondence between the reflected light and the transmitted light of the combiner 103.
The light ray position and light ray direction of each of the reflected light and the transmitted light of the combiner 103 are represented by one point in three-dimensional space and a vector direction. For example, in FIG. 18, (XR1, YR1, ZR1, θR1, φR1) represents the light ray position and light ray direction of the reflected light, and (XT1, YT1, ZT1, θT1, φT1) represents the light ray position and light ray direction of the corresponding transmitted light.
Display position correspondence information, which represents the correspondence between virtual image display positions and the light ray positions and light ray directions of the transmitted light, is created from the information representing the correspondence between virtual image display positions and the reflected light and the information representing the correspondence between the reflected light and the transmitted light.
For example, FIG. 19 is a schematic diagram showing an example of the display position correspondence information.
The display position correspondence information shown in FIG. 19 is created from the information shown in FIG. 17 and the information shown in FIG. 18.
As described above, the display position correspondence information can be derived from the optical design values of the HUD 150. The display position correspondence information storage unit 133 shown in FIG. 10 stores such display position correspondence information, and by referring to it, the display position specifying unit 132 can specify the virtual image display position from the light ray position and light ray direction of the transmitted light detected by the light ray detection unit 131.
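One possible realization of this table reference is nearest-neighbour matching over the stored entries; the patent only states that the table is referenced, so the matching rule below is an assumption:

import numpy as np

def lookup_display_position(ray, table_rays, table_positions):
    """ray: detected transmitted-light entry (X, Y, Z, theta, phi);
    table_rays: (N, 5) stored transmitted-light entries; table_positions:
    (N, 3) corresponding virtual image display positions. In practice a
    weighted metric over positions and angles may be preferable."""
    d = np.linalg.norm(table_rays - np.asarray(ray), axis=1)
    return table_positions[np.argmin(d)]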
Although the above describes deriving the display position correspondence information from the optical design values of the HUD 150, more accurate display position correspondence information may be created by capturing images of several HUDs 150 in advance with the imaging unit 130 using both the transmitted light and the reflected light of the combiner 103, calculating the light ray positions and light ray directions, and creating the display position correspondence information from them.
The image distortion value calculation unit 134 is a distortion calculation unit that calculates an image distortion value representing the degree of image distortion from the virtual image display position specified by the display position specifying unit 132. Specifically, the image distortion value calculation unit 134 calculates, from the pixel position of an inspection point in the distortion inspection image serving as the original image and the virtual image display position corresponding to that inspection point, an image distortion value indicating the degree to which the distortion inspection image is distorted in the virtual image projected by the HUD 150. Pixels other than the inspection points may be interpolated from the image distortion values calculated at the inspection points.
In calculating the image distortion value, a virtual camera is placed at the arbitrary position shown in FIG. 3, and the pixel position of the image obtained when the virtual image is captured by that virtual camera is calculated. An example of the calculation is given by expression (3):

U = fC·XV1/ZV1,   V = fC·YV1/ZV1   (3)
Here, fC represents the focal length of the virtual camera, and (XV1, YV1, ZV1) represents the virtual image display position in three-dimensional space. (U, V) represents the pixel position in the image captured by the virtual camera, and this value is the image distortion value.
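Expression (3) as a one-line pinhole-projection helper; the function name is illustrative:

def image_distortion_value(f_c, xv, yv, zv):
    # Expression (3): projection of the virtual image point (XV1, YV1, ZV1)
    # by a virtual camera with focal length fC placed at the arbitrary position.
    return f_c * xv / zv, f_c * yv / zv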
As described above, it is possible to provide an image distortion inspection device that captures the distortion inspection image with the imaging unit 130 installed outside the vehicle 101, measures the light ray positions and light ray directions of the HUD light source that have been changed by the optical members of the HUD, and thereby inspects the degree of distortion of the image.
FIG. 20 is a flowchart showing the inspection method in the image distortion inspection device 110 according to the first embodiment.
First, the distortion inspection image 176 is displayed on the HUD light source 102, and the imaging unit 130 captures an image formed by the transmitted light that has passed through the combiner 103 (S10).
 Next, the light beam detection unit 131 calculates the light beam position and light beam direction of the transmitted light from the image captured in step S10 (S11). For example, when a ray space (light field) camera is used as the imaging unit 130, the light beam detection unit 131 calculates the light beam position and direction from the position at which the transmitted light reaches the imaging surface of the ray space camera.
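 As an illustration of this detection step, here is a minimal Python sketch of recovering one ray from a lenslet-based ray space camera under a simple two-plane model. The lenslet geometry, names, and parametrization are assumptions for illustration, not details given in the application; the returned (x, y, dx, dy) matches the key layout of the lookup sketch above.

```python
import numpy as np

def pixel_to_ray(lenslet_center, pixel_pos, lenslet_focal_len):
    """Recover the two-plane parametrization (x, y, dx, dy) of one ray:
    the lenslet center gives the point where the ray crosses the lenslet
    plane, and the offset of the sensor pixel behind that lenslet,
    divided by the lenslet-to-sensor distance, gives the ray slopes."""
    position = np.asarray(lenslet_center, dtype=float)
    slope = (np.asarray(pixel_pos, dtype=float) - position) / lenslet_focal_len
    return np.concatenate([position, slope])
```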
 Next, the display position specifying unit 132 specifies the virtual image display position from the light beam position and light beam direction calculated in step S11 (S12). For example, the display position specifying unit 132 refers to the display position correspondence information stored in the display position correspondence information storage unit 133 and specifies the virtual image display position corresponding to the light beam position and direction calculated in step S11.
 Next, the image distortion value calculation unit 134 calculates the image distortion value from the virtual image display position specified in step S12 (S13). For example, the image distortion value calculation unit 134 places a virtual camera at the arbitrary position shown in FIG. 3 and calculates the image distortion value by determining the pixel position of the image that would be obtained when that virtual camera captures the virtual image.
 As described above, it is possible to provide an image distortion inspection method that captures the distortion inspection image 176 with the imaging unit 130, a ray space camera installed outside the vehicle 101, measures the light beam position and direction of the HUD light source 102 as changed by the optical members of the HUD 150, and thereby inspects the degree of image distortion.
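 Pulling steps S10 to S13 together, the following schematic sketch shows the whole flow for one distortion inspection image. The function names, data shapes, and the pinhole projection in S13 are illustrative assumptions (reusing lookup_display_position from the earlier sketch), not the application's literal implementation.

```python
import numpy as np

def inspect_image_distortion(capture_fn, detect_rays_fn, f_c):
    """Schematic flow of steps S10-S13 for one distortion inspection image.

    capture_fn:     S10 - captures the ray space image of the transmitted light
    detect_rays_fn: S11 - returns (ray_positions, ray_directions) for the
                    inspection points, each of shape (N, 2) in the two-plane
                    parametrization used above
    f_c:            focal length of the virtual camera used in S13
    """
    image = capture_fn()                               # S10: capture
    ray_pos, ray_dir = detect_rays_fn(image)           # S11: detect rays
    display_pos = np.array([                           # S12: table lookup
        lookup_display_position(p, d) for p, d in zip(ray_pos, ray_dir)
    ])
    x, y, z = display_pos.T                            # (N,) each
    u = f_c * x / z                                    # S13: project onto the
    v = f_c * y / z                                    # virtual camera (eq. (3))
    return np.stack([u, v], axis=1)                    # image distortion values
```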
Embodiment 2.
 As shown in FIG. 1, the image distortion inspection system 200 according to Embodiment 2 includes an image distortion inspection device 210.
 As shown in FIG. 6, the hardware configuration of the image distortion inspection device 210 according to Embodiment 2 is the same as that of the image distortion inspection device 110 according to Embodiment 1. Embodiment 2 also uses the same HUD 150 as Embodiment 1.
 Embodiment 2 provides an image distortion inspection device 210 and an image distortion inspection method that improve the inspection accuracy of the image distortion value by changing the distortion inspection image over time.
 FIG. 21 is a block diagram schematically showing the functional configuration of the image distortion inspection device 210 according to Embodiment 2.
 The image distortion inspection device 210 includes an imaging unit 130, a light beam detection unit 131, a display position specifying unit 132, a display position correspondence information storage unit 133, an image distortion value calculation unit 134, an inspection image storage unit 235, a communication control unit 236, and a communication unit 237.
 The imaging unit 130, light beam detection unit 131, display position specifying unit 132, display position correspondence information storage unit 133, and image distortion value calculation unit 134 in Embodiment 2 are the same as the corresponding units in Embodiment 1.
 The functions of the inspection image storage unit 235 and the communication control unit 236 are implemented by the image processing device 111 in FIG. 6, and the function of the communication unit 237 is implemented by the wireless communication device 118 in FIG. 6.
 The inspection image storage unit 235 stores a plurality of distortion inspection images, each of which differs from the others.
 In Embodiment 2, the distortion inspection image displayed on the HUD light source 102 is changed as time elapses. For example, the displayed image may be switched over time among the distortion inspection images 276A to 276C shown in FIGS. 22(a) to 22(c), and the distortion of the boundary between the black image region and the white image region may be inspected. In other words, image distortion may be detected by using the pixels forming the boundary between the black image region and the white image region as inspection points.
 Alternatively, as in the distortion inspection images 276D to 276F shown in FIGS. 23(a) to 23(c), a single white point of one pixel may be displayed on a black background, and the position of the white point may be scanned vertically and horizontally across the image over time. In other words, image distortion may be detected by using the white point as the inspection point.
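 Generating such a scanning white-point sequence is straightforward; the sketch below is an illustrative assumption (panel size, scan order, and step are not specified by the application).

```python
import numpy as np

def white_point_images(width, height, step=1):
    """Yield black images with a single white pixel, scanning the point
    across the panel as in the images 276D-276F."""
    for y in range(0, height, step):
        for x in range(0, width, step):
            img = np.zeros((height, width), dtype=np.uint8)
            img[y, x] = 255  # one white inspection point
            yield img
```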
 The communication control unit 236 controls communication with the HUD 150. For example, the communication control unit 236 selects the distortion inspection images stored in the inspection image storage unit 235 one at a time in sequence as time elapses, and causes the communication unit 237 to transmit the selected image to the HUD 150. For example, the communication control unit 236 may cause the communication unit 237 to transmit one distortion inspection image at a time, in order, each time a predetermined time elapses.
 The communication unit 237 communicates with the HUD 150; here, the communication unit 237 performs wireless communication.
 The HUD 150 uses the image processing device 151 to display the distortion inspection image received by the wireless communication device 158 on the liquid crystal panel of the HUD light source 102. Then, to notify the image distortion inspection device 210 that the distortion inspection image has been displayed, the image processing device 151 of the HUD 150 causes the wireless communication device 158 to transmit an image display notification signal to the image distortion inspection device 210.
 The communication unit 237 of the image distortion inspection device 210 receives the image display notification signal. When the communication unit 237 receives this signal, the communication control unit 236 notifies the imaging unit 130 of the reception, and the imaging unit 130 executes the process of capturing the distortion inspection image after the image display notification signal has been received.
 As described above, the image distortion value of the HUD 150 can be inspected using distortion inspection images that change over time. Changing the distortion inspection image makes it possible to detect the light beam position and direction corresponding to each pixel of the liquid crystal panel of the HUD light source 102, so an image distortion inspection device 210 with improved image distortion value inspection accuracy can be provided.
 FIG. 24 is a flowchart illustrating the inspection method in the image distortion inspection device 210 according to Embodiment 2.
 First, the communication control unit 236 selects one distortion inspection image from the plurality of distortion inspection images stored in the inspection image storage unit 235 and reads the selected image (S20). For example, the inspection image storage unit 235 stores the plurality of distortion inspection images together with their selection order, and the communication control unit 236 selects and reads one distortion inspection image according to that order.
 Next, the communication control unit 236 causes the communication unit 237 to transmit the distortion inspection image read in step S20 to the HUD 150 so that the image is displayed on the liquid crystal panel of the HUD light source 102 (S21).
 Next, the communication control unit 236 confirms that the distortion inspection image transmitted in step S21 has been displayed on the liquid crystal panel of the HUD light source 102, by the communication unit 237 receiving the image display notification signal transmitted from the HUD 150 (S22). The process then proceeds to step S23.
 The processing in steps S23 to S26 in FIG. 24 is the same as the processing in steps S10 to S13 in FIG. 20 of Embodiment 1, except that after step S26 the process proceeds to step S27.
 In step S27, the communication control unit 236 determines whether all of the plurality of distortion inspection images stored in the inspection image storage unit 235 have been displayed. If all the distortion inspection images have been displayed (Yes in S27), the process ends; if distortion inspection images that have not yet been displayed remain (No in S27), the process returns to step S20.
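 The S20 to S27 loop can be summarized in the following schematic sketch; the interfaces (send, wait_display_notification, inspect_fn) are illustrative assumptions standing in for the communication and measurement steps described above.

```python
def run_inspection_sequence(storage, hud_link, inspect_fn):
    """Schematic loop of S20-S27.

    storage:    ordered list of distortion inspection images (S20)
    hud_link:   object with send(image) and wait_display_notification()
    inspect_fn: runs S23-S26 and returns the image distortion values
    """
    results = []
    for image in storage:                     # S20/S27: next image until done
        hud_link.send(image)                  # S21: transmit to the HUD
        hud_link.wait_display_notification()  # S22: confirm it is displayed
        results.append(inspect_fn(image))     # S23-S26: capture and evaluate
    return results
```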
 According to Embodiment 2, it is possible to provide an image distortion inspection method that captures, with the imaging unit 130 installed outside the vehicle 101, distortion inspection images that change over time, measures the light beam position and direction of the HUD light source 102 as changed by the optical members of the HUD 150, and thereby inspects the degree of image distortion.
Embodiment 3.
 As shown in FIG. 1, the image distortion inspection system 300 according to Embodiment 3 includes an image distortion inspection device 310.
 In Embodiment 3, even when the relative position between the image distortion inspection device 310 and the vehicle 101 is not fixed, the image distortion value of the HUD 150 can be inspected by acquiring the relative position between the image distortion inspection device 310 and the vehicle 101 and moving the image distortion inspection device 310 accordingly.
 The HUD 150 in Embodiment 3 is the same as the HUD 150 in Embodiment 1.
 FIG. 25 is a block diagram showing a hardware configuration example of the image distortion inspection device 310 according to Embodiment 3.
 The image distortion inspection device 310 includes an image processing device 311, an imaging device 117, a wireless communication device 118, a position measurement device 320, and a moving mechanism device 321.
 The imaging device 117 and the wireless communication device 118 in Embodiment 3 are the same as those in Embodiment 1.
 The image processing device 311 includes a CPU 112, a RAM 113, a ROM 114, a sensor IO 315, a wireless communication IO 116, and a movement control IO 319.
 The CPU 112, RAM 113, ROM 114, and wireless communication IO 116 in Embodiment 3 are the same as those in Embodiment 1.
 The sensor IO 315 is an interface for connecting the imaging device 117 and the position measurement device 320.
 The movement control IO 319 is an interface for controlling the moving mechanism device 321 in order to change the position of the imaging device 117. For example, the CPU 112 controls the direction and rotation of the wheels of the moving mechanism device 321 via the movement control IO 319.
 The position measurement device 320 measures the position of the vehicle 101. For example, the position measurement device 320 measures the position of the vehicle 101 using a laser sensor such as LiDAR (Light Detection and Ranging). Specifically, the position measurement device 320 irradiates the vehicle 101 with laser light and receives the light reflected from the vehicle 101, thereby measuring the position of the vehicle 101.
 The moving mechanism device 321 moves the image distortion inspection device 310 in order to change the position of the imaging device 117. For example, the moving mechanism device 321 includes four wheels (not shown) installed under the image distortion inspection device 310 and a motor (not shown) for driving the wheels. The CPU 112 moves the image distortion inspection device 310 by controlling the direction and rotation of the wheels via the movement control IO 319.
 FIG. 26 is a block diagram schematically showing the functional configuration of the image distortion inspection device 310 according to Embodiment 3.
 The image distortion inspection device 310 includes an imaging unit 130, a light beam detection unit 131, a display position specifying unit 132, a display position correspondence information storage unit 133, an image distortion value calculation unit 134, a position detection unit 338, a relative position acquisition unit 339, a movement control unit 340, and an imaging position changing unit 341.
 The imaging unit 130, light beam detection unit 131, display position specifying unit 132, display position correspondence information storage unit 133, and image distortion value calculation unit 134 in Embodiment 3 are the same as the corresponding units in Embodiment 1.
 As in Embodiment 2, Embodiment 3 may also include the inspection image storage unit 235, the communication control unit 236, and the communication unit 237, store a plurality of distortion inspection images, and transmit the distortion inspection images to the HUD 150 in order as time elapses.
 The function of the position detection unit 338 is implemented by the position measurement device 320 in FIG. 25. The functions of the relative position acquisition unit 339 and the movement control unit 340 are implemented by the image processing device 311 in FIG. 25. The function of the imaging position changing unit 341 is implemented by the moving mechanism device 321 in FIG. 25.
 The position detection unit 338 detects the vehicle position, which is the position of the vehicle 101. Specifically, the position detection unit 338 detects the vehicle position by measuring the outer shape of the vehicle 101 using LiDAR.
 The relative position acquisition unit 339 acquires, from the vehicle position detected by the position detection unit 338, the relative position between the image distortion inspection device 310 and the vehicle 101. The positional relationship between the vehicle 101 and the HUD 150 installed inside it is assumed to be known from the design values used when the HUD 150 was mounted. The relative position between the image distortion inspection device 310 and the vehicle 101 can thus be acquired.
 Although the position detection unit 338 has been described here as being implemented by the position measurement device 320 using LiDAR, the position detection unit 338 may instead be implemented by the imaging device 117, for example. Specifically, the relative position acquisition unit 339 may store image feature points of the vehicle 101 and their positions in advance, and calculate the relative position between the vehicle 101 and the imaging device 117 by detecting those image feature points in the image captured by the imaging device 117.
 An in-vehicle sensor (not shown) installed in the vehicle 101 may also be used for the position measurement. By attaching a calibration board or the like to the image distortion inspection device 310 and measuring it with the in-vehicle sensor, the position of the image distortion inspection device 310 is measured. The in-vehicle sensor transmits the measurement result to the image distortion inspection device 310, for example wirelessly. The relative position acquisition unit 339 can acquire the measurement result via the wireless communication device 118 and thereby obtain the relative position between the vehicle 101 and the image distortion inspection device 310.
 The movement control unit 340 calculates the amount of movement from the relative position acquired by the relative position acquisition unit 339 to a position predetermined for performing the inspection, and moves the image distortion inspection device 310 by instructing the imaging position changing unit 341.
 The imaging position changing unit 341 is a moving unit that, in response to the instruction from the movement control unit 340, moves the image distortion inspection device 310 by the amount of movement calculated by the movement control unit 340, thereby changing the installation position of the imaging device 117, in other words, the imaging position of the imaging device 117.
 The amount of movement of the image distortion inspection device 310 may be calculated, for example, by setting in advance a relative position between the image distortion inspection device 310 and the vehicle 101 that is suitable for the image distortion value inspection process, and taking the difference between that preset relative position and the relative position acquired by the relative position acquisition unit 339.
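 As a minimal sketch of this difference calculation, assuming a simple planar pose (x, y offset and heading) as an illustrative representation not specified by the application:

```python
import numpy as np

# Relative pose of the vehicle suitable for the inspection, fixed in
# advance (illustrative values: x, y offsets in meters, heading in rad).
TARGET_RELATIVE_POSE = np.array([2.0, 0.0, 0.0])

def movement_amount(measured_relative_pose):
    """S31: amount the inspection device must move, as the difference
    between the preset target relative pose and the measured one."""
    return TARGET_RELATIVE_POSE - np.asarray(measured_relative_pose)
```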
 As a result, even without manually aligning the image distortion inspection device 310 with the vehicle 101, the image distortion inspection device 310 automatically moves to a relative position suitable for the image distortion value inspection process.
 FIG. 27 is a flowchart illustrating the inspection method in the image distortion inspection device 310 according to Embodiment 3.
 First, the position detection unit 338 detects the position of the vehicle 101 by measuring the outer shape of the vehicle 101 on which the HUD 150 whose image distortion value is to be inspected is mounted (S30).
 Next, the relative position acquisition unit 339 acquires the relative position between the vehicle 101 and the image distortion inspection device 310 from the position of the vehicle 101 detected in step S30, and the movement control unit 340 calculates the amount of movement of the image distortion inspection device 310 by taking the difference between the acquired relative position and the relative position stored in advance (S31).
 Next, the imaging position changing unit 341 changes the imaging position by moving the image distortion inspection device 310 according to the amount of movement calculated in step S31 (S32).
 The processing in steps S33 to S36 in FIG. 27 is the same as the processing in steps S10 to S13 in FIG. 20 of Embodiment 1.
 As described above, even when the relative position between the image distortion inspection device 310 and the vehicle 101 is not fixed, an image distortion inspection method can be provided that inspects the image distortion value of the HUD 150 by acquiring the relative position between the image distortion inspection device 310 and the vehicle 101 and moving the image distortion inspection device 310.
 Although Embodiments 1 to 3 described above use the HUD 150 as an example of the image display device whose degree of image distortion is inspected, Embodiments 1 to 3 are not limited to this example and can be modified as appropriate without departing from their spirit.
 100, 200, 300 image distortion inspection system, 101 vehicle, 102 HUD light source, 103 combiner, 110, 210, 310 image distortion inspection device, 111, 311 image processing device, 112 CPU, 113 RAM, 114 ROM, 115 sensor IO, 116 wireless communication IO, 117 imaging device, 118 wireless communication device, 319 movement control IO, 320 position measurement device, 321 moving mechanism device, 130 imaging unit, 131 light beam detection unit, 132 display position specifying unit, 133 display position correspondence information storage unit, 134 image distortion value calculation unit, 235 inspection image storage unit, 236 communication control unit, 237 communication unit, 338 position detection unit, 339 relative position acquisition unit, 340 movement control unit, 341 imaging position changing unit, 150 HUD, 151 image processing device, 152 CPU, 153 RAM, 154 ROM, 155 display IO, 156 wireless communication IO, 157 image display device, 158 wireless communication device.

Claims (8)

  1.  An image distortion inspection device comprising:
     an imaging unit that captures, from outside a vehicle, an image formed by light output from an image display device installed inside the vehicle in order to project a virtual image based on an original image, in such a way that a light beam position, which is a position at which a light beam is input, and a light beam direction, which is a direction in which the light beam is input, can be determined;
     a detection unit that detects, from the captured image, the light beam position and the light beam direction of a light beam representing an inspection point included in the original image;
     a display position specifying unit that specifies, from the detected light beam position and the detected light beam direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and
     a distortion calculation unit that calculates, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a degree of distortion indicating the degree to which the original image is distorted in the virtual image.
  2.  The image distortion inspection device according to claim 1, wherein
     the imaging unit captures a ray space image as the image of the virtual image, and
     the detection unit detects, from the ray space image, the light beam position of the light beam representing the inspection point and the light beam direction of the light beam representing the inspection point.
  3.  The image distortion inspection device according to claim 1 or 2, further comprising
     an information storage unit that stores display position correspondence information associating a plurality of light beam positions and a plurality of light beam directions with a plurality of virtual image display positions, wherein
     the display position specifying unit specifies, by referring to the display position correspondence information, the virtual image display position associated with the detected light beam position and the detected light beam direction as the virtual image display position at which the point corresponding to the inspection point is displayed.
  4.  The image distortion inspection device according to any one of claims 1 to 3, further comprising:
     an inspection image storage unit that stores a plurality of distortion inspection images used as the original image; and
     a communication unit that transmits each of the plurality of distortion inspection images, as the original image, to the image display device in a predetermined order.
  5.  The image distortion inspection device according to claim 4, wherein
     the communication unit transmits the plurality of distortion inspection images one by one to the image display device each time a predetermined time elapses.
  6.  The image distortion inspection device according to any one of claims 1 to 5, further comprising:
     a position detection unit that detects a vehicle position, which is the position of the vehicle;
     a relative position acquisition unit that acquires, from the vehicle position, a relative position between the image distortion inspection device and the vehicle;
     a movement control unit that calculates an amount of movement from the relative position to a predetermined position; and
     a moving unit that moves the image distortion inspection device to the predetermined position in accordance with the amount of movement.
  7.  An image distortion inspection method comprising:
     capturing, from outside a vehicle, an image formed by light output from an image display device installed inside the vehicle in order to project a virtual image based on an original image, in such a way that a light beam position, which is a position at which a light beam is input, and a light beam direction, which is a direction in which the light beam is input, can be determined;
     detecting, from the captured image, the light beam position and the light beam direction of a light beam representing an inspection point included in the original image;
     specifying, from the detected light beam position and the detected light beam direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and
     calculating, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a degree of distortion indicating the degree to which the original image is distorted in the virtual image.
  8.  A program for causing a computer to function as:
     a detection unit that detects, from an image captured from outside a vehicle so that a light beam position, which is a position at which a light beam is input, and a light beam direction, which is a direction in which the light beam is input, can be determined, the image being formed by receiving light output from an image display device installed inside the vehicle in order to project a virtual image based on an original image, the light beam position and the light beam direction of a light beam representing an inspection point included in the original image;
     a display position specifying unit that specifies, from the detected light beam position and the detected light beam direction, a virtual image display position at which a point corresponding to the inspection point is displayed in the virtual image; and
     a distortion calculation unit that calculates, from a pixel position, which is the position of the pixel of the inspection point in the original image, and the specified virtual image display position, a degree of distortion indicating the degree to which the original image is distorted in the virtual image.
PCT/JP2018/027124 2018-07-19 2018-07-19 Image distortion inspection device, image distortion inspection method, and program WO2020016994A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027124 WO2020016994A1 (en) 2018-07-19 2018-07-19 Image distortion inspection device, image distortion inspection method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027124 WO2020016994A1 (en) 2018-07-19 2018-07-19 Image distortion inspection device, image distortion inspection method, and program

Publications (1)

Publication Number Publication Date
WO2020016994A1 true WO2020016994A1 (en) 2020-01-23

Family

ID=69163811

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/027124 WO2020016994A1 (en) 2018-07-19 2018-07-19 Image distortion inspection device, image distortion inspection method, and program

Country Status (1)

Country Link
WO (1) WO2020016994A1 (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011209457A (en) * 2010-03-29 2011-10-20 Denso Corp Method for manufacturing head-up display apparatus and virtual image adjustment device suitable for use for the manufacturing method
JP2015022013A (en) * 2013-07-16 2015-02-02 株式会社デンソー Inspection device
JP2017047794A (en) * 2015-09-02 2017-03-09 カルソニックカンセイ株式会社 Distortion correction method for head-up display and distortion correction device for head-up display using the same

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586263A (en) * 2020-03-27 2020-08-25 广东技术师范大学 Imaging quality detection method for automobile HUD virtual image
CN111586263B (en) * 2020-03-27 2024-05-14 广东技术师范大学 Imaging quality detection method for HUD virtual image of automobile

Similar Documents

Publication Publication Date Title
KR101787304B1 (en) Calibration method, calibration device, and computer program product
JP5615441B2 (en) Image processing apparatus and image processing method
WO2009141998A1 (en) Calibration method, calibration device, and calibration system having the device
US20160379066A1 (en) Method and Camera System for Distance Determination of Objects from a Vehicle
WO2016157799A1 (en) Method for adjusting position of vehicular display apparatus
JP7038346B2 (en) Camera parameter calculation method, camera parameter calculation program, and camera parameter calculation device
JP4414661B2 (en) Stereo adapter and range image input device using the same
JP6209833B2 (en) Inspection tool, inspection method, stereo camera production method and system
JP3804916B2 (en) Imaging system, program used for controlling image data thereof, method for correcting distortion of captured image in imaging system, and storage medium storing procedure thereof
WO2017179453A1 (en) Inspecting device and inspecting method
JP2015203652A (en) Information processing unit and information processing method
JP6791341B2 (en) Calibration method, calibration equipment, and program
EP2902967A1 (en) Stereo camera calibration method, disparity calculating device, and stereo camera
US11233961B2 (en) Image processing system for measuring depth and operating method of the same
JP5487946B2 (en) Camera image correction method, camera apparatus, and coordinate transformation parameter determination apparatus
JP6529360B2 (en) Image processing apparatus, imaging apparatus, image processing method and program
WO2020016994A1 (en) Image distortion inspection device, image distortion inspection method, and program
JP2008298589A (en) Device and method for detecting positions
US20230274444A1 (en) Measuring device, moving device, measuring method, and storage medium
JP2018044942A (en) Camera parameter calculation device, camera parameter calculation method, program and recording medium
JP2017194591A (en) Distance measurement device, imaging apparatus, and distance measurement method
CN113596441B (en) Optical axis adjusting device, method, system and readable storage medium
JP5727969B2 (en) Position estimation apparatus, method, and program
JP2020051903A (en) Stereo camera system and distance measurement method
JP6680335B2 (en) Stereo camera, vehicle, calculation method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927073

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18927073

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP