WO2014192347A1 - System and method for inspecting an optical sensor - Google Patents

System and method for inspecting an optical sensor (Système et procédé d'inspection d'un capteur optique)

Info

Publication number
WO2014192347A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical sensor
light
inspection
image data
screen
Prior art date
Application number
PCT/JP2014/054946
Other languages
English (en)
Japanese (ja)
Inventor
和宏 藤井
雄亮 増田
Original Assignee
本田技研工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 filed Critical 本田技研工業株式会社
Priority to JP2015519694A priority Critical patent/JP5897213B2/ja
Publication of WO2014192347A1 publication Critical patent/WO2014192347A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • the present invention relates to an inspection system and an inspection method for an optical sensor provided on a moving body.
  • detection light is emitted from the optical sensor provided in the vehicle toward the front of the vehicle.
  • the detection light is applied to the obstacle, and the obstacle is detected by the light sensor receiving the reflected light reflected from the obstacle.
  • A screen is irradiated with detection light from the optical sensor, and the irradiation range of the optical sensor is determined from the light pattern projected on the screen (see, for example, Patent Document 1).
  • An object of the present invention is to solve the above problems by providing an inspection system and an inspection method capable of detecting the positions of the light emitting portion and the light pattern of an optical sensor provided on a moving body, and thereby detecting the irradiation direction of the optical sensor.
  • The present invention provides an inspection system for an optical sensor provided on a moving body, comprising: a semi-transparent screen onto which detection light is emitted from the optical sensor; an imaging device that captures the light emitting unit of the optical sensor and the light pattern of the detection light that has passed through the screen; and an inspection device that receives captured image data of the light emitting unit and the light pattern from the imaging device.
  • It is desirable that the inspection device include a vertical angle detection unit that detects the vertical angle of the irradiation direction of the optical sensor from the positions of the light emitting unit and the light pattern in the captured image data.
  • It is also desirable that the inspection device include a left-right angle detection unit that detects the left-right angle of the irradiation direction of the optical sensor from the positions of the light emitting unit and the light pattern in the captured image data. With these configurations, the deviation (vertical angle, left-right angle) of the irradiation direction of the optical sensor from the reference angle can be detected.
  • When the inspection device includes a rotation angle detection unit that detects the rotation angle of the light pattern in the captured image data, the deviation (rotation angle) about the optical axis of the optical sensor from the reference angle can be detected.
  • It is desirable that the inspection device include a trapezoidal distortion correction unit that corrects the trapezoidal (keystone) distortion of the light pattern in the captured image data.
  • It is desirable that the screen be movable into and out of the movement area of the moving body and that the imaging device be disposed outside the movement area, so that the moving body can easily be moved to the next process.
  • It is desirable that the imaging device be installed on a gate-shaped support base provided in the area where the optical axis of the headlights of the moving body is inspected, and that the moving body be able to pass through the support base.
  • the imaging device is any one of an infrared camera, a color camera, and a monochrome camera.
  • The present invention also provides an inspection method comprising: a step of irradiating the screen with detection light from the optical sensor provided on the moving body; a step of photographing the light emitting unit and the light pattern with the imaging device; and a step of detecting the irradiation direction of the optical sensor from the positions of the light emitting unit and the light pattern in the captured image data.
  • It is desirable that the inspection device include a trapezoidal distortion correction unit, and that the inspection device detect the irradiation direction of the detection light after correcting the trapezoidal distortion of the light pattern in the captured image data.
  • It is desirable that the screen be raised so that the light emitting unit can be photographed by the imaging device and so that the moving body can be moved.
  • the moving body includes a sensor control device that controls the optical sensor.
  • It is desirable that a signal input device be connected to the sensor control device and that a normal pseudo signal, indicating that the optical sensor has not detected an obstacle, be input from the signal input device to the sensor control device.
  • The moving body includes a device control device that controls equipment mounted on the moving body, a sensor control device that controls the optical sensor, and a central control device that controls the device control device and the sensor control device.
  • the central control device has a normal movement mode in which the moving body performs a collision avoidance operation when an obstacle detection signal is input from the optical sensor to the sensor control device.
  • It is desirable that the central control device be switchable to an in-factory movement mode in which the moving body does not execute the collision avoidance operation even when the detection signal is input to the sensor control device.
  • By switching the central control device to the in-factory movement mode, the moving body can be prevented from executing a collision avoidance operation even if the optical sensor detects peripheral equipment or an operator. The moving body can therefore enter and leave the inspection area smoothly, and contact between the moving body and peripheral equipment, together with the resulting damage, can be prevented.
  • According to the present invention, the positions of the light emitting portion and the light pattern of the optical sensor are detected from the captured image data and the irradiation direction of the optical sensor is detected from the positional relationship between them, so the mounting position and mounting angle of the optical sensor can be adjusted accurately so that the optical sensor irradiates a predetermined range.
  • (a) is a diagram showing the captured image data before trapezoidal distortion correction;
  • (b) is a diagram showing the captured image data after trapezoidal distortion correction.
  • (a) is an explanatory diagram showing the determination of the vertical angle of the irradiation direction of the detection light;
  • (b) is an explanatory diagram showing the deviation of the vertical angle.
  • (a) is an explanatory diagram showing the determination of the left-right angle of the irradiation direction of the detection light;
  • (b) is an explanatory diagram showing the deviation of the left-right angle.
  • (a) is an explanatory diagram of the configuration for inputting a normal pseudo signal to the sensor control device;
  • (b) is an explanatory diagram of the configuration for switching the central control device to the in-factory movement mode.
  • An inspection system 10 and an inspection method for inspecting an optical sensor 2, which is used in the collision prevention device of a vehicle 1 (a four-wheeled vehicle; the "moving body" in the claims) to detect obstacles in front of the vehicle 1, will be described.
  • The optical sensor 2 is attached, for example, to the front end of the ceiling surface or to the inner surface of the windshield inside the cabin of the vehicle 1.
  • The optical sensor 2 has a light emitting portion 2a facing forward and emits infrared detection light (light in an invisible wavelength region) from the light emitting portion 2a toward the front of the vehicle 1. When an obstacle is irradiated with the detection light, the optical sensor 2 receives the light reflected from the obstacle.
  • With the inspection system 10, the mounting position of the optical sensor 2 can be detected.
  • The deviation of the vertical angle of the irradiation direction of the detection light emitted from the light emitting portion 2a of the optical sensor 2 from its reference value can also be detected.
  • As shown in FIG. 6A, the deviation of the left-right angle of the irradiation direction of the detection light from its reference value can be detected.
  • As shown in FIG. 8, the deviation of the rotation angle of the detection light from its reference value can be detected.
  • the inspection system 10 is provided in an area for inspecting the optical axis of the headlight of the vehicle 1 in the inspection process of the automobile production line.
  • The inspection system 10 includes a screen 20 disposed in front of the vehicle 1 placed on the facing device 3, a support base 30 also disposed in front of the vehicle 1, an imaging device 40 that photographs the optical sensor 2 and the screen 20, and an inspection device 50 (see FIG. 2) to which captured image data is input from the imaging device 40.
  • The facing device 3 includes four roller support portions 3a provided on the floor surface 4.
  • Each roller support portion 3a includes a pair of rollers 3b arranged front and rear, on which a wheel 1a of the vehicle 1 is placed.
  • By placing each wheel 1a of the vehicle 1 on a roller support portion 3a, the vehicle 1 is positioned in the front-rear direction. The wheels 1a can also be driven on the roller support portions 3a so that the instruments of the vehicle 1 can be tested.
  • The screen 20 is disposed in front of the windshield of the vehicle 1 so as to face the light emitting portion 2a of the optical sensor 2.
  • The front surface 21 of the screen 20 lies on a vertical line passing through the axle of the front wheels 1a of the vehicle 1.
  • the rear surface 22 of the screen 20 is irradiated with detection light from the optical sensor 2.
  • the screen 20 can be raised and lowered by a lifting device (not shown). By raising the screen 20, the screen 20 can be moved outside the moving region (a position higher than the vehicle 1) when the vehicle 1 is moved forward.
  • The screen 20 is a semi-transmissive screen through which the detection light irradiated on the rear surface 22 passes to the front surface 21.
  • When the detection light is irradiated onto the rear surface 22 of the screen 20, the light pattern 100 of the detection light is displayed on the front surface 21.
  • The light pattern 100, consisting of three rectangular figures, a central region 110 and left and right regions 120, is displayed on the front surface 21 of the screen 20.
  • The support base 30 is a gate-shaped frame in which an upper frame 32 is supported by four support columns 31 erected on the floor surface 4.
  • The vehicle 1 faces the support base 30.
  • The spacing of the left and right support columns 31 and the height of the upper frame 32 are set so that the vehicle 1 can pass through the support base 30 when it is moved forward.
  • the imaging device 40 is an infrared camera or a monochrome camera that can capture light in a region close to the wavelength region of visible light.
  • The imaging device 40 is attached to the support base 30 and photographs the light emitting portion 2a of the optical sensor 2 and the front surface 21 of the screen 20 from obliquely above. The imaging device 40 outputs the captured image data to an inspection device 50 (see FIG. 2) described later.
  • The imaging device 40 is suspended from the upper frame 32 of the support base 30 and is disposed outside the moving region (at a position higher than the vehicle 1) used when the vehicle 1 is moved forward.
  • The imaging device 40 may alternatively be installed on a gate-shaped frame on which an optical axis inspection device for inspecting the optical axis of the headlights of the vehicle 1 is installed.
  • The invisible light that the monochrome camera can capture lies in a region close to the visible wavelength region, and its amount is small.
  • If visible light is also captured, it overwhelms the weak detection light, so it is preferable to attach a visible-light cut filter in front of the light receiving unit of the monochrome camera.
  • a color camera that can capture light in a region close to the wavelength region of visible light may be used.
  • The imaging device 40 photographs the light pattern 100 (see FIG. 2) projected on the front surface 21 of the screen 20 with the screen 20 lowered, and photographs the light emitting unit 2a with the screen 20 raised.
  • The inspection device 50 includes: storage means 51 that stores the captured image data; trapezoidal distortion correction means 52 that corrects the trapezoidal distortion of the light pattern 100 in the captured image data; vertical angle detection means 53 that detects the vertical angle of the irradiation direction of the detection light; left-right angle detection means 54 that detects the left-right angle of the irradiation direction of the detection light; rotation angle detection means 55 that detects the rotation angle of the detection light; and determination means 56 that evaluates the irradiation direction and the rotation angle of the detection light.
  • The inspection device 50 is a computer that detects the irradiation direction and the rotation angle of the detection light. Each process in the inspection device 50 is realized by the CPU executing a program stored in the storage means 51.
  • The storage means 51 stores the captured image data and various programs. It also stores the height Ha between the reference point O set on the front surface 21 of the screen 20 and the floor surface 4 (see FIG. 5). The mounting position of the imaging device 40 is adjusted so that the reference point O is located at the center of the captured image data after trapezoidal distortion correction.
  • the storage means 51 stores a distance L1 between the light emitting unit 2a and the front surface 21 of the screen 20 (see FIG. 5). The distance L1 is different for each model of the vehicle 1, and the storage unit 51 stores the distances L1 of a plurality of models.
  • The storage means 51 also stores pixel distance data (resolution) that relates the vertical and horizontal size of one pixel of the captured image data to actual distances on the front surface 21 of the screen 20.
  • The pixel distance data indicates how many millimeters on the front surface 21 of the screen 20 correspond to one pixel in the captured image data.
  • The pixel distance data is calculated during calibration of the imaging device 40 by photographing four points set at predetermined intervals on the front surface 21 of the screen 20 and counting the number of pixels between two of these points in the captured image data (a sketch of this calibration is given after this section).
  • The trapezoidal distortion correction means 52 corrects the trapezoidal (keystone) distortion that appears in the images of the light emitting unit 2a and the light pattern 100 in the captured image data, producing corrected captured image data.
  • Because the imaging device 40 is disposed above the screen 20, it photographs the light emitting unit 2a and the light pattern 100 projected on the front surface 21 of the screen 20 from obliquely above. As a result, as shown in FIG. 3A, trapezoidal distortion occurs in the images of the light emitting unit 2a and the light pattern 100 in the captured image data.
  • The trapezoidal distortion correction means 52 corrects the distances from the reference point O (the center position of the captured image data) to the point group forming the light spot of the light emitting unit 2a and to the point group forming the light pattern 100, so that the positions of these point groups relative to the reference point O are represented correctly. As a result, as shown in FIG. 3B, the trapezoidal distortion is removed from the contours of the light emitting unit 2a and the light pattern 100 in the captured image data.
  • The trapezoidal distortion correction means 52 corrects the trapezoidal distortion of the light pattern 100 using a known image processing method (one possible realization is sketched after this section).
  • The vertical angle detection means 53 calculates the height H1 (see FIG. 5) between the light pattern 100 projected on the front surface 21 of the screen 20 and the floor surface 4 and the height H2 between the light emitting unit 2a and the floor surface 4, and detects the vertical angle θ1 (see FIG. 5) of the irradiation direction of the detection light.
  • In the captured image data after trapezoidal distortion correction, the number of pixels in the vertical direction (Y-axis direction) between the reference point O and the center position P1 of the central region 110 of the light pattern 100 is calculated.
  • By multiplying this number of pixels by the pixel distance data, the height Y1 from the reference point O of the screen 20 to the center position P1 of the light pattern 100 projected on the screen 20 is calculated, as shown in FIG. 5.
  • From this height Y1 and the height Ha of the reference point O, the height H1 between the center position P1 of the light pattern 100 projected on the screen 20 and the floor surface 4 is calculated.
  • Because the trapezoidal distortion of the light pattern 100 in the captured image data has been corrected by the trapezoidal distortion correction means 52, the height H1 between the center position P1 of the light pattern 100 projected on the screen 20 and the floor surface 4 can be calculated accurately and easily from the light pattern 100 in the captured image data.
  • the vertical angle detection means 53 calculates the number of pixels between the reference point O and the light emitting unit 2a in the vertical direction (Y-axis direction) in the captured image data after trapezoidal distortion correction. By multiplying the number of pixels by the pixel distance data, the height Y2 from the reference point O of the screen 20 to the light emitting unit 2a is calculated. By adding this height Y2 to the height Ha of the reference point O of the screen 20, the height H2 between the light emitting portion 2a and the floor surface 4 is calculated.
  • Using the distance L1 between the light emitting portion 2a and the front surface 21 of the screen 20 and the heights H1 and H2 described above, the vertical angle detection means 53 detects the vertical angle θ1 of the irradiation direction of the detection light with respect to the horizontal line according to Formula 1 below, and outputs the detection result to the determination means 56 (see FIG. 2). A sketch of the Formula 1 and Formula 2 calculations is given after this section.
  • θ1 = arctan((H2 - H1) / L1)   (Formula 1)
  • The left-right angle detection means 54 calculates the distance W1 (see FIG. 7) in the left-right direction between the light pattern 100 projected on the front surface 21 of the screen 20 and the center position of the vehicle 1 in the left-right direction (vehicle width direction), and the distance W2 (see FIG. 7) between the light emitting unit 2a and the center position of the vehicle 1 in the vehicle width direction, and detects the left-right angle θ2 (see FIG. 7) of the irradiation direction of the detection light.
  • As shown in FIG. 6B, the mounting position of the imaging device 40 has been adjusted so that the reference point O of the captured image data after trapezoidal distortion correction is located at the center position of the vehicle 1 in the vehicle width direction.
  • In the captured image data after trapezoidal distortion correction, the left-right angle detection means 54 calculates the number of pixels in the left-right direction (X-axis direction) between the reference point O and the center position P1 of the central region 110 of the light pattern 100.
  • By multiplying this number of pixels by the pixel distance data, the distance W1 in the left-right direction from the reference point O of the screen 20 to the center position P1 of the light pattern 100 projected on the screen 20 is calculated, as shown in FIG. 7.
  • The left-right angle detection means 54 likewise calculates the number of pixels in the left-right direction (X-axis direction) between the reference point O and the light emitting unit 2a in the captured image data after trapezoidal distortion correction. By multiplying this number of pixels by the pixel distance data, the distance W2 in the left-right direction from the reference point O of the screen 20 to the light emitting unit 2a is calculated.
  • Using the distance L1 between the light emitting portion 2a and the front surface 21 of the screen 20 and the distances W1 and W2, the left-right angle detection means 54 detects the left-right angle θ2 of the irradiation direction of the detection light with respect to the front-rear direction of the vehicle 1 according to Formula 2 below, and outputs the detection result to the determination means 56 (see FIG. 2).
  • θ2 = arctan((W1 - W2) / L1)   (Formula 2)
  • The rotation angle detection means 55 detects the rotation angle of the light pattern 100 about the optical axis with respect to the horizontal line. As shown in FIG. 9, in the captured image data after trapezoidal distortion correction, the rotation angle detection means 55 measures the angle between the upper side of the central region 110 of the light pattern 100 and the horizontal line, thereby detecting the rotation angle θ3 of the light pattern 100 about the optical axis with respect to the horizontal line, and outputs the detection result to the determination means 56 (see FIG. 2). A sketch of this measurement is given after this section.
  • The determination means 56 compares the vertical angle θ1 (see FIG. 5) of the irradiation direction of the detection light input from the vertical angle detection means 53 with the reference value of the vertical angle of the irradiation direction stored in advance in the storage means 51. As shown in FIG. 4A, the determination means 56 then determines whether the vertical angle of the irradiation direction of the detection light emitted from the optical sensor 2 is within an allowable range, and outputs the determination result to a display unit such as a monitor.
  • Likewise, the determination means 56 compares the left-right angle θ2 (see FIG. 7) of the irradiation direction of the detection light input from the left-right angle detection means 54 with the reference value of the left-right angle of the irradiation direction stored in advance in the storage means 51. As shown in FIG. 6A, it determines whether the left-right angle of the irradiation direction of the detection light emitted from the optical sensor 2 is within an allowable range, and outputs the determination result to the display unit.
  • The determination means 56 also compares the rotation angle θ3 (see FIG. 9) of the detection light about the optical axis input from the rotation angle detection means 55 with the reference value of the rotation angle about the optical axis stored in advance in the storage means 51. As shown in FIG. 8, it determines whether the rotation angle of the detection light about the optical axis is within an allowable range, and outputs the determination result to the display unit. A sketch of these tolerance checks is given after this section.
  • First, the vehicle 1 is placed on the facing device 3 so that it directly faces the support base 30 (step S1 in FIG. 10).
  • the screen 20 can be moved upward to prevent contact between the vehicle 1 and the screen 20.
  • Next, the screen 20 is lowered and disposed in front of the vehicle 1, and the detection light is irradiated from the light emitting unit 2a of the optical sensor 2 onto the rear surface 22 of the screen 20 (step S2 in FIG. 10).
  • the detection light applied to the rear surface 22 of the screen 20 is transmitted through the front surface 21, and the light pattern 100 of the detection light is displayed on the front surface 21.
  • the light pattern 100 projected on the front surface 21 of the screen 20 is photographed by the photographing device 40, and the photographed image data is stored in the storage means 51 of the inspection device 50 (step S3 in FIG. 10).
  • the screen 20 is raised, the light emitting unit 2a is photographed by the photographing device 40, and the photographed image data is stored in the storage means 51 of the inspection device 50 (step S4 in FIG. 10).
  • the trapezoidal distortion of the light pattern 100 of the captured image data stored in the storage unit 51 is corrected by the trapezoidal distortion correction unit 52 (step S5 in FIG. 10).
  • The vertical angle detection means 53 detects the height H2 (see FIG. 5) between the light emitting unit 2a and the floor surface 4 and the vertical angle θ1 (see FIG. 5) of the irradiation direction of the detection light with respect to the horizontal line, and outputs the detection results to the determination means 56 (step S6 in FIG. 10).
  • The left-right angle detection means 54 detects the distance W2 (see FIG. 7) between the light emitting portion 2a and the center position of the vehicle 1 in the left-right direction and the left-right angle θ2 (see FIG. 7) of the irradiation direction of the detection light with respect to the front-rear direction of the vehicle 1, and outputs the detection results to the determination means 56 (step S7 in FIG. 10).
  • The rotation angle detection means 55 detects the rotation angle θ3 (see FIG. 9) of the detection light about the optical axis with respect to the horizontal line, and outputs the detection result to the determination means 56 (step S8 in FIG. 10).
  • The determination means 56 displays the vertical and left-right angles of the irradiation direction of the detection light and the determination results on the display unit (step S9 in FIG. 10).
  • the worker adjusts the mounting position (mounting angle) of the optical sensor 2 based on the determination result of the determination unit 56 to irradiate the predetermined area with the detection light from the optical sensor 2.
  • After adjustment of the irradiation direction of the optical sensor 2 is completed, the vehicle 1 is moved below the screen 20 and through the support base 30 to the next process.
  • As described above, the positions of the light emitting unit 2a and the light pattern 100 of the optical sensor 2 are detected from the captured image data, and the irradiation direction of the optical sensor 2 can be detected from the positional relationship between the light emitting unit 2a and the light pattern 100, so the mounting position and mounting angle of the optical sensor 2 can be adjusted accurately so that the optical sensor 2 irradiates the predetermined range.
  • Since the inspection device 50 includes the trapezoidal distortion correction means 52 that corrects the trapezoidal distortion of the light pattern 100 in the captured image data, the accuracy of detecting the positions of the light emitting unit 2a and the light pattern 100, and hence the accuracy of detecting the irradiation direction of the optical sensor 2, can be increased.
  • Because the screen 20 can be moved into and out of the movement area of the vehicle 1 and the imaging device 40 is arranged outside that area, the vehicle 1 can easily be moved to the next process after the inspection of the optical sensor 2 is completed.
  • As shown in FIG. 11A, if the optical sensor 2 detects peripheral equipment or an operator during inspection of the optical sensor 2 and the vehicle 1 performs a collision avoidance operation such as a sudden stop, the vehicle 1 cannot enter or leave the inspection area A2. Furthermore, if the traveling vehicle 1 performs the collision avoidance operation in the inspection area A2, the vehicle 1 and the peripheral equipment may come into contact with each other.
  • Therefore, the inspection operator connects the signal input device 5a from outside the vehicle 1 to the sensor control device 5 of the vehicle 1 so that the vehicle 1 does not execute the collision avoidance operation.
  • the sensor control device 5 is an electronic control device that controls the optical sensor 2 mounted on the vehicle 1.
  • the optical sensor 2 and the sensor control device 5 are integrally formed, but the optical sensor 2 and the sensor control device 5 may be formed separately and connected by a cable.
  • The signal input device 5a is a portable information terminal used on the assembly line by an inspection operator or the like; it connects to an electronic control device mounted on the vehicle 1 and can, for example, read out information stored in an electronic storage medium of that control device. The signal input device 5a outputs to the sensor control device 5 a normal pseudo signal indicating, in a simulated manner, that the optical sensor 2 has not detected an obstacle.
  • When the normal pseudo signal is input, the sensor control device 5 determines that the optical sensor 2 has not detected an obstacle even if it actually has. As a result, the vehicle 1 enters a state in which the collision avoidance operation is not executed even if the optical sensor 2 detects an obstacle.
  • During the inspection, the normal pseudo signal is continuously input from the signal input device 5a to the sensor control device 5.
  • After the inspection, the vehicle 1 can be returned to the state in which it executes a collision avoidance operation when the optical sensor 2 detects an obstacle.
  • The central control device 6 is an electronic control device connected to the sensor control device 5 and to a plurality of device control devices 7 mounted on the vehicle 1 via a CAN (Controller Area Network) bus standardized by ISO. An abnormal signal indicating that the optical sensor 2 has detected an obstacle is input from the sensor control device 5 to the central control device 6.
  • the device control device 7 is an electronic control device that controls various devices such as an engine and a brake, and is, for example, a fuel injection control device 7a or a brake fluid pressure control device 7b.
  • The central control device 6 has a normal movement mode in which it causes the vehicle 1 to execute a collision avoidance operation when an abnormal signal is input from the sensor control device 5. During normal driving on public roads, the central control device 6 is set to the normal movement mode.
  • The central control device 6 also has an in-factory movement mode in which it does not cause the vehicle 1 to execute a collision avoidance operation even when an abnormal signal is input from the sensor control device 5. From the time the optical sensor 2 is assembled on the vehicle 1 until the vehicle 1 is shipped from the factory, the central control device 6 is switched from the normal movement mode to the in-factory movement mode (the mode logic is sketched after this section).
  • The control mode of the central control device 6 is switched by connecting a computer to the central control device 6 and using the computer to switch between the mode programs written in the electronic storage medium of the central control device 6.
  • In the in-factory movement mode, the vehicle 1 can be prevented from executing the collision avoidance operation, so the vehicle 1 can be moved smoothly within the factory. Contact between the vehicle 1 and peripheral equipment, and the resulting damage, can also be prevented.
  • The central control device 6 needs to be set to the in-factory movement mode at least while the optical sensor 2 is being inspected.
  • Before shipment, the computer connected to the central control device 6 is used to switch the mode program written in the electronic storage medium of the central control device 6 from the in-factory movement mode back to the normal movement mode. Alternatively, the in-factory movement mode program written in the electronic storage medium of the central control device 6 may be rewritten as a normal movement mode program.
  • the present invention has been described above, but the present invention is not limited to the above-described embodiment, and can be appropriately changed without departing from the spirit of the present invention.
  • In the embodiment described above, the light emitting unit 2a of the optical sensor 2 and the light pattern 100 projected on the front surface 21 of the screen 20 are photographed separately by the imaging device 40; however, the imaging device 40 may be installed so that the light emitting unit 2a and the light pattern 100 can be photographed at the same time.
  • In that case, the screen 20 may be lowered and the light pattern 100 projected on the screen 20 may be photographed by the imaging device 40.
  • Immediately after the inspection by the inspection system is completed, the vehicle 1 can then be moved in the traveling direction.
  • In the embodiment above, the case where the optical sensor 2 provided on the vehicle 1, a four-wheeled vehicle, is inspected using the inspection system and the inspection method of the present invention has been described.
  • However, the present invention can be applied to various types of moving bodies, such as two-wheeled vehicles and autonomous bipedal walking robots.
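
The following sketches illustrate, in Python, some of the calculations described in this section; all concrete names, coordinates, and numerical values are illustrative assumptions and are not taken from the patent. First, the pixel distance data (millimeters per pixel): it can be obtained from two calibration marks on the front surface 21 of the screen whose real spacing is known.

```python
def pixel_distance_mm_per_px(p_a, p_b, real_spacing_mm):
    """Return mm-per-pixel from two calibration marks with a known real spacing.

    p_a, p_b: (x, y) pixel coordinates of two marks on the screen front surface,
              taken from the (distortion-corrected) captured image.
    real_spacing_mm: actual distance between the marks on the screen, in mm.
    """
    dx = p_b[0] - p_a[0]
    dy = p_b[1] - p_a[1]
    pixel_spacing = (dx * dx + dy * dy) ** 0.5
    return real_spacing_mm / pixel_spacing

# Example: two marks 500 mm apart on the screen appear 400 px apart in the image.
resolution = pixel_distance_mm_per_px((120, 300), (520, 300), 500.0)
print(f"{resolution:.3f} mm per pixel")  # -> 1.250 mm per pixel
```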
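
The patent leaves the trapezoidal (keystone) correction to "a known image processing method". One common realization is a planar homography estimated from four reference marks on the screen, sketched below with OpenCV; the mark coordinates, the stand-in image, and the output size are assumptions, not values from the patent.

```python
import cv2
import numpy as np

def correct_keystone(image, src_marks_px, dst_marks_px, out_size):
    """Warp `image` so that four screen marks land on their ideal positions.

    src_marks_px: 4 (x, y) pixel positions of the marks as seen by the camera.
    dst_marks_px: 4 (x, y) positions the marks should have after correction
                  (e.g. a rectangle centred on the reference point O).
    out_size:     (width, height) of the corrected image.
    """
    src = np.asarray(src_marks_px, dtype=np.float32)
    dst = np.asarray(dst_marks_px, dtype=np.float32)
    homography = cv2.getPerspectiveTransform(src, dst)  # 3x3 perspective matrix
    return cv2.warpPerspective(image, homography, out_size)

# Stand-in for a captured frame; in practice this is the image from the imaging device 40.
raw = np.zeros((768, 1024, 3), dtype=np.uint8)
corrected = correct_keystone(
    raw,
    src_marks_px=[(210, 150), (830, 160), (900, 620), (140, 610)],  # trapezoid as seen
    dst_marks_px=[(160, 140), (880, 140), (880, 620), (160, 620)],  # target rectangle
    out_size=(1024, 768),
)
print(corrected.shape)  # (768, 1024, 3)
```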
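
Next, a sketch of the Formula 1 and Formula 2 calculations (vertical angle θ1 and left-right angle θ2). The variable names follow the description above (Ha, H1, H2, Y1, Y2, W1, W2, L1); the pixel offsets, resolution, and dimensions are assumed values used only to make the example runnable.

```python
import math

MM_PER_PX = 1.25   # pixel distance data from the calibration above (assumed)
HA_MM = 600.0      # height Ha of reference point O above the floor (assumed)
L1_MM = 2500.0     # distance L1: light emitting unit 2a to screen front surface 21 (assumed)

def to_mm(pixels):
    return pixels * MM_PER_PX

# Offsets measured in the distortion-corrected image (assumed values);
# positive y is upward from the reference point O, positive x is to the right.
y1_px = 40    # O -> pattern centre P1, vertical        -> height Y1
y2_px = 110   # O -> light emitting unit 2a, vertical   -> height Y2
w1_px = -24   # O -> pattern centre P1, horizontal      -> distance W1
w2_px = 8     # O -> light emitting unit 2a, horizontal -> distance W2

H1 = HA_MM + to_mm(y1_px)   # pattern centre height above the floor
H2 = HA_MM + to_mm(y2_px)   # light emitting unit height above the floor
theta1 = math.degrees(math.atan((H2 - H1) / L1_MM))                      # Formula 1
theta2 = math.degrees(math.atan((to_mm(w1_px) - to_mm(w2_px)) / L1_MM))  # Formula 2

print(f"vertical angle   theta1 = {theta1:+.2f} deg")
print(f"left-right angle theta2 = {theta2:+.2f} deg")
```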
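
A sketch of the rotation-angle measurement θ3: the angle between the upper edge of the central region 110 and the horizontal line. The edge endpoints are assumed to have been extracted from the corrected image beforehand.

```python
import math

def rotation_angle_deg(edge_left_px, edge_right_px):
    """Angle of the pattern's upper edge relative to the horizontal, in degrees.

    The endpoints are (x, y) pixel positions in the distortion-corrected image,
    where y grows downward; dy is negated so that a counter-clockwise tilt of
    the pattern gives a positive angle.
    """
    dx = edge_right_px[0] - edge_left_px[0]
    dy = edge_right_px[1] - edge_left_px[1]
    return math.degrees(math.atan2(-dy, dx))

theta3 = rotation_angle_deg((400, 310), (624, 302))  # assumed edge endpoints
print(f"rotation angle theta3 = {theta3:+.2f} deg")   # about +2.0 deg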
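
A sketch of the determination step performed by the determination means 56: each detected angle is compared with a stored reference value and an allowable range. The reference values and tolerances below are illustrative assumptions.

```python
REFERENCES = {            # reference angle and allowed deviation, in degrees (assumed)
    "vertical":   (0.0, 0.5),
    "left_right": (0.0, 0.5),
    "rotation":   (0.0, 1.0),
}

def judge(angles_deg):
    """Return {name: (deviation_deg, within_allowable_range)} for measured angles."""
    results = {}
    for name, measured in angles_deg.items():
        reference, tolerance = REFERENCES[name]
        deviation = measured - reference
        results[name] = (deviation, abs(deviation) <= tolerance)
    return results

measured = {"vertical": 2.00, "left_right": -0.92, "rotation": 2.05}
for name, (deviation, ok) in judge(measured).items():
    print(f"{name:10s} deviation {deviation:+.2f} deg -> {'OK' if ok else 'ADJUST'}")
```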
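
Finally, a minimal sketch of the mode handling of the central control device 6 described above: in the normal movement mode an obstacle (abnormal) signal triggers a collision avoidance operation, while in the in-factory movement mode it is ignored. The class and method names are assumptions for illustration only; the real unit is an ECU whose mode program is switched over CAN by an external computer.

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()       # public-road driving: collision avoidance enabled
    IN_FACTORY = auto()   # from sensor assembly until shipment: avoidance suppressed

class CentralController:
    def __init__(self, mode=Mode.NORMAL):
        self.mode = mode

    def on_sensor_signal(self, obstacle_detected: bool) -> str:
        """Decide the reaction to a signal received from the sensor control device."""
        if obstacle_detected and self.mode is Mode.NORMAL:
            return "collision_avoidance"   # e.g. brake via the device control devices 7
        return "no_action"                 # in-factory mode ignores obstacle signals

controller = CentralController(Mode.IN_FACTORY)
print(controller.on_sensor_signal(obstacle_detected=True))   # -> no_action
controller.mode = Mode.NORMAL
print(controller.on_sensor_signal(obstacle_detected=True))   # -> collision_avoidance
```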

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)
  • Image Input (AREA)

Abstract

The present invention relates to an inspection system (10) for an optical sensor (2) mounted on a vehicle (1). The inspection system comprises: a semi-transparent screen (20) to be irradiated with detection light from the optical sensor (2); an imaging device (40) for photographing a light emitting section (2a) of the optical sensor (2) and a light pattern (100) of the detection light that has passed through the screen (20); and an inspection device (50) that receives the captured image data of the light emitting section (2a) and the light pattern (100) from the imaging device (40). With this configuration, the irradiation direction of the optical sensor (2) can be detected based on the positional relationship between the light emitting section (2a) and the light pattern (100).
PCT/JP2014/054946 2013-05-31 2014-02-27 Système et procédé d'inspection d'un capteur optique WO2014192347A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015519694A JP5897213B2 (ja) 2013-05-31 2014-02-27 Optical sensor inspection system and optical sensor inspection method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013115879 2013-05-31
JP2013-115879 2013-05-31

Publications (1)

Publication Number Publication Date
WO2014192347A1 true WO2014192347A1 (fr) 2014-12-04

Family

ID=51988391

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/054946 WO2014192347A1 (fr) 2013-05-31 2014-02-27 Système et procédé d'inspection d'un capteur optique

Country Status (2)

Country Link
JP (1) JP5897213B2 (fr)
WO (1) WO2014192347A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018194324A (ja) * 2017-05-12 2018-12-06 株式会社バンザイ Vehicle dimension measuring device
WO2018226390A1 (fr) 2017-06-09 2018-12-13 Waymo Llc Systèmes et procédés d'alignement d'optique de lidar
EP3588001A1 (fr) * 2018-06-21 2020-01-01 Mahle Aftermarket Italy S.p.A. Système et procédé d'étalonnage d'un capteur optique monté à bord d'un véhicule
WO2020214427A1 (fr) * 2019-04-17 2020-10-22 Waymo Llc Dispositif de mesure de synchronisation à capteurs multiples

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6438617A (en) * 1987-08-04 1989-02-08 Yaskawa Denki Seisakusho Kk Measuring apparatus of light distribution
JPH06221960A (ja) * 1993-01-26 1994-08-12 Honda Motor Co Ltd Method for measuring light source position and optical axis of a headlight, and measuring device therefor
JP4061822B2 (ja) * 2000-06-26 2008-03-19 松下電工株式会社 Method for measuring characteristics of an infrared module
JP2003270036A (ja) * 2002-03-14 2003-09-25 Mitsubishi Electric Corp Infrared imaging device
JP2009067347A (ja) * 2007-09-18 2009-04-02 Koito Mfg Co Ltd Optical axis inspection method and optical axis inspection device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018194324A (ja) * 2017-05-12 2018-12-06 株式会社バンザイ Vehicle dimension measuring device
WO2018226390A1 (fr) 2017-06-09 2018-12-13 Waymo Llc Systèmes et procédés d'alignement d'optique de lidar
EP3610288A4 (fr) * 2017-06-09 2020-12-23 Waymo LLC Systèmes et procédés d'alignement d'optique de lidar
EP3588001A1 (fr) * 2018-06-21 2020-01-01 Mahle Aftermarket Italy S.p.A. Système et procédé d'étalonnage d'un capteur optique monté à bord d'un véhicule
WO2020214427A1 (fr) * 2019-04-17 2020-10-22 Waymo Llc Dispositif de mesure de synchronisation à capteurs multiples
CN113677584A (zh) * 2019-04-17 2021-11-19 伟摩有限责任公司 多传感器同步测量设备
US11269066B2 (en) 2019-04-17 2022-03-08 Waymo Llc Multi-sensor synchronization measurement device
CN113677584B (zh) * 2019-04-17 2022-09-06 伟摩有限责任公司 多传感器同步测量设备

Also Published As

Publication number Publication date
JP5897213B2 (ja) 2016-03-30
JPWO2014192347A1 (ja) 2017-02-23

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14803877

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015519694

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14803877

Country of ref document: EP

Kind code of ref document: A1