WO2012011187A1 - Distance Measuring Device and Distance Measuring Method - Google Patents
Distance Measuring Device and Distance Measuring Method
- Publication number
- WO2012011187A1 (PCT/JP2010/062404)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- lens
- imaging
- image
- light
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0025—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration
Definitions
- the present invention relates to a distance measuring device that measures the distance to a measurement object present in the surrounding environment, particularly in a traffic environment, based on optical detection of that object, and to a distance measuring method suitable for use with such a device.
- conventional distance measuring devices in practical use measure the distance to a measurement object based on optical detection of light selected from visible and invisible light.
- such a distance measuring device is mounted on, for example, a vehicle as a moving body, and measures the distance (relative distance) between the host vehicle (that is, the distance measuring device itself) and a measurement target such as another vehicle.
- the distance measuring device provides the distance measured in this way to a driving support device as one item of driving support information, for example for supporting collision avoidance with other vehicles.
- the distance measuring device described in Patent Document 1 includes a light source that projects a predetermined pattern of light having different wavelengths onto the measurement target, and captures an image of the projected light pattern from a direction different from the optical axis of the light source.
- the distance measuring device measures the distance to the measurement object based on the change between the projected light pattern and the captured light pattern.
- the distance measuring device described in Patent Document 1 must therefore include a light source that projects light intense enough for the pattern on the measurement target to be imaged.
- since the light source must project an imageable pattern of light onto a measurement target that may be tens to hundreds of meters from the host vehicle, the energy it consumes is not negligible.
- Patent Document 2 discloses a distance measuring device that does not use a light source.
- in this distance measuring device, a total of two cameras, one sensitive to the visible spectral range and one sensitive to the infrared spectral range, are arranged at a predetermined interval from each other.
- the distance measuring device measures the distance to the measurement target by applying a triangulation method to the same measurement target image captured by each camera.
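The two-camera approach of Patent Document 2 rests on the standard stereo triangulation relation, distance = baseline × focal length / disparity. A minimal sketch follows; the baseline, focal length, and disparity values are illustrative, not taken from the patent.

```python
def stereo_distance(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Classic stereo triangulation: distance = baseline * focal / disparity.
    All parameter values used below are illustrative assumptions."""
    return baseline_m * focal_px / disparity_px

# e.g. a 0.5 m camera baseline, an 800 px focal length, and a 10 px
# disparity between the two images give a distance of 40.0 m
print(stereo_distance(0.5, 800.0, 10.0))
```

Note that the accuracy of this relation depends directly on holding the baseline fixed, which is precisely the difficulty on a vibrating vehicle body that the patent points out.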
- since the distance measuring device described in Patent Document 2 does not require a special light source, its energy consumption is certainly small.
- however, because a distance measuring device mounted on a vehicle is affected by vibration and distortion of the vehicle body, it is not easy to maintain the predetermined interval between the two body-mounted cameras with high accuracy. Thus, particularly when the distance measuring device is mounted on a vehicle, there is still room for improvement from the standpoint of simplifying the configuration.
- the present invention has been made in view of such circumstances, and its object is to provide a distance measuring device capable of measuring the distance to a measurement object with a simple configuration even when mounted on a vehicle or the like, and a distance measuring method suitable for use in that distance measuring device.
- a distance measuring device measures a target distance as a distance to the measuring object by optically detecting the measuring object.
- the distance measuring device includes a lens whose optical axis points in a direction different from the traveling direction of incident light from the measurement target, and the lens is configured to form an image of the measurement target by imaging the incident light.
- the distance measuring device further includes imaging relative amount calculating means that obtains, for each of a plurality of wavelengths of the incident light, an imaging position indicating the position of the image with respect to the lens, and calculates an imaging relative amount as an amount indicating the relative relationship between those imaging positions; storage means that stores correlation information, determined by the chromatic aberration characteristics of the lens and the direction of the optical axis, which expresses the correlation between the imaging relative amount and the target distance; and distance calculating means that calculates the target distance by collating the imaging relative amount against the correlation information.
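The claimed flow, per-wavelength imaging positions, an imaging relative amount, and collation against stored correlation information, can be sketched as follows. The function names, units, and map values are illustrative assumptions, not taken from the patent.

```python
import bisect

# Hypothetical pre-stored correlation information: pairs of
# (imaging relative amount in mm, target distance in m), sorted by the
# relative amount.  The values are made up for illustration.
CORRELATION_MAP = [(0.10, 100.0), (0.20, 50.0), (0.40, 20.0), (0.80, 10.0)]

def imaging_relative_amount(pos_short_mm: float, pos_long_mm: float) -> float:
    """Relative amount between the imaging positions of two wavelengths --
    here, simply their positional difference on the imaging surface."""
    return abs(pos_short_mm - pos_long_mm)

def target_distance(relative_amount_mm: float) -> float:
    """Collate the relative amount against the stored map, interpolating
    linearly between neighbouring entries; clamp outside the map range."""
    keys = [k for k, _ in CORRELATION_MAP]
    i = bisect.bisect_left(keys, relative_amount_mm)
    if i == 0:
        return CORRELATION_MAP[0][1]
    if i == len(CORRELATION_MAP):
        return CORRELATION_MAP[-1][1]
    (k0, d0), (k1, d1) = CORRELATION_MAP[i - 1], CORRELATION_MAP[i]
    t = (relative_amount_mm - k0) / (k1 - k0)
    return d0 + t * (d1 - d0)
```

For example, `target_distance(imaging_relative_amount(2.0, 1.6))` collates a 0.4 mm shift against the map and returns roughly 20 m under these made-up values.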
- with such a configuration, because the lens has its optical axis in a direction different from the traveling direction of the incident light, the imaging position differs for each wavelength, and the imaging relative amounts between the plural imaging positions are detected as amounts that vary with the target distance. In other words, the distance measuring device can measure the distance to the measurement object from these imaging relative amounts. An ordinary lens has a different refractive index for each wavelength of light, that is, it exhibits chromatic aberration, so when light containing plural wavelengths is imaged, the imaging position differs for each wavelength. Moreover, when the optical axis of the lens is tilted with respect to the traveling direction of the incident light, the lens refracts the incident light toward the direction of its optical axis.
- because the light is refracted with a wavelength-dependent refractive index, the imaging positions of the respective wavelengths differ from each other in the direction perpendicular to the traveling direction of the incident light (the horizontal or vertical direction of the lens).
- the incident angle of the light on the lens changes as the target distance, that is, the distance between the lens and the measurement object, changes along the front-rear direction of the lens. For this reason, when the target distance changes, the imaging position even of single-wavelength light changes. The distance measuring device can therefore measure the distance to the measurement object from the relative relationship between the imaging positions of the respective wavelengths.
- although the imaging positions of the respective wavelengths differ from each other in the direction perpendicular to the traveling direction of the incident light, light of every wavelength is imaged, without hindrance to detection, on the common imaging plane that is ordinarily provided facing the lens. The imaging plane can therefore detect the imaging position of each wavelength. That is, since there is almost no need to move the imaging plane to detect the imaging positions, a mechanism for moving the imaging plane, for example, is unnecessary, and the imaging position of each wavelength can be detected with a simple configuration.
- the wavelength-dependent difference in imaging position due to chromatic aberration is obtained by detecting the imaging position of each wavelength through a common lens (a common optical system). Distance measurement can therefore be performed with a single optical system, that is, a single camera. Compared with using plural cameras, this increases the freedom of camera placement; moreover, there is no need to maintain camera positions with high accuracy, so the configuration of the distance measuring device can be simplified.
- ordinary lenses are often corrected for chromatic aberration, but only for light of the wavelengths to be acquired; for imaging, for example, they are often configured so that the imaging positions of the red, blue, and green wavelengths coincide.
- light of wavelengths left uncorrected for chromatic aberration can thus be used, which increases the degree of freedom in selecting the wavelengths used by the distance measuring device and in its design, and likewise increases the degree of freedom in selecting and designing the optical system employed in the distance measuring device.
- the light has two wavelengths whose imaging positions are different from each other, and the correlation information constitutes map data in which the imaging relative amount is associated with the target distance.
- the distance measuring device can measure the distance to the measurement object based on light of two wavelengths whose imaging positions by the lenses are different from each other. As described above, if the light has two or more wavelengths, the distance measuring device can measure the distance to the measurement target, and thus the distance measurement can be easily performed.
- the imaging relative amount may be an imaging position difference as a difference between imaging positions of the two wavelengths. According to such a configuration, since the relative imaging amount is detected as the difference between the imaging positions of the two wavelengths of light, the calculation for detection is simple.
- the imaging relative amount may be an imaging position ratio as a ratio between the imaging positions of the two wavelengths. Even with such a configuration, calculation for detection is simple.
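Either relative amount is a one-line computation; a sketch with made-up imaging positions (in mm on the imaging surface):

```python
def imaging_position_difference(pos_a_mm: float, pos_b_mm: float) -> float:
    """Imaging position difference between the two wavelengths."""
    return abs(pos_a_mm - pos_b_mm)

def imaging_position_ratio(pos_a_mm: float, pos_b_mm: float) -> float:
    """Imaging position ratio between the two wavelengths (pos_b_mm != 0)."""
    return pos_a_mm / pos_b_mm
```

Both variants reduce the pair of imaging positions to a single scalar that can be collated against the stored correlation information; the difference is slightly simpler, while the ratio is insensitive to a common scale factor.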
- the optical axis of the lens may be tilted with respect to the traveling direction of the incident light.
- the distance measuring device can measure the distance to the measurement object based on the difference between the imaging positions. For example, in the case of a general convex lens, when the lens is arranged to be inclined with respect to the traveling direction of incident light, the optical axis of the lens is tilted with respect to the traveling direction of incident light. In this way, the lens arrangement mode and lens characteristics in the distance measuring device can be simplified.
- the surface of the lens may be non-rotational symmetric with respect to the optical axis of the lens.
- the optical axis of the lens can be tilted by configuring the lens surface to be non-rotational symmetric with respect to the optical axis of the lens. Therefore, by adjusting the surface shape of the lens, the optical axis of the lens can be tilted according to the wavelength of the emitted light and the distance to the measurement target. Therefore, the degree of freedom in selecting and designing the lens used in the distance measuring device is improved.
- the refractive index of the lens may be non-rotational symmetric with respect to the optical axis of the lens.
- the inclination of the optical axis of the lens can be adjusted by making the refractive index of the lens non-rotational symmetric with respect to the optical axis. Therefore, the optical axis of the lens can be tilted so as to correspond to the wavelength of the emitted light and the distance to the measurement target. This also improves the degree of freedom in selecting and designing the lens used in the distance measuring device.
- the lens is preferably part of a spectrum sensor that detects light from the measurement target. With such a configuration, light of plural, arbitrarily chosen wavelengths can be detected using the spectrum sensor, so many imaging relative amounts can be calculated from the imaging positions of the light at the detected wavelengths; measuring the distance from many imaging relative amounts raises the distance measurement accuracy.
- since the spectrum sensor inherently has a high degree of freedom in wavelength selectivity, it becomes easy to appropriately select light of a wavelength suited to distance measurement according to the surrounding environment, ambient light, and the like.
- since the spectrum sensor can detect light of a plurality of wavelengths in the first place, the distance measuring device is easy to configure; that is, an existing spectrum sensor can be used as the distance measuring device.
- the present invention provides a distance measuring method for measuring a target distance as a distance to the measuring object by optically detecting the measuring object.
- the method includes an imaging position calculation step of forming an image of the measurement object with a lens whose optical axis points in a direction different from the traveling direction of incident light from the measurement object, and obtaining, for each of a plurality of wavelengths of the incident light, an imaging position indicating the position of the image with respect to the lens; an imaging relative amount calculation step of calculating an imaging relative amount as an amount indicating the relative relationship between the imaging positions; and a distance calculation step of calculating the target distance by collating the imaging relative amount with correlation information, determined by the chromatic aberration characteristics of the lens and the direction of the optical axis, which expresses the correlation between the imaging relative amount and the target distance.
- according to this method, the lens having its optical axis in a direction different from the traveling direction of the incident light makes the imaging positions of the respective wavelengths differ from each other, and from these per-wavelength imaging positions the imaging relative amounts between the plural imaging positions are detected as amounts that vary with the target distance.
- the distance measuring method can measure the distance to the measurement object based on different image forming relative amounts.
- a normal lens has a different refractive index for each wavelength of light, that is, causes chromatic aberration. Therefore, when light having a plurality of wavelengths is imaged, the imaging position differs for each wavelength.
- the lens refracts the incident light so as to tilt toward the direction of the optical axis. Since the light is refracted by the refractive index for each wavelength in this way, the imaging positions of the light for each wavelength are different from each other in the direction perpendicular to the traveling direction of the incident light (the horizontal direction and the vertical direction of the lens).
- the incident angle of the light to the lens differs as the object distance, which is the distance between the lens and the measurement object, changes along the front-rear direction of the lens. For this reason, when the target distance changes, the imaging position of the single wavelength light also changes. Therefore, according to the distance measuring method, the distance to the measurement object is measured based on the relative relationship between the imaging positions of the respective wavelengths.
- although the imaging positions of the respective wavelengths differ from each other in the direction perpendicular to the traveling direction of the incident light, light of every wavelength is imaged, without hindrance to detection, on the common imaging surface that is ordinarily provided facing the lens. The imaging surface can therefore detect the imaging position of each wavelength. That is, since there is almost no need to move the imaging surface along the front-rear direction of the lens to detect the imaging positions, a mechanism for moving the imaging surface is unnecessary, and the distance measuring method can detect the imaging position of each wavelength with a simple configuration.
- the distance measuring method obtains the wavelength-dependent imaging position difference due to chromatic aberration from the imaging positions of the respective wavelengths detected through a common lens, that is, a common optical system. Distance measurement can therefore be performed with a single optical system, that is, a single camera, which increases the freedom of camera placement in an apparatus employing the method, compared with methods that require plural cameras.
- ordinary lenses are often corrected for chromatic aberration, but only for light of the wavelengths to be acquired; for imaging, for example, they are configured so that the imaging positions of the red, blue, and green wavelengths coincide.
- light having a wavelength that has not been corrected for chromatic aberration can be used for distance measurement, which increases the degree of freedom in selecting and designing the wavelength used in the distance measurement method.
- the degree of freedom in selecting and designing the optical system in an apparatus in which the distance measuring method is adopted is also increased.
- the incident light may have two wavelengths, and the imaging position calculation step may determine the imaging position for each of the two wavelengths.
- the correlation information may be acquired from map data in which the imaging relative amount is associated with the target distance.
- with two wavelengths the distance measurement method can thus measure the distance to the measurement target, so distance measurement can be performed easily.
- FIG. 1 is a block diagram showing a system configuration in which a spectrum measuring apparatus according to an embodiment embodying a distance measuring apparatus of the present invention is mounted on a moving body.
- FIG. 2 is a schematic diagram showing the general configuration of the optical system used in the spectrum measuring apparatus of FIG. 1.
- FIGS. 3A to 3C are schematic views showing image forming positions where the optical system of FIG. 2 forms an image of the measurement object.
- FIG. 3A is a diagram illustrating the imaging position when the measurement target is far away.
- FIG. 3B is a diagram illustrating the imaging position when the measurement target is closer to the lens than in FIG. 3A.
- FIG. 3C is a diagram illustrating the imaging position when the measurement target is closer to the lens than in FIG. 3B.
- FIGS. 4A and 4B are schematic views illustrating how the optical system of FIG. 2 projects the same measurement target onto the imaging surface with light of different wavelengths.
- FIG. 5 is a graph showing the relationship between the shift amount detected by the spectrum measuring apparatus of FIG. 1 and the distance to the measurement target.
- FIG. 6 is a flowchart showing the procedure by which the spectrum measuring apparatus of FIG. 1 measures distance.
- FIG. 7 is a schematic diagram showing the general configuration of the optical system of a spectrum measuring device embodying the distance measuring device of the present invention according to another embodiment.
- FIGS. 1 to 6 illustrate a spectrum measuring apparatus 11 according to an embodiment embodying the distance measuring apparatus of the present invention: FIGS. 1 to 5 show the system configuration of the spectrum measuring apparatus 11, and FIG. 6 shows a flowchart.
- FIG. 1 is a block diagram showing a system configuration of a spectrum measuring apparatus 11 mounted on a vehicle 10 as a moving body.
- a driving support device under consideration for practical use in vehicles such as automobiles recognizes pedestrians and other vehicles based on spectrum data measured by a vehicle-mounted spectrum sensor, in order to support the driver's driving and decision making.
- the spectrum measuring apparatus 11 shown in FIG. 1 is configured to recognize a measurement object by acquiring optical information, including visible and invisible light, from outside the vehicle, and to measure the distance between itself and the measurement object. The vehicle 10 further includes a human machine interface 12 that conveys the recognition information and distance information output by the spectrum measuring apparatus 11 to the passengers of the vehicle 10, and a vehicle control device 13 that reflects that information in vehicle control. Since the spectrum measuring apparatus 11 recognizes the measurement object by a known method, redundant description of the recognition-related configuration of the spectrum measuring apparatus 11 and of the recognition processing is omitted in this embodiment for convenience.
- the human machine interface 12 communicates the vehicle state and the like to the passengers, particularly the operator, through light, color, sound, and so on. The human machine interface 12 is a known interface device that is also provided with operating devices, such as push buttons or a touch panel, through which a passenger's intentions are input.
- the vehicle control device 13, one of the various control devices mounted on the vehicle 10, can exchange necessary information with the other control devices mounted on the vehicle, such as the engine control device; these devices are interconnected directly or indirectly via an in-vehicle network.
- when the vehicle control device 13 receives, from the connected spectrum measurement device 11, information on the recognized measurement target and information such as the distance to it, the vehicle control device 13 transmits that information to the various other control devices. The vehicle control device 13 is further configured to execute the required driving assistance in the vehicle 10 according to the recognized measurement target and the distance to it.
- the spectrum measurement device 11 includes a spectrum sensor 14 that detects spectrum data R0 of observation light, that is, the light obtained by observing a measurement object, and a spectral data processing device 15 that receives the spectrum data R0 from the spectrum sensor 14 and processes it.
- the spectrum sensor 14 is configured to generate spectrum data R0 of observation light by detecting a spectrum image of observation light.
- Each of the plurality of pixels constituting the spectrum image has individual spectrum data.
- the spectrum sensor 14 has a function of splitting the observation light, which is composed of visible and invisible light, into predetermined wavelength bands.
- the spectrum data R0 output from the spectrum sensor 14 includes wavelength information indicating the wavelengths constituting each wavelength band after splitting, and light intensity information indicating the intensity of the observation light at each wavelength in those bands.
- in this embodiment, 400 nm (nanometers) is selected in advance as the first wavelength (λ1), that is, the short wavelength used for distance measurement, and 800 nm is selected as the second wavelength (λ2), that is, the long wavelength longer than the short wavelength. The spectrum data R0 therefore includes spectrum data composed of 400 nm light and spectrum data composed of 800 nm light.
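Extracting the two measurement wavelengths from per-pixel spectrum data might look like the following; the dictionary layout and the intensity values are illustrative assumptions, not the sensor's actual data format.

```python
# Illustrative per-pixel spectrum data R0: wavelength in nm -> intensity.
# Both the layout and the intensity values are made up.
spectrum_r0 = {400: 0.62, 550: 0.81, 800: 0.47}

LAMBDA_SHORT, LAMBDA_LONG = 400, 800  # the embodiment's measurement wavelengths

# Keep only the two bands used for distance measurement.
measurement_bands = {w: spectrum_r0[w] for w in (LAMBDA_SHORT, LAMBDA_LONG)}
print(measurement_bands)  # {400: 0.62, 800: 0.47}
```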
- the spectrum sensor 14 also has a function of regulating the observation light to a predetermined wavelength band.
- the spectrum sensor 14 includes a lens 20 that forms an image of the incident light L and a detection device 21 that detects the light imaged by the lens 20. The spectrum sensor 14 further includes a filter (not shown) for generating the incident light L from the observation light; that is, the filter of this embodiment selects, from the various components of the observation light, the components at the required wavelengths to constitute the incident light L.
- the detection device 21 is composed of light receiving elements such as CCDs. An imaging surface 21a, formed by the light receiving surfaces of these light receiving elements, is disposed facing the lens 20; that is, the detection device 21 detects the light intensity information of the incident light L as imaged by the lens on the imaging surface 21a.
- since the lens 20 is a convex lens, when the incident light L enters the lens 20, the transmitted light is refracted so as to converge and is emitted from the lens 20; the light emitted from the lens forms an image at the imaging point F.
- the traveling direction X1 of the incident light L is inclined by an inclination angle θa with respect to the optical axis of the lens 20. That is, the optical axis AX of the lens 20 points in a direction different from the traveling direction X1 of the incident light L, and the main plane TX of the lens 20 is inclined by the inclination angle θa with respect to the plane perpendicular to the traveling direction X1.
- the main plane TX of the lens 20 is a plane that passes through the main point PP of the lens 20 and is perpendicular to the optical axis AX of the lens 20.
- the main plane TX of the lens 20 passes through the center of the lens 20 in the thickness direction.
- Such a lens 20 can be designed by a known lens design technique.
- owing to the inclination angle θa of the lens 20, the position components of the incident light L arriving at the respective positions on the surface of the lens 20 are not rotationally symmetric with respect to the optical axis AX of the lens 20. That is, each position component of the incident light L enters the lens 20 at a non-rotationally-symmetric incidence angle and is refracted by the corresponding portion of the lens 20 at a refraction angle that is non-rotationally symmetric with respect to the optical axis AX. The imaging point F therefore does not lie on the extension line LX of the traveling direction X1 from the principal point PP of the lens 20, but lies away from that line.
- being refracted by the portions of the lens 20 at these non-rotationally-symmetric refraction angles, the outgoing light L10 from the lens 20 travels in a direction different from the traveling direction X1 of the far incident light L1 and forms an image at the imaging point F.
- the lens 20 also has the property that its refractive index differs for each wavelength of light, that is, so-called chromatic aberration. For this reason, the components of the outgoing light L10 at the respective wavelengths leave the lens 20 in different directions, each refracted with the refractive index corresponding to its wavelength. That is, the outgoing light L10 is emitted at non-rotationally-symmetric refraction angles that differ for each wavelength, travels in the directions corresponding to those refraction angles, and forms imaging points F at different positions. On the common imaging surface 21a, therefore, the imaging points F are formed at positions that differ for each wavelength of the incident light L.
- the image forming point F for the short wavelength and the image forming point F for the long wavelength are not necessarily formed on the common image forming surface 21a, and it is considered that some axial chromatic aberration also occurs.
- in this embodiment, the inclination angle θa, the range of the target distance s to the measurement target, and the material and refractive index of the lens 20 are set in advance so that the axial chromatic aberration falls within the depth of focus when compared with the positional difference, that is, the shift amount, between the short-wavelength imaging point F and the long-wavelength imaging point F on the single imaging surface 21a. The axial chromatic aberration is therefore negligible compared with the shift amount of the imaging position in the horizontal or vertical direction of the lens 20.
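The design condition that axial chromatic aberration stay within the depth of focus can be expressed with the common approximation depth of focus ≈ 2 · N · c (N the f-number, c the circle of confusion). The numbers below are illustrative, not from the patent.

```python
def axial_aberration_ok(axial_aberration_mm: float, f_number: float,
                        circle_of_confusion_mm: float = 0.03) -> bool:
    """True when the axial chromatic aberration fits inside the depth of
    focus, approximated as 2 * f_number * circle_of_confusion.  The
    0.03 mm circle of confusion is a common photographic assumption."""
    depth_of_focus_mm = 2.0 * f_number * circle_of_confusion_mm
    return axial_aberration_mm <= depth_of_focus_mm

# With an f/2.8 lens the depth of focus is about 0.168 mm, so a
# 0.1 mm axial aberration passes while 0.5 mm does not.
print(axial_aberration_ok(0.10, 2.8), axial_aberration_ok(0.50, 2.8))
```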
- FIG. 3A shows a case where the measurement target is a far measurement target T1 that exists far away.
- FIG. 3B shows a case where the measurement target is a medium measurement target T2 that is closer to the lens 20 than in the case of FIG. 3A.
- FIG. 3C shows a case where the measurement target is a near measurement target T3 that is closer to the lens 20 than in the case of FIG. 3B.
- FIG. 3A shows a far measurement target T1 that exists at a far target distance s1 that can be evaluated as an infinite distance from the lens 20.
- the far incident light L1 that is the incident light from the far measurement target T1 is incident on the lens 20 as substantially parallel light in the traveling direction X1.
- when the far incident light L1 is single-wavelength light having only the short wavelength as the first wavelength, for example light with a wavelength of 400 nm, it is refracted based on the refractive index of the lens 20 for the 400 nm wavelength and on the inclination angle θa of the lens 20, and is emitted from the lens 20 as far-short outgoing light L11. The far-short outgoing light L11 forms an image at the far-short imaging point F11 on the imaging surface 21a.
- when the far incident light L1 is single-wavelength light having only the long wavelength as a second wavelength different from the short wavelength, for example light with a wavelength of 800 nm, it is refracted based on the refractive index of the lens 20 for the 800 nm wavelength and is emitted from the lens 20 as far-long outgoing light L12. The far-long outgoing light L12 forms an image at the far-long imaging point F12 on the imaging surface 21a.
- the refractive index of a lens generally tends to increase as the wavelength decreases. Therefore, as shown in FIG. 3A, the refraction of the far-short outgoing light L11 with the short wavelength (400 nm) is larger than the refraction of the far-long outgoing light L12 with the long wavelength (800 nm), and on the common imaging surface 21a the position of the far-long imaging point F12 of the far-long outgoing light L12 differs from the position of the far-short imaging point F11 of the far-short outgoing light L11.
- the far-short imaging point F11 is located farther from the extension line LX of the traveling direction X1 passing through the principal point PP of the lens 20 than the far-long imaging point F12; in FIG. 3A, the far-short imaging point F11 is positioned below the far-long imaging point F12. Thus, owing to the wavelength-dependent displacement of the imaging position, there is a relative difference between the position of the far-short imaging point F11 of the far-short emitted light L11 and the position of the far-long imaging point F12 of the far-long emitted light L12.
- far shift amount D1 = (position of the far-short imaging point F11) − (position of the far-long imaging point F12)
- the far shift amount D1 is a distance measured perpendicular to the extension line LX of the traveling direction X1 passing through the principal point PP of the lens 20.
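The patent relies on the qualitative fact that a lens's refractive index rises as the wavelength falls. That tendency can be sketched with a standard Cauchy dispersion model; the coefficients below are illustrative (roughly BK7-like) and not from the patent. The sketch confirms that a 400 nm ray bends more strongly than an 800 nm ray at the same incidence angle, which is what separates the imaging points F11 and F12 on the common imaging surface:

```python
import math

def cauchy_index(wavelength_nm, A=1.5046, B=4200.0):
    """Cauchy approximation n(lambda) = A + B / lambda^2.
    Coefficients are illustrative, not taken from the patent."""
    return A + B / wavelength_nm**2

def refraction_angle_deg(incidence_deg, n):
    """Snell's law: angle of the ray inside the glass, in degrees."""
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

n_short = cauchy_index(400.0)   # higher index at the shorter wavelength
n_long = cauchy_index(800.0)

theta_short = refraction_angle_deg(30.0, n_short)
theta_long = refraction_angle_deg(30.0, n_long)

# The 400 nm ray is bent more (smaller angle inside the glass) than the
# 800 nm ray, so their imaging points separate on the imaging surface.
assert n_short > n_long
assert theta_short < theta_long
```

This is only the dispersion half of the mechanism; the tilt angle θa of the lens converts the index difference into the vertical shift amount D measured on the imaging surface 21a.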
- FIG. 3B shows the middle measurement target T2 located at the middle target distance s2 whose distance from the lens 20 is shorter than the far target distance s1 described above.
- the medium expansion angle θ2 shown in FIG. 3B is the expansion angle, that is, the capture angle, indicating how much the medium incident light L2 (the incident light in this case) spreads from the measurement target T toward the peripheral edge of the lens 20. The larger the expansion angle, the larger the incident angle to the lens 20. For reference, the far expansion angle θ1, which is the expansion angle in the case of FIG. 3A, is almost zero.
- when the medium incident light L2 from the medium measurement target T2 is single-wavelength light with the short wavelength of 400 nm, it is refracted on the basis of the refractive index of the lens 20 corresponding to the short wavelength, the refraction angle defined by the tilt angle θa of the lens 20, and the refraction angle defined by the medium expansion angle θ2 to the lens 20. The medium-short emitted light L21 emitted from the lens 20 in this case forms an image at the medium-short imaging point F21 on substantially the same imaging surface 21a.
- when the medium incident light L2 is single-wavelength light with the long wavelength of 800 nm, it is refracted on the basis of the refractive index of the lens 20 corresponding to the long wavelength, the refraction angle defined by the tilt angle θa of the lens 20, and the refraction angle defined by the medium expansion angle θ2. The medium-long emitted light L22 emitted from the lens 20 forms an image at the medium-long imaging point F22 on substantially the same imaging surface 21a. Since the lens 20 is not corrected for chromatic aberration, as shown in FIG. 3B, the refraction of the medium-short emitted light L21 with the short wavelength of 400 nm is larger than the refraction of the medium-long emitted light L22 with the long wavelength of 800 nm. Therefore, on the common imaging surface 21a, the position of the medium-long imaging point F22 of the medium-long emitted light L22 differs from the position of the medium-short imaging point F21 of the medium-short emitted light L21. Thus, owing to the wavelength-dependent displacement of the imaging position, there is a relative difference between the position of the medium-short imaging point F21 of the medium-short emitted light L21 and the position of the medium-long imaging point F22 of the medium-long emitted light L22.
- the refraction angle defined by the tilt angle θa of the lens 20 and the refraction angle defined by the medium expansion angle θ2 to the lens 20 together give a refraction angle that is not rotationally symmetric with respect to the lens 20.
- FIG. 3C shows the near measurement target T3 that exists at the near target distance s3, whose distance from the lens 20 is shorter than the above-described middle target distance s2. The near expansion angle θ3 shown in FIG. 3C is larger than the medium expansion angle θ2 in FIG. 3B.
- when the near incident light L3 from the near measurement target T3 is single-wavelength light with the short wavelength of 400 nm, it is refracted on the basis of the refractive index of the lens 20 corresponding to the short wavelength, the refraction angle defined by the tilt angle θa of the lens 20, and the refraction angle defined by the near expansion angle θ3 to the lens 20. The near-short emitted light L31 emitted from the lens 20 forms an image at the near-short imaging point F31 on substantially the same imaging surface 21a.
- when the near incident light L3 is single-wavelength light with the long wavelength of 800 nm, it is refracted on the basis of the refractive index of the lens 20 corresponding to the long wavelength, the refraction angle defined by the tilt angle θa of the lens 20, and the refraction angle defined by the near expansion angle θ3 to the lens 20. The near-long emitted light L32 emitted from the lens 20 forms an image at the near-long imaging point F32 on substantially the same imaging surface 21a. Since the lens 20 is not corrected for chromatic aberration, as shown in FIG. 3C, a relative difference likewise arises between the position of the near-short imaging point F31 and the position of the near-long imaging point F32.
- the refraction angle of the lens 20 with respect to incident light of the short wavelength differs with the incident angle: the refraction angle for the far incident light L1 at the far target distance s1, the refraction angle for the medium incident light L2 at the middle target distance s2, and the refraction angle for the near incident light L3 at the near target distance s3 all differ from one another. Likewise, the refraction angle of the lens 20 with respect to incident light of the long wavelength also differs with the incident angle: the refraction angle for the far incident light L1 at the far target distance s1, the refraction angle for the medium incident light L2 at the middle target distance s2, and the refraction angle for the near incident light L3 at the near target distance s3 all differ from one another.
- furthermore, the relative relationship, such as the "ratio", between the non-rotationally-symmetric refraction angle of the lens 20 for the short-wavelength incident light and that for the long-wavelength incident light at the far target distance s1 usually does not match the corresponding relative relationship at the middle target distance s2. Likewise, the relative relationship at the middle target distance s2 usually does not match that at the near target distance s3, and the relative relationship at the far target distance s1 usually does not match that at the near target distance s3.
- accordingly, the far shift amount D1 in the case of the far target distance s1 to the far measurement target T1, the middle shift amount D2 in the case of the middle target distance s2 to the middle measurement target T2, and the near shift amount D3 in the case of the near target distance s3 to the near measurement target T3 all differ from one another. From this, the spectrum measuring apparatus 11 can establish that the far shift amount D1 corresponds to the far target distance s1, the middle shift amount D2 to the middle target distance s2, and the near shift amount D3 to the near target distance s3. That is, the spectrum measuring apparatus 11 can establish a unique correspondence between the target distance s and the shift amount D. Therefore, the spectrum measuring apparatus 11 can measure the target distance s, the distance to the measurement target T, by using the lateral chromatic aberration.
- the short wave image P1 projected on the imaging surface 21a by incident light with the short wavelength of 400 nm includes a short wave pedestrian image T31 that is an image of the pedestrian T3, a short wave other-vehicle image T21 that is an image of the other vehicle T2, and a short wave tree image T11 that is an image of the tree T1. Similarly, the long wave image P2 projected on the imaging surface 21a by incident light with the long wavelength of 800 nm includes a long wave pedestrian image T32 that is an image of the pedestrian T3, a long wave other-vehicle image T22 that is an image of the other vehicle T2, and a long wave tree image T12 that is an image of the tree T1.
- the entity of the short wave pedestrian image T31 of the short wave image P1 is the same pedestrian T3 as the entity of the long wave pedestrian image T32 of the long wave image P2.
- the entity of the short-wave other vehicle image T21 in the short-wave image P1 is the other vehicle T2, which is the same as the entity of the long-wave other vehicle image T22 in the long-wave image P2.
- the entity of the shortwave tree image T11 of the shortwave image P1 is the same tree T1 as the entity of the longwave tree image T12 of the longwave image P2.
- a third detection region W3 for comparing the imaging position of the pedestrian T3 for each wavelength is set, a second detection region W2 for comparing the imaging position of the other vehicle T2 for each wavelength is set, and a first detection region W1 for comparing the imaging position of the tree T1 for each wavelength is set.
- the position of the third detection region W3 of the short wave image P1 with respect to the imaging surface 21a is set to be the same as the position of the third detection region W3 of the long wave image P2.
- the position of the second detection region W2 of the short wave image P1 with respect to the imaging surface 21a is set to be the same as the position of the second detection region W2 of the long wave image P2.
- the position of the first detection region W1 of the short wave image P1 with respect to the imaging surface 21a is set to be the same as the position of the first detection region W1 of the long wave image P2.
- the spectrum sensor 14 of the present embodiment shifts the imaging position of the measurement target between the short wavelength of 400 nm and the long wavelength of 800 nm. For example, when the distance to the pedestrian T3 is the near target distance s3, a near shift amount D3 arises in the vertical direction between the position of the short wave pedestrian image T31 and the position of the long wave pedestrian image T32 on the imaging surface 21a. Likewise, when the distance to the other vehicle T2 is the middle target distance s2, a middle shift amount D2 arises in the vertical direction between the position of the short wave other-vehicle image T21 and the position of the long wave other-vehicle image T22 on the imaging surface 21a. And when the distance to the tree T1 is the far target distance s1, a far shift amount D1 arises in the vertical direction between the position of the short wave tree image T11 and the position of the long wave tree image T12 on the imaging surface 21a.
- the spectrum sensor 14 can therefore determine that the distance to the pedestrian T3 is the near target distance s3 from the fact that the shift between the position of the short wave pedestrian image T31 and the position of the long wave pedestrian image T32 is the near shift amount D3. Similarly, it can determine that the distance to the other vehicle T2 is the middle target distance s2 from the fact that the shift between the position of the short wave other-vehicle image T21 and the position of the long wave other-vehicle image T22 is the middle shift amount D2, and that the distance to the tree T1 is the far target distance s1 from the fact that the shift between the position of the short wave tree image T11 and the position of the long wave tree image T12 is the far shift amount D1.
- in this way, the spectrum sensor 14 can grasp the target distance s from the shift amount, that is, the difference on the imaging surface 21a between the imaging position of the short-wavelength image of the measurement target and the imaging position of its long-wavelength image.
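The shift detection described above can be sketched numerically. The following toy example (invented pixel values, not from the patent) finds the strongest horizontal boundary row inside a detection window of each spectral image and takes the vertical shift D as the difference of those rows:

```python
import numpy as np

def boundary_row(window):
    """Index of the row just above the strongest horizontal edge in a
    detection window: where the mean row intensity changes the most."""
    profile = window.mean(axis=1)                  # average intensity per row
    return int(np.argmax(np.abs(np.diff(profile))))

def shift_amount(short_window, long_window):
    """Vertical shift D (in pixels) between the short-wavelength and
    long-wavelength images of the same object/ground boundary."""
    return boundary_row(short_window) - boundary_row(long_window)

# Toy 8x4 detection windows: the boundary sits at row 5 in the 400 nm
# window and at row 3 in the 800 nm window.
short_win = np.zeros((8, 4)); short_win[5:, :] = 1.0
long_win = np.zeros((8, 4)); long_win[3:, :] = 1.0

D = shift_amount(short_win, long_win)   # D = 2 pixels for this toy data
```

A real implementation would operate on the detection regions W1 to W3 of the spectrum images P1 and P2; the edge criterion here is only one plausible way to localize the boundary the patent describes.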
- in this way, the spectrum sensor 14 detects, for the measurement target T, spectrum data R0 including the short wave image P1, a spectrum image at the short wavelength, and the long wave image P2, a spectrum image at the long wavelength. The spectrum sensor 14 then outputs the spectrum data R0 to the spectrum data processing device 15.
- the spectrum data processing device 15 is mainly configured by a microcomputer having, for example, an arithmetic device and a storage device.
- the spectrum data processing device 15 is connected to the spectrum sensor 14 and receives the spectrum data R0 of the observation light detected by the spectrum sensor 14.
- the spectrum data processing device 15 calculates or measures the target distance s based on the input spectrum data R0 of the observation light.
- the spectrum data processing device 15 includes an arithmetic device 16 and a storage unit 17 as storage means.
- the storage unit 17 consists of all or part of a storage area provided in a known memory
- FIG. 5 shows an example of the map data 18 stored in the storage area.
- the map data 18 indicates, in association with the target distance s (s1, s2, s3, and so on) to the measurement target T, the shift amount (D1, D2, D3, and so on) consisting of the difference between the imaging position of the short-wavelength light and that of the long-wavelength light. That is, the map data 18 stores the far shift amount D1, the difference between the position of the far-short imaging point F11 and the position of the far-long imaging point F12, in association with the far target distance s1 to the far measurement target T1. The map data 18 likewise stores the middle shift amount D2, the difference between the position of the medium-short imaging point F21 and the position of the medium-long imaging point F22, in association with the middle target distance s2 to the middle measurement target T2, and the near shift amount D3, the difference between the position of the near-short imaging point F31 and the position of the near-long imaging point F32, in association with the near target distance s3 to the near measurement target T3.
- from the map data 18, the computing device 16 can acquire the far target distance s1 from the far shift amount D1, the middle target distance s2 from the middle shift amount D2, or the near target distance s3 from the near shift amount D3.
- the map data 18 constitutes correlation information, determined by the chromatic aberration characteristics of the lens 20 and the direction of the optical axis AX, that indicates the correlation between the shift amount D as the imaging relative amount and the target distance s.
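A map of this kind can be represented as a small lookup table from shift amount to distance. The sketch below uses invented calibration values for the 400 nm / 800 nm pair (a real table would come from calibrating the tilted-lens optics) and interpolates linearly between stored entries:

```python
# Hypothetical map data for the 400 nm / 800 nm wavelength pair:
# (shift D in pixels, target distance s in metres), sorted by
# decreasing shift, i.e. increasing distance. Values are illustrative.
MAP_DATA_400_800 = [
    (5.0, 2.0),
    (3.0, 10.0),
    (1.5, 30.0),
    (0.8, 100.0),
]

def distance_from_shift(shift, map_data=MAP_DATA_400_800):
    """Look up target distance s for a measured shift amount D,
    linearly interpolating between stored map entries and clamping
    outside the calibrated range."""
    shifts = [d for d, _ in map_data]
    dists = [s for _, s in map_data]
    if shift >= shifts[0]:
        return dists[0]
    if shift <= shifts[-1]:
        return dists[-1]
    for (d0, s0), (d1, s1) in zip(map_data, map_data[1:]):
        if d1 <= shift <= d0:
            t = (d0 - shift) / (d0 - d1)
            return s0 + t * (s1 - s0)

assert distance_from_shift(3.0) == 10.0   # exact map entry
```

Interpolation is an assumption on our part; the patent text itself only states that distances are stored in association with shift amounts.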
- the arithmetic device 16 includes an attention image selection unit 30 that selects an image used for distance measurement from the image of the measurement target T, an imaging position calculation unit 31 that detects the imaging positions of the two wavelengths from the selected images, and a shift amount calculation unit 32 that calculates the shift amount D, the difference between the imaging positions of the two wavelengths. The arithmetic device 16 further includes a distance calculation unit 33, serving as distance calculation means, that calculates the target distance s from the shift amount D.
- the imaging position calculation unit 31 and the shift amount calculation unit 32 constitute an imaging relative amount calculation unit as a relative amount calculation unit.
- the attention image selection unit 30 selects an image used for distance measurement from the image of the measurement target T in units of pixels.
- when the spectrum data R0 is input from the spectrum sensor 14, the attention image selection unit 30 outputs the attention image information W0 and spectrum data R1, including the spectrum images of the two wavelengths, to the imaging position calculation unit 31.
- the attention image selection unit 30 may select, based on a separately performed object recognition process or the like, the image corresponding to a measurement target with a high priority among the recognized measurement targets, or the image corresponding to the one that occupies the largest area.
- the image selected by the attention image selection unit 30 is preferably a boundary portion with the background or the like, so that the positions of the two different-wavelength images can be identified.
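The selection rule just described (priority first, occupied area as a secondary criterion) can be sketched as a one-line ranking. The object list, its fields, and the tie-breaking order below are all assumptions for illustration; the patent leaves the recognition step unspecified:

```python
# Hypothetical recognized objects; "priority" and "area_px" would come
# from a separately performed object-recognition process.
objects = [
    {"name": "tree", "priority": 1, "area_px": 1200},
    {"name": "other_vehicle", "priority": 2, "area_px": 3000},
    {"name": "pedestrian", "priority": 3, "area_px": 800},
]

def select_attention_object(objects):
    """Pick the object with the highest priority, breaking ties by the
    larger occupied image area (one plausible reading of the rule)."""
    return max(objects, key=lambda o: (o["priority"], o["area_px"]))

assert select_attention_object(objects)["name"] == "pedestrian"
```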
- FIG. 4A shows a short wave image P1 as a short wavelength image
- FIG. 4B shows a long wave image P2 as a long wavelength image.
- the far measurement target T1 is “tree”
- the middle measurement target T2 is “other vehicle”
- the near measurement target T3 is “pedestrian”.
- the short wave image P1 in FIG. 4A shows a shortwave tree image T11 that is an image of a tree, a shortwave other vehicle image T21 that is an image of another vehicle, and a shortwave pedestrian image T31 that is an image of a pedestrian.
- a long wave image P2 in FIG. 4B shows a long wave tree image T12 that is an image of a tree, a long wave other vehicle image T22 that is an image of another vehicle, and a long wave pedestrian image T32 that is an image of a pedestrian.
- the attention image selection unit 30 selects the first attention image PX1 from the short wave tree image T11 and the long wave tree image T12 when the tree T1 is a measurement target.
- the attention image selection unit 30 sets the first detection region W1 in which both the short wave image P1 and the long wave image P2 include the first attention image PX1.
- the first attention image PX1 shows a boundary line between the root of the tree T1 and the ground below the root.
- the attention image selection unit 30 selects the second attention image PX2 from the short-wave other vehicle image T21 and the long-wave other vehicle image T22 when the other vehicle T2 is the measurement target.
- the attention image selection unit 30 sets the second detection region W2 in which both the short wave image P1 and the long wave image P2 include the second attention image PX2.
- the second attention image PX2 includes a boundary line between the tire of the other vehicle T2 and the road surface under the tire.
- the attention image selection unit 30 selects the third attention image PX3 from the short wave pedestrian image T31 and the long wave pedestrian image T32, for example, when the pedestrian T3 is the measurement target. Further, the attention image selection unit 30 sets a third detection region W3 in which both the short wave image P1 and the long wave image P2 include the third attention image PX3.
- the third attention image PX3 includes a boundary line between the shoes of the pedestrian T3 and the road surface under the shoes.
- the attention image selection unit 30 outputs attention image information W0, including the first attention image PX1 and the first detection region W1, the second attention image PX2 and the second detection region W2, and the third attention image PX3 and the third detection region W3, to the imaging position calculation unit 31.
- the imaging position calculation unit 31 detects the imaging positions of the two wavelength images based on the attention image selected by the attention image selection unit 30.
- the imaging position calculation unit 31 receives the attention image information W0 and the spectrum data R1 from the attention image selection unit 30, and based on these inputs calculates the imaging positions of the two wavelengths of the attention image.
- the imaging position calculation unit 31 outputs imaging position data R2 including the calculated imaging positions of the two wavelengths to the shift amount calculation unit 32.
- the shift amount calculation unit 32 calculates the shift amount D from the imaging positions of the two wavelengths. Based on the imaging position data R2 input from the imaging position calculation unit 31, the shift amount calculation unit 32 calculates the difference between the imaging positions of the two wavelengths (for example, the position of the far-short imaging point F11 and the position of the far-long imaging point F12) as the shift amount D. The shift amount calculation unit 32 outputs the calculated shift amount D to the distance calculation unit 33 as shift amount data R3, data associated with the two wavelengths.
- the distance calculation unit 33 calculates the target distance s based on the shift amount data R3. That is, the distance calculation unit 33 selects the map data 18 corresponding to the two wavelengths (for example, 400 nm and 800 nm) acquired from the shift amount data R3 from the storage unit 17. Then, the distance calculation unit 33 acquires from the selected map data 18 the target distance s (for example, the far target distance s1) corresponding to the shift amount (for example, the far shift amount D1) acquired from the shift amount data R3. The distance calculation unit 33 generates distance data R4 by, for example, associating the acquired target distance s with the measurement target T, and outputs the distance data R4 to the human machine interface 12, the vehicle control device 13, or the like.
- FIG. 6 illustrates a procedure for measuring the target distance s.
- FIG. 6 is a flowchart showing a procedure by which the spectrum measuring apparatus 11 of the present embodiment measures the target distance s.
- the procedure for measuring the target distance s is sequentially executed at a predetermined cycle.
- the arithmetic unit 16 acquires the spectrum data R0 detected by the spectrum sensor 14 in step S10.
- next, the arithmetic unit 16 selects an attention image from the image of the measurement target T whose distance is to be measured. The measurement target T is selected on the basis of the measurement targets separately recognized by the spectrum measuring apparatus 11 and their priorities.
- the arithmetic unit 16 calculates the imaging position of the attention image for each of the short wavelength and the long wavelength used for distance measurement (imaging position calculation step). The imaging position is obtained based on the position of the pixel on the imaging surface 21a that detects the target image.
- in step S13, the arithmetic unit 16 compares the positions of the attention images of the two wavelengths with each other to calculate the shift amount D, which is the imaging relative amount (imaging relative amount calculation step). The shift amount D (D1, D2, D3) is calculated as the difference between the imaging positions of the attention images at the two wavelengths.
- in step S14, the arithmetic device 16 calculates the target distance s (distance calculation step). The arithmetic device 16 obtains the target distance s by acquiring the distance corresponding to the shift amount D from the map data 18 corresponding to the two wavelengths.
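The procedure of FIG. 6 can be sketched as one measurement cycle whose stages are passed in as plain callables. The stage names and the stub values below are illustrative, not from the patent:

```python
def measure_target_distance(read_sensor, select_attention, locate, to_distance):
    """One measurement cycle modelled on FIG. 6 (hypothetical decomposition)."""
    short_img, long_img = read_sensor()        # S10: acquire spectrum data R0
    short_win, long_win = select_attention(short_img, long_img)  # attention image selection
    pos_short = locate(short_win)              # imaging position calculation step
    pos_long = locate(long_win)
    shift = pos_short - pos_long               # S13: imaging relative amount D
    return to_distance(shift)                  # S14: distance via map data / formula

# Minimal stubs wiring the cycle together with made-up numbers:
result = measure_target_distance(
    read_sensor=lambda: ("P1", "P2"),
    select_attention=lambda p1, p2: (p1, p2),
    locate=lambda win: {"P1": 57, "P2": 55}[win],   # pixel rows of the boundary
    to_distance=lambda d: 12.5 * d,                 # stand-in correlation D -> s
)
# result == 25.0 with these stubs
```

In a real system the cycle would repeat at the predetermined period, with `to_distance` backed by the stored map data 18 rather than a linear stand-in.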
- since the spectrum measuring apparatus 11 uses the lens 20 whose optical axis AX points in a direction different from the traveling direction X1 of the incident light L, the imaging position differs for each wavelength. From this, the imaging relative amount between the plurality of imaging positions is detected as an amount that differs for each target distance s. That is, the spectrum measuring apparatus 11 can measure the target distance s based on these differing imaging relative amounts.
- an ordinary lens has a different refractive index for each wavelength of light, that is, it produces chromatic aberration. For this reason, when the lens 20 forms an image from light containing a plurality of wavelengths, the imaging position differs for each wavelength.
- when the optical axis AX of the lens 20 is inclined with respect to the traveling direction X1 of the far incident light L1, that is, when the lens 20 refracts the far incident light L1 in the direction of its optical axis AX, the light of each wavelength is refracted according to its own refractive index. Therefore, the imaging position (imaging point) of the image formed by the lens 20 is displaced by a different amount in the horizontal and vertical directions of the lens 20 for each wavelength.
- when the incident angle of light on the lens 20 changes with the target distance s between the lens 20 and the measurement target T, the imaging position of light of even a single wavelength also changes. Accordingly, the spectrum measuring apparatus 11 can measure the target distance s based on the relative relationship between the imaging positions of the respective wavelengths.
- the imaging positions of the respective wavelengths differ in the horizontal and vertical directions of the lens 20; that is, they are displaced by different amounts in the direction perpendicular to the traveling direction X1 of the incident light L. For this reason, the images of the respective wavelengths are all formed on the imaging surface 21a, which is simply arranged facing the lens 20, and each imaging position of each wavelength image can be detected there. Since the spectrum measuring apparatus 11 thus does not need to move the imaging surface 21a to detect the imaging positions, no mechanism for moving the imaging surface 21a is required, and the imaging positions of the respective wavelengths can be detected with a simple configuration.
- since the imaging positions of the respective wavelengths are detected through the same lens 20, the difference in imaging position for each wavelength is obtained purely from chromatic aberration. That is, distance measurement can be performed with one optical system, that is, one camera (the spectrum sensor 14). Therefore, compared with the case where a plurality of cameras are used, the present embodiment increases the degree of freedom of camera arrangement: the camera positions need not be maintained with high accuracy, and the configuration of the distance measuring device can be simplified.
- an ordinary lens is corrected for chromatic aberration; that is, ordinary lenses are often configured so that the imaging distance is the same for each wavelength of light to be acquired, for example for red, blue, and green light. In the present embodiment, by contrast, the lens 20, which has not been corrected for chromatic aberration, can be used for distance measurement. Accordingly, the degree of freedom in selecting and designing the wavelengths used in the distance measuring device is increased, and so is the degree of freedom in selecting and designing the optical system it employs.
- the spectrum measuring apparatus 11 measures the target distance s based on two wavelengths of light having different imaging positions (imaging point positions) by the lens 20. That is, if the light from the measurement target T has two or more wavelengths, the distance of the measurement target T can be measured, so that the distance measurement can be easily performed.
- the spectrum measuring apparatus 11 detects the imaging relative amount as the difference between the imaging positions of the two wavelengths, that is, the shift amount D (D1, D2, D3). Therefore, the calculation for detection is simple.
- the spectrum measuring apparatus 11 causes the optical axis AX of the lens 20 to tilt with respect to the far incident light L1, thereby causing a difference in the imaging position for each light of each wavelength.
- the spectrum measuring apparatus 11 measures the target distance s based on this imaging position difference.
- the optical axis AX of the lens 20 is inclined with respect to the traveling direction X1 of the far incident light L1 simply by arranging the lens 20 at an inclination to the traveling direction X1.
- this embodiment can simplify the arrangement of the lens 20 and the characteristics of the lens 20 in the distance measuring device.
- since the spectrum sensor 14 detects the image of the measurement target T formed by the lens 20 for each wavelength, light of a plurality of arbitrary wavelengths can be detected. This high degree of wavelength selectivity makes it easy to select light of wavelengths suited to distance measurement according to the surrounding environment, ambient light, and the like. In addition, since a spectrum sensor inherently detects light of a plurality of wavelengths, the distance measuring device can be configured easily; that is, an existing spectrum sensor can be used as a distance measuring device.
- the above embodiment may also be implemented in, for example, the following modified forms.
- in the above embodiment, the incident light entering the lens 20 is exemplified as light of the short wavelength or the long wavelength. The invention is not limited to this: a filter may extract, from the light emitted by the lens 20, only light of a predetermined wavelength. This increases the degree of freedom of the configuration for acquiring light of a predetermined wavelength.
- in the above embodiment, the wavelength combination for the imaging position difference (shift amount) stored in the map data 18 is the short wavelength and the long wavelength.
- the present invention is not limited to this, and the combination of wavelengths stored in the map data 18 may be other combinations.
- the map data 18 may be a plurality of map data based on combinations of different wavelengths. Therefore, the degree of freedom in selecting the wavelength used for distance measurement is increased.
- the map data 18 is referred to in order to calculate the target distance s from the shift amount D.
- the present invention is not limited to this, and the target distance s may be calculated from the shift amount D using an arithmetic expression. Therefore, the storage area can be reduced.
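The arithmetic-expression variant can be sketched with one simple hypothetical model (our assumption, not given in the patent): the shift D decays toward a floor value a as the distance s grows, D(s) = a + b/s. Fitting a and b from two calibration points then lets s be computed directly, with no map table:

```python
def fit_inverse_model(p1, p2):
    """Fit D = a + b/s through two calibration points (s, D).
    The model form is an illustrative assumption."""
    (s1, d1), (s2, d2) = p1, p2
    b = (d1 - d2) / (1.0 / s1 - 1.0 / s2)
    a = d1 - b / s1
    return a, b

def distance_from_shift(d, a, b):
    """Invert D = a + b/s  ->  s = b / (D - a)."""
    return b / (d - a)

# Invented calibration points: (distance in m, shift in pixels).
a, b = fit_inverse_model((2.0, 5.0), (10.0, 1.4))

# The fitted expression reproduces both calibration points.
assert abs(distance_from_shift(5.0, a, b) - 2.0) < 1e-9
assert abs(distance_from_shift(1.4, a, b) - 10.0) < 1e-9
```

Replacing the map with a closed-form expression trades some calibration fidelity for the smaller storage footprint the text mentions.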
- In the above embodiment, the imaging surface 21a extends perpendicular to the traveling direction X1 of the incident light L. The invention is not limited to this; the imaging surface 21a may be inclined with respect to the traveling direction X1 of the incident light L. Since the imaging distance f from the lens 20 to the imaging point changes with the wavelength of the light and with the target distance s, the imaging surface 21a may be tilted so that the distance between the lens 20 and the part of the imaging surface 21a on which the imaging point falls changes accordingly. This can improve the distance measurement accuracy of the device.
- In the above embodiment, the imaging relative amount is the difference (shift amount) between the imaging positions of light of two wavelengths. The imaging relative amount may instead be the ratio of the imaging positions of light of the two wavelengths. This increases the freedom in how the imaging relative amount of the two wavelengths' imaging positions is calculated, and a preferable measurement result can be obtained.
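Both relative amounts can be computed from the same pair of imaging positions, as in the hypothetical sketch below (coordinate values and names are illustrative, not from the patent):

```python
def imaging_relative_amounts(pos_short_mm, pos_long_mm):
    """Return (difference, ratio) of two wavelengths' imaging positions,
    measured as coordinates on the imaging surface."""
    difference = pos_long_mm - pos_short_mm   # the "shift amount"
    ratio = pos_long_mm / pos_short_mm        # alternative relative amount
    return difference, ratio
```

The difference is simplest to compute, while a ratio can be insensitive to a common scaling of both positions; which is preferable depends on the optical setup.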
- In the above embodiment, the target distance s is calculated from a single shift amount. The invention is not limited to this; the target distance s may be calculated from a plurality of shift amounts, including shift amounts detected from other wavelength combinations. Based on a plurality of shift amounts, the target distance s can be obtained with higher accuracy. In particular, a spectrum sensor that detects light of many wavelengths can determine the imaging position for many wavelengths, so many shift amounts can be calculated from the imaging position differences. Distance measurement based on many shift amounts is then straightforward, and the accuracy of the measured distance can be increased.
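A simple way to combine the distance estimates obtained from several wavelength pairs is to take a robust aggregate. The median-based sketch below is a hypothetical illustration of that idea, not the patent's prescribed method.

```python
from statistics import median

def fuse_distance_estimates(estimates_m):
    """Combine per-wavelength-pair distance estimates (in meters).
    The median suppresses a single badly-measured pair."""
    if not estimates_m:
        raise ValueError("at least one estimate is required")
    return median(estimates_m)
```

With many wavelength pairs, a weighted mean or least-squares fit against the map data could be used instead; the median is merely the simplest robust choice.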
- In the above embodiment, the lens 20 is a single convex lens. The lens 20 need only be an optical system that forms an image from the incident light; it may be composed of a plurality of lenses and may include a concave lens.
- In the above embodiment, the principal plane TX of the lens 20, that is, the center plane passing through the center of the lens 20 in its thickness direction, is inclined with respect to a plane perpendicular to the traveling direction X1 of the incident light L. The invention is not limited to this: as long as the optical axis AX of the lens 20 points in a direction different from the traveling direction X1 of the incident light L, the center plane passing through the center of the lens 20 in the thickness direction may be perpendicular to the traveling direction X1. Also in that case the imaging position, that is, the position of the so-called imaging point, is displaced horizontally or vertically on a plane perpendicular to the traveling direction X1 of the incident light L, so an imaging relative amount arises and the target distance s can be measured.
- The surface of the lens 25, a convex lens, may be formed non-rotationally symmetric. In that case the thickest part of the lens 25 is offset from the center of the lens 25, and the optical axis AX is correspondingly tilted, for example upward. In other words, the surface of the lens 25 is not rotationally symmetric about the optical axis AX of the lens 25; by making the surface non-rotationally symmetric about the optical axis AX, the optical axis AX of the lens 25 can be inclined with respect to the traveling direction X1 of the incident light L. Also in this configuration, the imaging position changes according to the wavelength of the incident light L and the target distance s, which improves the freedom in designing and configuring the distance measuring device.
- The specific configuration of the lens 25 may be changed from the illustrated case.
- The lens 26, a convex lens, may be composed of a first member 26a and a second member 26b having different refractive indexes (aberrations), so that the optical axis AX of the lens 26 is inclined with respect to the traveling direction X1 of the incident light L. For example, the first member 26a, on the side near the measurement target T, is concave in the upper part of the figure and convex in the lower part with respect to the traveling direction X1 of the incident light L, while the second member 26b is convex in the upper part and concave in the lower part so as to complement the unevenness of the first member 26a. As a result, the optical axis AX of the lens 26 is inclined upward. That is, the optical axis AX of the lens 26 may be inclined with respect to the traveling direction X1 of the incident light L by making the refractive index of the lens 26 non-rotationally symmetric about the optical axis AX. Also in this case, the imaging position on the imaging surface 21a changes according to the wavelength of the incident light L and the target distance s, which likewise improves the freedom in designing and configuring the distance measuring device.
- The specific configuration of the lens 26 may be changed from the illustrated case; a single convex lens may also be composed of three or more members.
- In any of these configurations, it suffices that the center plane (TX) passing through the center in the thickness direction is inclined with respect to the traveling direction X1 of the incident light L. This also improves the freedom in designing and configuring the distance measuring device.
- In the above embodiment, the lens 20 is not corrected for chromatic aberration. The invention is not limited to this: a chromatic-aberration-corrected lens may be used, provided the wavelengths used for distance measurement are uncorrected or only weakly corrected. This increases the possibility of adopting such a distance measuring device in equipment that uses chromatic-aberration-corrected lenses.
- In the above embodiment, the short wavelength of the two wavelengths used to obtain the shift amount D (the imaging relative amount as a relative relationship amount) is 400 nm and the long wavelength is 800 nm. The invention is not limited to this: as long as the lens 20 produces chromatic aberration between them, the two wavelengths may be selected from visible and invisible light. That is, the short wavelength may be shorter or longer than 400 nm, and the long wavelength may be shorter or longer than 800 nm. This increases the freedom of wavelength selection and makes it possible to choose a wavelength combination well suited to distance measurement. The invisible light may include ultraviolet rays (near-ultraviolet rays) and infrared rays (including far-infrared, mid-infrared, and near-infrared rays).
- The invention is not limited to a shift amount D that decreases as the target distance s increases; the shift amount D need only change with the target distance s, and it may instead increase with distance. The imaging position difference (shift amount) depends on the relationship between the characteristics of the lens 20 and the selected wavelengths, so any relationship between the imaging position difference (shift amount) and the target distance s can be stored as the map data 18, however the difference varies with the target distance s. This increases the freedom in selecting an optical system that can be employed in the distance measuring device.
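Whichever direction the shift takes, the stored relation is usable as map data as long as it is invertible. A hypothetical helper to check this for a sampled table (names illustrative, not from the patent):

```python
def is_invertible_map(samples):
    """True if the sampled (shift, distance) relation is strictly monotonic
    in the shift values, whether the shift grows or shrinks with distance."""
    shifts = [shift for shift, _distance in samples]
    increasing = all(a < b for a, b in zip(shifts, shifts[1:]))
    decreasing = all(a > b for a, b in zip(shifts, shifts[1:]))
    return increasing or decreasing
```

A non-monotonic relation would map one shift amount to two distances, so such an optical configuration could not be used with a simple lookup.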
- DESCRIPTION OF SYMBOLS 10 ... Vehicle, 11 ... Spectrum measuring device, 12 ... Human machine interface, 13 ... Vehicle control device, 14 ... Spectrum sensor, 15 ... Spectrum data processing device, 16 ... Arithmetic device, 17 ... Memory
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
The distance measuring device described in Patent Document 1 has a light source that projects light of a predetermined pattern, composed of mutually different wavelengths, onto the measurement target, and images the projected light pattern from a direction different from the optical axis of the light source. The device measures the distance to the measurement target from the change between the projected light pattern and the imaged light pattern. The device of Patent Document 1 therefore needs a light source that projects light onto the measurement target with an intensity sufficient for imaging. When such a device is mounted on a vehicle, the light source must project an imageable light pattern onto measurement targets that may be tens to hundreds of meters away from the vehicle, so the energy the light source consumes is far from negligible.
With this configuration, the imaging relative amount is detected as the difference between the imaging positions of light of two wavelengths, so the computation required for detection is simple.
Even with this configuration, the computation required for detection is simple.
With this configuration, when the optical axis of the lens is tilted with respect to the incident light, a difference arises between the imaging positions of the individual wavelengths. The distance measuring device can measure the distance to the measurement target from this imaging position difference. For a typical convex lens, for example, placing the lens tilted with respect to the traveling direction of the incident light tilts the optical axis of the lens with respect to that direction. The lens arrangement and the lens characteristics of the distance measuring device can thus be kept simple.
With this configuration, the optical axis of the lens can be tilted by making the lens surface non-rotationally symmetric about the optical axis. By adjusting the surface shape of the lens, the optical axis can therefore be tilted in accordance with the wavelength of the outgoing light and the distance to the measurement target. This improves the freedom in selecting and designing the lens used in the distance measuring device.
With this configuration, the tilt of the optical axis of the lens can be adjusted by making the refractive index of the lens non-rotationally symmetric about the optical axis. The optical axis can therefore be tilted in accordance with the wavelength of the outgoing light and the distance to the measurement target. This also improves the freedom in selecting and designing the lens used in the distance measuring device.
With this configuration, using a spectrum sensor makes it possible to detect light of a plurality of arbitrary wavelengths. Many imaging relative amounts can therefore be calculated from the imaging positions of the images formed by the detected wavelengths, and measuring distance from many imaging relative amounts increases the measurement accuracy. Since a spectrum sensor inherently offers high freedom in wavelength selection, light of a wavelength suited to distance measurement can easily be chosen according to the surrounding environment, ambient light, and the like. Furthermore, since a spectrum sensor detects light of a plurality of wavelengths to begin with, the distance measuring device can be configured simply; an existing spectrum sensor can be used as a distance measuring device.
(1) Because the spectrum measuring device 11 uses a lens 20 whose optical axis AX points in a direction different from the traveling direction X1 of the incident light L, the imaging position differs for each wavelength. The imaging relative amount between the imaging positions is therefore detected as a quantity that differs for each target distance s, so the spectrum measuring device 11 can measure the target distance s from these differing imaging relative amounts. An ordinary lens has a different refractive index for light of each wavelength, that is, it exhibits chromatic aberration. Consequently, when the lens 20 focuses light containing a plurality of wavelengths, the imaging position differs for each wavelength. Accordingly, when the optical axis AX of the lens 20 is tilted with respect to the traveling direction X1 of the far incident light L1, that is, when the lens 20 refracts the far incident light L1 toward its optical axis AX, light of each wavelength is refracted according to its own refractive index. The imaging position (the position of the imaging point) of the image formed by the lens 20 is therefore displaced by a different amount for each wavelength, horizontally or vertically with respect to the lens 20. Meanwhile, when the incident angle of light on the lens 20 differs because of a change in the target distance s between the lens 20 and the measurement target T, the imaging position of light of a single wavelength also changes. From these facts, the spectrum measuring device 11 can measure the target distance s from the relative relationship among the imaging positions of the individual wavelengths.
- In the above embodiment, the case where a filter limits the incident light entering the lens 20 to short-wavelength or long-wavelength light is exemplified. However, the invention is not limited to this; the filter may instead extract outgoing light of only a predetermined wavelength from the light emitted by the lens 20. This increases the freedom in how light of a predetermined wavelength is obtained.
Claims (10)
- A distance measuring device that measures a target distance, the distance to a measurement target, by optically detecting the measurement target, the device comprising: a lens having an optical axis oriented in a direction different from the traveling direction of incident light from the measurement target, the lens being configured to form an image of the measurement target by focusing the incident light; imaging relative amount calculation means for obtaining, for each of a plurality of wavelengths contained in the incident light, an imaging position indicating the position of the image relative to the lens, and thereby calculating an imaging relative amount as a quantity indicating the relative relationship among the imaging positions; storage means for storing correlation information, determined by the chromatic aberration characteristics of the lens and the orientation of the optical axis, that indicates the correlation between the imaging relative amount and the target distance; and distance calculation means for calculating the target distance by collating the imaging relative amount against the correlation information.
- The distance measuring device according to claim 1, wherein the light has two wavelengths whose imaging positions differ from each other, and the correlation information constitutes map data in which each imaging relative amount is associated with a target distance.
- The distance measuring device according to claim 2, wherein the imaging relative amount is an imaging position difference, the difference between the imaging positions of the two wavelengths.
- The distance measuring device according to claim 2, wherein the imaging relative amount is an imaging position ratio, the ratio between the imaging positions of the two wavelengths.
- The distance measuring device according to any one of claims 1 to 4, wherein the optical axis of the lens is tilted with respect to the traveling direction of the incident light.
- The distance measuring device according to any one of claims 1 to 4, wherein the surface of the lens is non-rotationally symmetric about the optical axis of the lens.
- The distance measuring device according to any one of claims 1 to 4, wherein the refractive index of the lens is non-rotationally symmetric about the optical axis of the lens.
- The distance measuring device according to any one of claims 1 to 7, wherein the lens is part of a spectrum sensor that detects light from the measurement target.
- A distance measuring method for measuring a target distance, the distance to a measurement target, by optically detecting the measurement target, the method comprising: an imaging position calculation step of forming an image of the measurement target with a lens having an optical axis oriented in a direction different from the traveling direction of incident light from the measurement target, and obtaining, for each of a plurality of wavelengths contained in the incident light, an imaging position indicating the position of the image relative to the lens; an imaging relative amount calculation step of calculating an imaging relative amount as a quantity indicating the relative relationship among the imaging positions; and a distance calculation step of calculating the target distance by collating the imaging relative amount against correlation information, determined by the chromatic aberration characteristics of the lens and the orientation of the optical axis, that indicates the correlation between the imaging relative amount and the target distance.
- The distance measuring method according to claim 9, wherein the incident light has two wavelengths, the imaging position calculation step obtains the imaging position for each of the two wavelengths, and the distance calculation step acquires the correlation information from map data in which the imaging relative amount is associated with the target distance.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080045627.9A CN102575930B (zh) | 2010-07-23 | 2010-07-23 | 距离测定装置以及距离测定方法 |
DE112010005765.6T DE112010005765B4 (de) | 2010-07-23 | 2010-07-23 | Abstandsmessvorrichtung und Abstandsmessverfahren |
JP2012508283A JP5229427B2 (ja) | 2010-07-23 | 2010-07-23 | 距離測定装置および距離測定方法 |
US13/498,659 US9451213B2 (en) | 2010-07-23 | 2010-07-23 | Distance measuring apparatus and distance measuring method |
PCT/JP2010/062404 WO2012011187A1 (ja) | 2010-07-23 | 2010-07-23 | 距離測定装置および距離測定方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/062404 WO2012011187A1 (ja) | 2010-07-23 | 2010-07-23 | 距離測定装置および距離測定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012011187A1 true WO2012011187A1 (ja) | 2012-01-26 |
Family
ID=45496627
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/062404 WO2012011187A1 (ja) | 2010-07-23 | 2010-07-23 | 距離測定装置および距離測定方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9451213B2 (ja) |
JP (1) | JP5229427B2 (ja) |
CN (1) | CN102575930B (ja) |
DE (1) | DE112010005765B4 (ja) |
WO (1) | WO2012011187A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2018030319A1 (ja) * | 2016-08-12 | 2018-08-09 | パナソニックIpマネジメント株式会社 | 測距システム、および、移動体システム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101758735B1 (ko) | 2012-12-03 | 2017-07-26 | 한화테크윈 주식회사 | 카메라와 목표물 사이의 수평 거리를 구하는 방법, 이 방법을 채용한 카메라 및 감시 시스템 |
CN108292469B (zh) * | 2015-12-09 | 2021-02-05 | 笠原一 | 位置信息确定方法、装置以及计算机存储介质 |
US10877156B2 (en) * | 2018-03-23 | 2020-12-29 | Veoneer Us Inc. | Localization by light sensors |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03267708A (ja) * | 1990-03-16 | 1991-11-28 | Nec Corp | 2値距離画像採取装置 |
JP2007017401A (ja) * | 2005-07-11 | 2007-01-25 | Central Res Inst Of Electric Power Ind | 立体画像情報取得方法並びに装置 |
JP2009041928A (ja) * | 2007-08-06 | 2009-02-26 | Nissan Motor Co Ltd | 距離計測方法および装置、ならびに距離計測装置を備えた車両 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CH671828A5 (ja) * | 1987-06-26 | 1989-09-29 | Battelle Memorial Institute | |
US5629799A (en) * | 1992-07-16 | 1997-05-13 | Asahi Kogaku Kogyo Kabushiki Kaisha | Chromatic aberration correcting element and its application |
JP3267708B2 (ja) * | 1992-12-16 | 2002-03-25 | 松下電工株式会社 | 充電装置 |
US5785651A (en) * | 1995-06-07 | 1998-07-28 | Keravision, Inc. | Distance measuring confocal microscope |
JP3560123B2 (ja) | 1998-03-17 | 2004-09-02 | 横河電機株式会社 | 共焦点装置 |
JP2000019388A (ja) * | 1998-06-30 | 2000-01-21 | Sony Corp | 色収差補正用光学素子およびこれを具備する光学ピックアップ装置、ならびにこの光学ピックアップ装置を具備する光再生装置および光記録再生装置 |
JP3818028B2 (ja) * | 2000-07-10 | 2006-09-06 | 富士ゼロックス株式会社 | 3次元画像撮像装置および3次元画像撮像方法 |
DE10132988B4 (de) * | 2001-07-06 | 2005-07-28 | Carl Zeiss Smt Ag | Projektionsbelichtungsanlage |
JPWO2003036229A1 (ja) | 2001-10-25 | 2005-02-17 | 東レエンジニアリング株式会社 | 表面形状測定方法およびその装置 |
US7478754B2 (en) * | 2003-08-25 | 2009-01-20 | Symbol Technologies, Inc. | Axial chromatic aberration auto-focusing system and method |
DE10343406A1 (de) | 2003-09-19 | 2005-04-14 | Daimlerchrysler Ag | Entfernungsbestimmung eines Objektes |
JP4578869B2 (ja) * | 2004-06-24 | 2010-11-10 | 富士フイルム株式会社 | 3群ズームレンズ |
JP2006098771A (ja) * | 2004-09-29 | 2006-04-13 | Canon Inc | 焦点検出装置、撮像装置、撮像システム及びレンズユニット |
US7224540B2 (en) * | 2005-01-31 | 2007-05-29 | Datalogic Scanning, Inc. | Extended depth of field imaging system using chromatic aberration |
EP3258687A1 (en) * | 2005-03-04 | 2017-12-20 | Nikon Corporation | Image processor correcting color misregistration, image processing method, and electronic camera |
JP4544103B2 (ja) | 2005-09-07 | 2010-09-15 | パナソニック株式会社 | 界面の位置測定方法および位置測定装置 |
US20080137061A1 (en) * | 2006-12-07 | 2008-06-12 | Christopher John Rush | Displacement Measurement Sensor Using the Confocal Principle |
JP4332814B2 (ja) * | 2007-01-15 | 2009-09-16 | ソニー株式会社 | 撮像装置 |
EP2106531A2 (en) | 2007-01-22 | 2009-10-07 | California Institute Of Technology | Method for quantitative 3-d imaging |
US8089555B2 (en) * | 2007-05-25 | 2012-01-03 | Zoran Corporation | Optical chromatic aberration correction and calibration in digital cameras |
WO2009037949A1 (ja) | 2007-09-19 | 2009-03-26 | Nikon Corporation | 計測装置およびその計測方法 |
US7990522B2 (en) * | 2007-11-14 | 2011-08-02 | Mitutoyo Corporation | Dynamic compensation of chromatic point sensor intensity profile data selection |
JP5094430B2 (ja) * | 2008-01-10 | 2012-12-12 | キヤノン株式会社 | 画像処理方法、画像処理装置、システム |
US20100097693A1 (en) * | 2008-10-16 | 2010-04-22 | Kazunori Koga | Confocal microscope |
KR20100077878A (ko) * | 2008-12-29 | 2010-07-08 | 삼성전자주식회사 | 초점검출장치 및 이를 구비하는 촬상장치 |
DE102009025815A1 (de) | 2009-05-15 | 2010-11-25 | Degudent Gmbh | Messanordnung sowie Verfahren zum dreidimensionalen Messen eines Objektes |
2010
- 2010-07-23 DE DE112010005765.6T patent/DE112010005765B4/de not_active Expired - Fee Related
- 2010-07-23 WO PCT/JP2010/062404 patent/WO2012011187A1/ja active Application Filing
- 2010-07-23 CN CN201080045627.9A patent/CN102575930B/zh not_active Expired - Fee Related
- 2010-07-23 US US13/498,659 patent/US9451213B2/en active Active
- 2010-07-23 JP JP2012508283A patent/JP5229427B2/ja not_active Expired - Fee Related
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2018030319A1 (ja) * | 2016-08-12 | 2018-08-09 | パナソニックIpマネジメント株式会社 | 測距システム、および、移動体システム |
US10365462B2 (en) | 2016-08-12 | 2019-07-30 | Panasonic Intellectual Property Management Company, Ltd. | Distance measurement system and mobile object system |
US10852516B2 (en) | 2016-08-12 | 2020-12-01 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device |
Also Published As
Publication number | Publication date |
---|---|
US20130271597A1 (en) | 2013-10-17 |
JPWO2012011187A1 (ja) | 2013-09-09 |
DE112010005765T5 (de) | 2013-05-08 |
CN102575930B (zh) | 2014-10-01 |
JP5229427B2 (ja) | 2013-07-03 |
CN102575930A (zh) | 2012-07-11 |
DE112010005765B4 (de) | 2016-06-16 |
US9451213B2 (en) | 2016-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5354105B2 (ja) | 距離測定装置および距離測定方法 | |
CN104776801B (zh) | 信息处理装置和信息处理方法 | |
EP1919199B1 (en) | Multiband camera system | |
JP5473265B2 (ja) | 多層構造計測方法および多層構造計測装置 | |
JP5489392B2 (ja) | 光学系評価装置、光学系評価方法および光学系評価プログラム | |
US9366630B2 (en) | Fluorescence imaging autofocus systems and methods | |
US11002834B2 (en) | Lidar scanning device and lidar scanning device system | |
JP5229427B2 (ja) | 距離測定装置および距離測定方法 | |
JP2019144570A (ja) | 顕微鏡用の検出装置 | |
US20130162827A1 (en) | Camera Arrangement for a Motor Vehicle | |
US8254010B2 (en) | Imaging of a plurality of types of images based on light of a plurality of wavelength bands | |
JPH01308930A (ja) | 顕微分光装置 | |
CN113834568A (zh) | 光谱测量装置和方法 | |
JP2024028237A (ja) | 分光カメラ、撮像方法、プログラム及び記録媒体 | |
AU2020397496A1 (en) | Method for measuring the optical quality of a given region of a glazing unit, associated measuring device | |
CN1973233A (zh) | 用于光谱分析的像差校正 | |
US20120038923A1 (en) | Collection optics for a color sensor | |
US9891422B2 (en) | Digital confocal optical profile microscopy | |
US20200348398A1 (en) | OPTICAL SYSTEM, IN PARTICULAR A LiDAR SYSTEM, AND VEHICLE | |
EP2732401B1 (fr) | Dispositif et procede d'imagerie pour produire une image de marquages routiers | |
WO2022044162A1 (ja) | 検出装置および検出方法 | |
JP2022126704A (ja) | 撮像装置、撮像方法、プログラム及び記録媒体 | |
WO2022234588A1 (en) | Systems and methods to acquire three dimensional images using spectral information | |
CN117355724A (zh) | 膜厚测定装置及膜厚测定方法 | |
EP3861325A1 (en) | Method and apparatus for surface plasmon resonance imaging |
Legal Events

Date | Code | Title | Description
---|---|---|---
| WWE | Wipo information: entry into national phase | Ref document number: 201080045627.9; Country of ref document: CN
| WWE | Wipo information: entry into national phase | Ref document number: 2012508283; Country of ref document: JP
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10855024; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 13498659; Country of ref document: US
| WWE | Wipo information: entry into national phase | Ref document number: 112010005765; Country of ref document: DE; Ref document number: 1120100057656; Country of ref document: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 10855024; Country of ref document: EP; Kind code of ref document: A1