WO2022050047A1 - Lidar device, lidar system, distance measurement method, and program - Google Patents

Info

Publication number
WO2022050047A1
Authority
WO
WIPO (PCT)
Prior art keywords
lidar
laser beam
lidar device
optical path
ellipse
Prior art date
Application number
PCT/JP2021/030160
Other languages
French (fr)
Japanese (ja)
Inventor
Eiji Oba (大場 英史)
Original Assignee
Sony Semiconductor Solutions Corporation (ソニーセミコンダクタソリューションズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2022050047A1 publication Critical patent/WO2022050047A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements

Definitions

  • This disclosure relates to a LiDAR device, a LiDAR system, a distance measuring method and a program.
  • LiDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging)
  • Patent Document 1 discloses a scanning type range sensor that calculates and measures the distance to a peripheral object based on the reception result of the reflected wave of an electromagnetic wave.
  • The scanning type range sensor includes a motor, a cylindrical rotating body provided together with the motor, a light projector, a light receiver and a light receiving lens provided inside the cylindrical rotating body, and a mirror used for both light emission and light reception installed on the rotation axis of the cylindrical rotating body.
  • The motor, cylindrical rotating body, light projector, light receiver, light receiving lens, and light emitting/receiving mirror are installed in the device in a state where they are entirely housed inside a vertically long outer cover.
  • the LiDAR device is generally installed so as to protrude from an object such as a vehicle with the entire mechanism including the optical system covered with a cover body for protection from the surrounding environment.
  • The LiDAR device may protrude greatly from the installation target in order to improve the accuracy of emitting the laser beam and receiving the reflected wave.
  • the protruding structure of the LiDAR device limits the installation conditions such as placement and is conspicuous in appearance, which can greatly spoil the aesthetic appearance of the entire device.
  • In a LiDAR device that uses laser light of infrared wavelength, the cover body may include a window that transmits only light of infrared wavelength, and it is not easy to harmonize such a window, which has a distinctive hue, aesthetically with its surroundings.
  • the protruding structure of the LiDAR device increases the possibility of collision with surrounding objects and causes an increase in noise and air resistance due to the generation of turbulence.
  • dirt is likely to adhere to the protruding structure of the LiDAR device, and the cleaning mechanism for removing such dirt is also likely to be complicated in order to adapt to the protruding structure of the LiDAR device.
  • the present disclosure provides a LiDAR device, a LiDAR system, a distance measuring method, and a program that are advantageous for suppressing the degree of protrusion from the installation target.
  • One aspect of the present disclosure relates to a LiDAR apparatus including a light emitting unit that emits laser light, a reflector having a reflecting surface that has the shape of a part of an ellipse and reflects the laser light and an exit passage portion through which the laser light reflected by the reflecting surface passes, and an optical path adjusting unit that adjusts the traveling direction of the laser light so that the laser light passes through the position of one focal point of the ellipse before being reflected by the reflecting surface.
  • the reflective surface may have a partial shape of the outer peripheral surface of the spheroid.
  • the reflective surface may have the shape of a part of the side surface of the elliptical column.
  • the optical path adjusting unit may have a MEMS mirror.
  • the optical path adjustment unit may be arranged at the position of one focal point.
  • The reflector may have an entrance passage portion through which the laser beam can pass, and the optical path adjusting unit may adjust the traveling direction of the laser beam outside the reflector so that the laser beam passes through the entrance passage portion, enters the inside of the reflector, and is then reflected by the reflecting surface.
  • the exit passage may be located at the position of the other focal point of the ellipse.
  • The LiDAR device may include a cover body having an exit window portion through which the laser light reflected by the reflecting surface passes, and the exit window portion may have a diameter smaller than the minor axis of the ellipse.
  • the exit window may be located at the position of the other focal point of the ellipse.
  • the cover body has a window support portion that supports the exit window portion, and the outer surface of the exit window portion and the outer surface of the window support portion may be connected to each other without a step.
  • the LiDAR device may include an optical element to which a laser beam is incident.
  • Another aspect of the present disclosure relates to a LiDAR system including a LiDAR device, which includes a light emitting unit that emits laser light, a reflector having a reflecting surface that has the shape of a part of an ellipse and reflects the laser light and an exit passage portion through which the laser light reflected by the reflecting surface passes, and an optical path adjusting unit that adjusts the traveling direction of the laser light so that the laser light passes through the position of one focal point of the ellipse and is reflected by the reflecting surface, and a LiDAR control unit that controls at least the optical path adjusting unit.
  • The LiDAR system may include an environment image acquisition unit that acquires a captured image of the surrounding environment, and the LiDAR control unit may identify a region of interest in the surrounding environment based on analysis of the captured image and control the optical path adjusting unit so that the laser light emitted from the LiDAR device travels toward the region of interest.
  • Another aspect of the present disclosure relates to a distance measuring method including a step of emitting a laser beam from a light emitting unit of a LiDAR apparatus, and a step of adjusting, by an optical path adjusting unit, the traveling direction of the laser beam so that the laser beam passes through the position of one focal point of an ellipse, is reflected by the reflecting surface of a reflector having the shape of a part of the ellipse, and then passes through an exit passage portion of the reflector.
  • The distance measuring method may include a step of identifying a region of interest in the surrounding environment based on analysis of a captured image of the surrounding environment, and the traveling direction of the laser beam may be adjusted by the optical path adjusting unit so that the laser beam travels toward the region of interest.
  • FIG. 1 is a plan view showing a schematic configuration of an example of a LiDAR device.
  • FIG. 2 is a perspective view of the reflector shown in FIG. 1.
  • FIG. 3 is a side view showing an arrangement example of the light emitting unit and the optical path adjusting unit, and the reflective surface is hatched.
  • FIG. 4 is a diagram showing an example of a reflector having a spheroidal shape, in which the dotted line indicates a position corresponding to the short axis of the ellipse, and the reflecting surface is hatched.
  • FIG. 5 is a plan view showing a schematic configuration of a modified example of the LiDAR device.
  • FIG. 6 is a plan view showing a schematic configuration of another modification of the LiDAR apparatus, and the cover body is shown in cross section.
  • FIG. 7 is a functional block diagram showing an example of a control configuration of a LiDAR system.
  • FIG. 8 is a flowchart showing an example of the distance measuring method.
  • FIG. 9 is a flowchart showing an example in which the distance measuring method shown in FIG. 8 is applied to a vehicle driving technique (particularly a technique for preventing the vehicle from coming into contact with an obstacle).
  • LiDAR device: First, a typical example of the LiDAR device 11 will be described.
  • FIG. 1 is a plan view showing a schematic configuration of an example of the LiDAR device 11.
  • FIG. 2 is a perspective view of the reflector 22 shown in FIG. 1, and the reflective surface 32 is hatched.
  • the LiDAR device 11 shown in FIG. 1 includes a light emitting unit 21 including a light source of the laser beam L, a reflector 22 having a reflecting surface 32, and an optical path adjusting unit 23 for adjusting the traveling direction of the laser beam L.
  • the light emitting unit 21 emits laser light L.
  • the light emitting unit 21 shown in FIG. 1 is provided inside the reflector 22 so as to face the reflecting surface 32, and emits laser light L toward the optical path adjusting unit 23.
  • The laser beam L emitted horizontally from the light emitting unit 21 has its traveling direction variably changed by the optical path adjusting unit 23 and travels in the horizontal direction toward various points of the reflecting surface 32.
  • the arrangement position of the light emitting unit 21 is not limited to the example shown in FIG.
  • The light emitting unit 21 may be provided at a position not facing the reflecting surface 32, and as shown in FIG. 3, the light emitting unit 21 and the optical path adjusting unit 23 may be provided at different positions in the height direction.
  • the traveling direction of the laser beam L emitted upward (that is, directly above) from the light emitting unit 21 is changed by 90 degrees by the optical path adjusting unit 23.
  • the traveling direction of the laser beam L is variably changed by the optical path adjusting unit 23, and the laser beam L travels in the horizontal direction toward various points of the reflecting surface 32.
  • The reflector 22 has a reflecting surface 32 that reflects the laser light L traveling from the optical path adjusting unit 23, and an exit passage portion 33 through which the laser light L reflected by the reflecting surface 32 passes.
  • the reflective surface 32 of this embodiment has a shape of a part of an ellipse. That is, the shape formed when the reflecting surface 32 is cut vertically is an elliptical partial shape.
  • the first focal point F1 (that is, one focal point) and the second focal point F2 (that is, the other focal point) are located on the long axis A1 of the ellipse formed by the reflecting surface 32.
  • the second focal point F2 is closer to the exit passage portion 33 than the first focal point F1.
  • The reflecting surface 32 shown in FIGS. 1 and 2 has the shape of a part of the side surface of an elliptical column, like a rectangular flat plate curved into an elliptical arc.
  • The side surface of an elliptical column forms an ellipse when cut perpendicular to the height of the elliptical column.
  • the shape of the reflective surface 32 is not limited to the examples shown in FIGS. 1 and 2.
  • the reflecting surface 32 may have a shape of a part of the outer peripheral surface of the spheroid.
  • A spheroid is a solid of revolution obtained by rotating an ellipse about its major axis or minor axis, and has a three-dimensional ellipsoidal shape.
  • the exit passage portion 33 can be provided at an arbitrary position through which the laser beam L reflected by the reflection surface 32 passes.
  • the exit passage portion 33 shown in FIG. 1 is arranged at the position of the second focal point F2 of the ellipse formed by the reflecting surface 32.
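The arrangement above rests on the focal property of an ellipse: a ray that passes through one focal point is reflected by the elliptical surface through the other focal point, which is why the exit passage portion 33 can sit at the second focal point F2. The following is a minimal numeric sketch of that property (the semi-axis values are arbitrary illustrative choices, not dimensions from the patent):

```python
import math

# Illustrative ellipse x^2/a^2 + y^2/b^2 = 1 (values are not from the patent)
a, b = 5.0, 3.0
c = math.sqrt(a * a - b * b)           # distance from center to each focus
F1, F2 = (-c, 0.0), (c, 0.0)           # first and second focal points

def reflect_through_f1(theta):
    """Shoot a ray from F1 at angle theta, reflect it off the ellipse,
    and return the hit point and the reflected direction."""
    dx, dy = math.cos(theta), math.sin(theta)
    # Intersect F1 + t*(dx, dy) with the ellipse (positive root, F1 is inside)
    A = (dx / a) ** 2 + (dy / b) ** 2
    B = 2 * (F1[0] * dx / a ** 2 + F1[1] * dy / b ** 2)
    C = (F1[0] / a) ** 2 + (F1[1] / b) ** 2 - 1
    t = (-B + math.sqrt(B * B - 4 * A * C)) / (2 * A)
    px, py = F1[0] + t * dx, F1[1] + t * dy
    # Outward normal at (px, py) is proportional to (px/a^2, py/b^2)
    nx, ny = px / a ** 2, py / b ** 2
    n = math.hypot(nx, ny)
    nx, ny = nx / n, ny / n
    dot = dx * nx + dy * ny
    rx, ry = dx - 2 * dot * nx, dy - 2 * dot * ny   # mirror reflection
    return (px, py), (rx, ry)

# Every reflected ray points from its hit point straight toward F2
for theta in (0.3, 1.2, 2.5):
    (px, py), (rx, ry) = reflect_through_f1(theta)
    to_f2 = (F2[0] - px, F2[1] - py)
    assert abs(rx * to_f2[1] - ry * to_f2[0]) < 1e-9   # collinear
    assert rx * to_f2[0] + ry * to_f2[1] > 0           # same direction
```

Whatever direction the optical path adjusting unit at F1 chooses, the reflected ray converges on F2, so F2 behaves as a virtual scanning center.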
  • the optical path adjusting unit 23 adjusts the traveling direction of the laser beam L so that the laser beam L is reflected by the reflecting surface 32 after passing through the position of the first focal point F1 of the ellipse formed by the reflecting surface 32.
  • the optical path adjusting unit 23 can be installed at various positions on the optical path of the laser beam L from the light emitting unit 21.
  • the optical path adjusting unit 23 shown in FIG. 1 is arranged at the position of the first focal point F1 of the ellipse formed by the reflecting surface 32, and is provided inside the reflecting body 22 so as to face the reflecting surface 32.
  • the optical path adjusting unit 23 of this example has a MEMS mirror (that is, a MEMS type mirror system), and can variably reflect the laser beam L in various directions.
  • The MEMS mirror is compact and can be driven at a low voltage, and its reflection mirror can be swung quickly and with high positional accuracy while the inertial force acting on it is kept small.
  • the specific configuration of the MEMS mirror is not limited.
  • The swing drive of the reflection mirror of the MEMS mirror may be performed via a fixed rotation axis or via a floating rotation axis.
  • the configuration and installation position of the optical path adjusting unit 23 are not limited to the above example.
  • The optical path adjusting unit 23 may have a laser light scanning mechanism other than the MEMS mirror, and may be realized by, for example, a galvanometer mirror system. Further, the optical path adjusting unit 23 may be provided outside the reflector 22 so as not to face the reflecting surface 32.
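A point of reference for any mirror-based scanner (MEMS or galvanometer, not specific to this patent): tilting a flat mirror by a mechanical angle θ deflects the reflected beam by an optical angle 2θ, so a modest mirror swing covers twice that angular range. A short sketch, assuming a fixed incoming beam with illustrative tilt values:

```python
import math

def reflect(d, n):
    """Reflect 2D direction vector d about unit normal n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def beam_angle_after_tilt(tilt_rad):
    """Incoming beam fixed along +x; mirror normal tilted by tilt_rad
    away from -x. Returns the angle of the reflected beam."""
    d = (1.0, 0.0)
    n = (-math.cos(tilt_rad), -math.sin(tilt_rad))
    r = reflect(d, n)
    return math.atan2(r[1], r[0])

# Optical deflection is twice the mechanical tilt
base = beam_angle_after_tilt(0.0)
for tilt in (0.05, 0.10, 0.20):
    # math.remainder folds the angle difference into (-pi, pi]
    swing = math.remainder(beam_angle_after_tilt(tilt) - base, 2 * math.pi)
    assert abs(swing - 2 * tilt) < 1e-12
```

This doubling is why a small, low-inertia MEMS mirror can still sweep a usefully wide scan range.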
  • FIG. 5 is a plan view showing a schematic configuration of a modified example of the LiDAR device 11.
  • the reflector 22 of the LiDAR device 11 shown in FIG. 5 has, in addition to the above-mentioned reflecting surface 32 and the exit passing portion 33, an inlet passing portion 34 provided at the position of the first focal point F1 and through which the laser beam L can pass.
  • The light emitting unit 21 and the optical path adjusting unit 23 are provided outside the reflector 22. The light emitting unit 21 is fixed as a whole, but the emission position and the emission direction of the laser beam L are variable.
  • the optical path adjusting unit 23 is fixedly provided, and can be configured by, for example, a microprism or a diffraction grating.
  • The laser beam L emitted from the light emitting unit 21 outside the reflector 22, with its traveling direction adjusted by the optical path adjusting unit 23, passes through the entrance passage portion 34 into the inside of the reflector 22, is then reflected by the reflecting surface 32, and is emitted from the exit passage portion 33.
  • The optical path adjusting unit 23 basically directs almost all of the laser light L toward the entrance passage portion 34 regardless of the incident position of the laser light L from the light emitting unit 21.
  • the specific shape and size of the entrance passage portion 34 are not limited.
  • It is preferable that the shape and size (for example, the opening diameter) of the entrance passage portion 34 be determined so as to correspond to the desired beam shape and desired beam diameter of the laser beam L passing through the entrance passage portion 34.
  • Since the traveling direction of the laser light L is adjusted according to the combination of the light emitting unit 21 (particularly the emission mode of the laser light L) and the optical path adjusting unit 23, the light emitting unit 21 functionally also forms a part of the optical path adjusting unit 23.
  • the LiDAR device 11 may further include an optical element such as a lens to which the laser beam L emitted from the light emitting unit 21 is incident.
  • the installation position of such an optical element is not limited, and one type or a plurality of types of optical elements can be appropriately installed on the optical path of the laser beam L.
  • An optical element may be provided at a position where the laser beam L after being reflected by the reflecting surface 32 is incident, or at a position where the laser light L before being reflected by the reflecting surface 32 is incident.
  • the optical element 26 may be arranged at the position of the second focal point F2 of the ellipse formed by the reflecting surface 32. In this case, it is possible to optically correct all of the laser light L reflected by the reflecting surface 32 by the single optical element 26. Further, an optical element may be installed between the light emitting unit 21 and the optical path adjusting unit 23, and the laser beam L in a state where the optical characteristics are adjusted by the optical element may be incident on the optical path adjusting unit 23.
  • the specific configuration and function of the optical element are not limited.
  • The optical element 26 can change the beam diameter of the laser beam L, collimate the laser beam L, adjust the traveling direction of the laser beam L, and adjust (for example, enlarge) the traveling range (that is, the angle range) of the laser beam L.
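As one concrete illustration of such beam shaping (an assumption for illustration, not a configuration stated in the patent), a Keplerian beam expander built from two thin lenses scales a collimated beam's diameter by the ratio of the lens focal lengths and its divergence by the inverse factor:

```python
def expand_beam(d_in_mm, f1_mm, f2_mm):
    """Keplerian expander: two thin lenses separated by f1 + f2.
    f1_mm and f2_mm are lens focal lengths (not the ellipse foci);
    output diameter scales by f2/f1."""
    return d_in_mm * (f2_mm / f1_mm)

# Illustrative numbers only: a 2 mm beam through a 25 mm / 100 mm pair
d_out = expand_beam(d_in_mm=2.0, f1_mm=25.0, f2_mm=100.0)
assert d_out == 8.0   # 4x expansion

# Divergence shrinks by the inverse factor, so the product of beam
# diameter and divergence stays invariant:
div_in_mrad = 1.2
div_out_mrad = div_in_mrad * (25.0 / 100.0)
assert abs(div_out_mrad - 0.3) < 1e-12
```

The inverse scaling of diameter and divergence is the trade-off behind "adjusting the angle range" with a passive element: widening the beam narrows its spread, and vice versa.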
  • the LiDAR device 11 may include a cover body that covers the exit passage portion 33 of the reflector 22.
  • The cover body 25 shown in FIG. 6 has an exit window portion 38 through which the laser beam L reflected by the reflecting surface 32 passes, and a window support portion 39 that supports the exit window portion 38.
  • the outer surface of the exit window portion 38 (that is, the surface exposed to the outside) and the outer surface of the window support portion 39 are connected to each other without a step and are arranged on the same plane.
  • the exit window portion 38 shown in FIG. 6 is arranged at the position of the second focal point F2, and has a diameter smaller than the minor axis of the ellipse formed by the reflecting surface 32 (length in the vertical direction in FIG. 6).
  • the exit window portion 38 can be installed at an arbitrary position on the optical path of the laser beam L after being reflected by the reflecting surface 32, and may be provided at a position other than the position of the second focal point F2.
  • the exit window portion 38 may be arranged downstream of the second focal point F2 in the traveling direction of the laser beam L.
  • the LiDAR device 11 further includes a light receiving unit (see FIG. 7 described later) that receives the reflected light of the laser light L from the surrounding environment (that is, the laser light L scattered in the surrounding environment).
  • Information on the surrounding environment can be obtained by analyzing the light reception result of the reflected light in the light receiving unit. For example, based on that analysis it is possible to derive whether an object or a person exists in the vicinity, the direction and distance to a nearby object, and the nature of the object.
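The distance itself is commonly derived from time-of-flight: the round-trip delay t between emission and reception gives d = c·t/2. A minimal sketch (the helper name and the sample delay are illustrative; the patent does not specify the computation):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s):
    """Distance to the reflecting object from the round-trip time:
    the light travels out and back, hence the division by two."""
    return C * round_trip_s / 2.0

# A 200 ns round trip corresponds to roughly 30 m
d = tof_distance_m(200e-9)
assert abs(d - 29.9792458) < 1e-9
```

At these scales, one nanosecond of timing error translates to about 15 cm of range error, which is why high-speed detection electronics matter.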
  • the light receiving unit may be provided integrally with the light emitting unit 21 or may be provided separately from the light emitting unit 21, and the installation position and mode of the light receiving unit are not limited.
  • the distance measuring method described below is performed by driving the LiDAR device 11 under the control of a control device (see FIG. 7 described later).
  • The distance measuring method performed by the LiDAR device 11 includes a step of emitting laser light L from the light emitting unit 21, a step of adjusting the traveling direction of the laser light L by the optical path adjusting unit 23, and a step of receiving reflected light by the light receiving unit.
  • The traveling direction of the laser light L is adjusted by the optical path adjusting unit 23 so that the laser light L passes through the first focal point F1, is reflected by the reflecting surface 32, and then passes through the exit passage portion 33.
  • the second focal point F2 can be regarded as a virtual scanning center of the laser beam L, and the laser beam L is considered to be emitted in all directions of the scanning range centering on the second focal point F2.
  • Laser light scanning can be substantially performed centering on the second focal point F2 without installing equipment such as the optical path adjusting unit 23 at the second focal point F2. Therefore, the device configuration at the second focal point F2 (that is, the substantial center of laser light scanning) can be simplified, and protruding structures such as a wide-range cover surrounding the entire device including the optical path adjusting unit 23 need not be installed at the second focal point F2.
  • When the above-mentioned LiDAR device 11 is installed on a target such as a vehicle, the protrusion of the LiDAR device 11 from the installation target can be suppressed by designing the externally exposed portion of the LiDAR device 11 based on the position of the second focal point F2.
  • Laser light scanning can be performed while emitting the laser light L from a very limited range at the second focal point F2. Therefore, when the exit window portion 38 (see FIG. 6) through which the laser beam L passes is installed at or near the second focal point F2, the exit window portion 38 can be made very small while still allowing laser light scanning over a wide range. As described above, the size of the exit window portion 38 can be determined according to the scanning range of the laser beam L. Further, even when the exit window portion 38 has a peculiar color, it can be made inconspicuous from the outside by making it sufficiently small.
  • In a conventional LiDAR device, components such as the rotation mechanism are often covered with a black cylindrical cover that transmits infrared light but does not transmit visible light.
  • When such a LiDAR device is installed on an object such as a vehicle, the original design of the installation target is considerably impaired by the LiDAR device.
  • the degree of freedom in installing the LiDAR device 11 can be improved by suppressing the degree of protrusion of the LiDAR device 11 from the installation target and making the externally exposed portion small.
  • the LiDAR device 11 can be installed in various forms. In particular, the influence on the design of the installation target such as a vehicle can be suppressed to a small extent, and the LiDAR device 11 can be appropriately installed while being in harmony with the peripheral design at a high level.
  • By keeping the area of the externally exposed portion to less than 50% of the area of the entire design, the conspicuousness of the externally exposed portion can be effectively suppressed. The above-mentioned LiDAR device 11 can amply satisfy this condition of less than 50%.
  • the optical path adjusting unit 23 can be arranged in the inner space of the reflector 22 in a state of being protected by the reflector 22.
  • By configuring the optical path adjusting unit 23 with a compact and lightweight MEMS mirror, the optical path adjusting unit 23 can be installed easily and appropriately even if the inner space of the reflector 22 is small.
  • Configuring the optical path adjusting unit 23 with a compact and lightweight MEMS mirror also enables higher-speed scanning of the laser beam L compared with a housing-rotation type LiDAR device or a low-speed-rotation type LiDAR device with an integrated scanner head.
  • LiDAR system: Next, a LiDAR system using the above-mentioned LiDAR device 11 will be described.
  • the installation target of the LiDAR device 11 is not limited to the vehicle, and the following description can be similarly applied when the LiDAR device 11 is mounted on another target (including a moving body and a stationary body).
  • FIG. 7 is a functional block diagram showing an example of the control configuration of the LiDAR system 10.
  • the LiDAR system 10 shown in FIG. 7 includes a LiDAR device 11, a control device 15, and a photographing device 50.
  • The control device 15 and the photographing device 50 are mounted on the target on which the LiDAR device 11 is installed (in this example, a vehicle (not shown)), but they do not have to be mounted on that target.
  • The control device 15 and the photographing device 50 may be provided at a position away from the target on which the LiDAR device 11 is installed, and data may be transmitted to and received from other devices via wireless signals.
  • the photographing device 50 is an environment image acquisition unit that acquires a photographed image D1 (that is, image data) of the surrounding environment, and includes a solid-state image sensor such as a CCD image sensor or a CMOS image sensor.
  • the photographing device 50 shoots under the control of the control device 15 (particularly the photographing control unit 16), and transmits the photographed image D1 to the control device 15 (particularly the photographing control unit 16).
  • the photographing device 50 may have any configuration, and may be configured by, for example, a monocular camera or a stereo camera.
  • the control device 15 has a LiDAR control unit 12 and an imaging control unit 16.
  • the LiDAR control unit 12 controls the LiDAR device 11 (at least the optical path adjustment unit 23).
  • the LiDAR control unit 12 shown in FIG. 7 controls the light emitting unit 21 and the optical path adjusting unit 23.
  • the light emitting unit 21 emits the laser beam L based on the control signal D2 sent from the LiDAR control unit 12.
  • the optical path adjusting unit 23 adjusts the traveling direction of the laser beam L based on the control signal D2 sent from the LiDAR control unit 12.
  • the LiDAR control unit 12 receives the light receiving result D3 of the reflected light of the laser light L from the surrounding environment from the light receiving unit 24 of the LiDAR device 11. Then, the LiDAR control unit 12 analyzes the light receiving result D3 of the light receiving unit 24 and acquires information regarding the state of the surrounding environment.
  • the LiDAR control unit 12 of the present embodiment also functions as an area of interest specifying unit that specifies an area of interest in the surrounding environment based on the analysis of the captured image D1.
  • the photographing control unit 16 controls the photographing device 50 and stores the photographed image D1 sent from the photographing device 50 in a storage unit (not shown). Further, the photographing control unit 16 shown in FIG. 7 transmits the captured image D1 to the LiDAR control unit 12.
  • The LiDAR system 10 having the above configuration can perform selective priority detection processing according to the situation by combining a wide-range coarse scan with a fine scan of a region of interest. That is, the wide-range coarse scan performs high-speed light detection and ranging at a relatively coarse resolution over a wide range of the surrounding environment. Based on the results of this wide-range coarse scan, a region defined around the position of an obstacle candidate (for example, an object or a person) is specified as a region of interest (ROI). Then, by performing a fine scan limited to the region of interest, detailed information on the obstacle candidate (its specific type, such as a human, an animal, or an object) is identified. In this way, by performing the time-consuming fine scan only on the region of interest, light detection and ranging can be performed over a wide area while detailed information on obstacle candidates is acquired quickly and with high accuracy.
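The coarse-then-fine strategy above can be sketched as a simple control loop: sample the full field of view at a coarse angular step, turn each return into a small region of interest, then rescan only those regions at a finer step. All names, angular values, thresholds, and the toy environment below are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class Region:
    az_min: float  # azimuth bounds of the region, degrees
    az_max: float

def coarse_scan(measure, az_span=(-60.0, 60.0), step=5.0):
    """Sample the field of view at a coarse angular step.
    `measure(az)` returns a range in metres, or None for no return."""
    az, hits = az_span[0], []
    while az <= az_span[1]:
        r = measure(az)
        if r is not None:
            hits.append((az, r))
        az += step
    return hits

def regions_of_interest(hits, margin=2.5):
    """Turn each coarse hit into a small ROI around its azimuth."""
    return [Region(az - margin, az + margin) for az, _ in hits]

def fine_scan(measure, roi, step=0.5):
    """Rescan one ROI at a finer angular step."""
    az, pts = roi.az_min, []
    while az <= roi.az_max:
        r = measure(az)
        if r is not None:
            pts.append((az, r))
        az += step
    return pts

# Toy environment: a single obstacle spanning azimuths 8-12 deg, 20 m away
def measure(az):
    return 20.0 if 8.0 <= az <= 12.0 else None

hits = coarse_scan(measure)                       # fast, coarse sweep
rois = regions_of_interest(hits)                  # one small ROI found
detail = [fine_scan(measure, roi) for roi in rois]
assert len(rois) == 1 and len(detail[0]) > len(hits)
```

The fine scan spends its time only inside the ROI, which is the point of the scheme: wide coverage at coarse resolution, dense sampling only where an obstacle candidate was found.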
  • Wide-range coarse scan (Coarse Scan)
  • FIG. 8 is a flowchart showing an example of the distance measuring method.
  • First, a wide-range coarse scan is performed by the LiDAR device 11 (S1 in FIG. 8). That is, the LiDAR control unit 12 controls the optical path adjusting unit 23 so that the laser beam L emitted from the LiDAR device 11 travels over the scan range of the wide-range coarse scan in the surrounding environment. As a result, the light receiving result D3 of the light receiving unit 24 is sent to the LiDAR control unit 12.
  • The wide-range coarse scan performed here is executed at a relatively coarse resolution and at high speed; its main purpose is to search for obstacle candidates in the scan range and to detect the position (for example, direction and distance) of each obstacle candidate. Therefore, the wide-range coarse scan does not detect specific detailed information about the obstacle candidates.
  • The scan range (for example, azimuth and viewing angle) of the wide-range coarse scan is not limited, and may be a predetermined range or a range variably determined according to the situation.
  • The LiDAR control unit 12 may adaptively determine the scan range of the wide-range coarse scan according to map information such as an LDM (Local Dynamic Map) and the vehicle speed.
  • the LiDAR control unit 12 can acquire the map information and the vehicle speed information by any method; for example, it may acquire them from another control unit (for example, a speed monitoring unit) or a memory included in the control device 15.
  • the search granularity (for example, scan resolution) of the wide-range coarse scan is not limited; it may be fixedly determined in advance, or the LiDAR control unit 12 may variably determine the search granularity of the wide-range coarse scan according to the situation (for example, the search application).
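The adaptive determination of scan range and granularity described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosed embodiment; the function name, field-of-view values, and speed thresholds are all hypothetical assumptions:

```python
# Hypothetical sketch: adapting the coarse-scan range and granularity to
# vehicle speed and road context (e.g. derived from LDM map information).
# All thresholds and returned values are invented for illustration.
def coarse_scan_params(speed_kmh: float, urban: bool) -> dict:
    if speed_kmh > 80:                    # highway: look far, narrow FOV
        return {"fov_deg": 40, "max_range_m": 250, "grid_deg": 1.0}
    if urban:                             # city: wide FOV, finer grid
        return {"fov_deg": 120, "max_range_m": 100, "grid_deg": 0.5}
    return {"fov_deg": 90, "max_range_m": 150, "grid_deg": 0.8}

print(coarse_scan_params(100, False)["max_range_m"])  # → 250
```

The same pattern could be extended with further inputs (weather, time of day) without changing the control flow.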
  • Wide-range coarse scanning is similar to the way a person observes the surrounding environment to acquire an overall picture, but it differs from such human observation in that it requires highly accurate detection of the position (for example, orientation and distance) of the detection target (that is, the obstacle candidate).
  • the photographing range of the photographing device 50 is not limited, and may be a predetermined range, or may be a range variably determined by the photographing control unit 16 or the LiDAR control unit 12 according to the situation.
  • the photographing range of the photographing device 50 may be larger than, smaller than, or the same as, for example, the scan range of the wide-range coarse scan, and may be adaptively determined according to the map information and the vehicle speed.
  • the photographing range of the photographing device 50 at least partially overlaps the scan range of the wide-range coarse scan.
  • the acquisition of a captured image of the surrounding environment by the photographing device 50 may be performed after the wide-range coarse scan as shown in FIG. 8, but it can be performed at any timing. That is, the acquisition of the captured image of the surrounding environment by the photographing device 50 may be performed prior to the wide-range coarse scan, or simultaneously with it.
  • the LiDAR control unit 12 identifies one or a plurality of regions of interest based on the result of the wide-range coarse scan (that is, the light receiving result D3 of the light receiving unit 24) (S3).
  • the region of interest is a limited region determined based on the position of the detection target (that is, the obstacle candidate) derived from the light receiving result D3, and is smaller than both the scan range of the wide-range coarse scan and the photographing range of the photographing device 50.
  • the LiDAR control unit 12 of the present embodiment identifies the region of interest based on the combination of the result of the wide-range coarse scan and the analysis result of the captured image D1. For example, the LiDAR control unit 12 may analyze the captured image D1 to acquire area information (for example, heat map information) indicating the possibility of existence of a detection target (that is, an obstacle candidate), and determine the region of interest by referring to that area information.
  • the captured image acquired by the visible light image sensor is excellent in azimuth resolution, while its depth (distance) resolution is comparatively limited.
  • the LiDAR control unit 12 can accurately determine the region of interest by using the area information obtained from the analysis of the captured image D1 having such characteristics in combination with the result of the wide-range coarse scan.
  • the specific derivation method and information form of heat map information are not limited.
  • the LiDAR control unit 12 can derive heat map information indicating the possibility of existence of a detection target (that is, an obstacle candidate) within the photographing range by applying a semantic segmentation technique to the captured image D1.
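The combination of coarse-scan hits with image-derived area information can be illustrated as follows. This is a hedged Python sketch under invented assumptions: the grid layout, the `select_rois` name, and the threshold are hypothetical, and the heat map stands in for the output of a real segmentation model:

```python
# Hypothetical sketch: combining a per-cell "object likelihood" map from
# image analysis (e.g. semantic segmentation) with coarse-scan LiDAR hits
# to select regions of interest.  Grid cells index (row, col) sectors of
# the shared field of view; scores and threshold are illustrative.
def select_rois(coarse_hits, heat_map, threshold=0.5):
    """coarse_hits: set of (row, col) cells where the coarse scan saw a
    return; heat_map: dict (row, col) -> likelihood from the image."""
    return sorted(cell for cell in coarse_hits
                  if heat_map.get(cell, 0.0) >= threshold)

hits = {(0, 1), (2, 3), (4, 4)}
heat = {(0, 1): 0.9, (2, 3): 0.2, (4, 4): 0.7}
print(select_rois(hits, heat))   # → [(0, 1), (4, 4)]
```

Cells that the coarse scan flags but the image analysis scores low (here `(2, 3)`) are skipped, which is one way the two sensors can complement each other.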
  • the analysis of the captured image D1 and the acquisition of area information such as heat map information described above may be performed by a device element other than the LiDAR control unit 12 (for example, the photographing control unit 16).
  • in that case, the area information indicating the possibility of existence of the detection target (that is, the obstacle candidate) is sent from that device element to the LiDAR control unit 12.
  • the LiDAR device 11 performs a fine scan of the region of interest (S4). That is, the LiDAR control unit 12 controls the optical path adjusting unit 23 so that the laser beam L emitted from the LiDAR device 11 travels toward the region of interest in the surrounding environment. As a result, the light receiving result D3 of the light receiving unit 24 is sent to the LiDAR control unit 12.
  • the region-of-interest fine scan (S4) is performed at a higher resolution than the wide-range coarse scan (S1), and provides scan data from which detailed information on the obstacle candidates can be detected.
  • the higher the scan resolution, the longer the time required for scanning; however, since the scan range of the region-of-interest fine scan is limited to the region of interest, it can be completed in a short time.
  • the scan operation of the region-of-interest fine scan is similar to the target identification operation exhibited by the human eyeball in so-called saccadic behavior (that is, high-speed movement of the central visual field).
  • the human eye can search for visual information at an extremely high speed compared to the movement speed of the human neck or body.
  • in the region-of-interest fine scan process, a high-resolution scan of the selected region (that is, the region of interest), determined based on the result of the wide-range coarse scan and area information such as heat map information, can be performed on demand, actively, and in a short time.
  • the search granularity (for example, scan resolution) of the region-of-interest fine scan is not limited; it may be fixedly determined in advance, or the LiDAR control unit 12 may variably determine it according to the situation (for example, the search application).
  • the LiDAR control unit 12 acquires peripheral information based on the result of the fine scan of the region of interest (that is, the light receiving result D3 of the light receiving unit 24) (S5). That is, the LiDAR control unit 12 acquires specific detailed information on each obstacle candidate based on the result of the fine scan. For example, whether an obstacle candidate is classified as a human, an animal, a building, or another object is determined based on the results of the region-of-interest fine scan.
  • the LiDAR control unit 12 acquires analysis information based on the peripheral information (S6).
  • the LiDAR control unit 12 analyzes the peripheral information (for example, information on the type of obstacle candidate), and can acquire, as analysis information, determinations such as "is there a high possibility that this obstacle candidate is an actual obstacle?" or "is there a high possibility that the vehicle will come into contact with this obstacle candidate?".
  • the process of acquiring peripheral information (S5) and the process of acquiring analysis information (S6) are not necessarily clearly distinguished from each other, and may in fact be performed simultaneously. Likewise, peripheral information and analysis information are not always clearly distinguished from each other.
  • the LiDAR control unit 12 can acquire time-series search information indicating the transition of the state of an obstacle candidate over time.
  • the LiDAR control unit 12 may acquire peripheral information and analysis information based on the time-series search information acquired in this way.
  • the LiDAR control unit 12 may derive a predicted locus of an obstacle candidate based on the time-series search information, and may derive a traveling path for avoiding a collision between the vehicle and the obstacle candidate based on the predicted locus.
  • the information derived from the time-series search information in this way may be notified to the driver via a notification device (for example, a display or a voice guide; not shown), or may be used as basic information for a driving assist operation.
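A predicted locus of the kind described above can be computed from time-series observations. The sketch below is illustrative only: it assumes constant velocity over the prediction horizon, and the function name and argument layout are hypothetical, not taken from the disclosure:

```python
# Hypothetical sketch: predicting an obstacle candidate's future positions
# from its last two observed (x, y) positions, assuming constant velocity.
def predict_locus(positions, dt, horizon_steps):
    """positions: last two (x, y) observations taken dt seconds apart."""
    (x0, y0), (x1, y1) = positions
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt            # estimated velocity
    return [(x1 + vx * dt * k, y1 + vy * dt * k)       # extrapolated points
            for k in range(1, horizon_steps + 1)]

print(predict_locus([(0.0, 0.0), (1.0, 0.5)], 0.1, 2))
# → [(2.0, 1.0), (3.0, 1.5)]
```

A real system would typically use a filter (e.g. more observations and a motion model per target type), but the extrapolation step has this basic shape.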
  • the LiDAR control unit 12 may determine whether an object detected as an obstacle candidate has a high possibility of becoming an actual obstacle from a practical point of view.
  • an object such as paper flying in the air (that is, an object that is unlikely to interfere with vehicle running) is also detected as an obstacle candidate.
  • the LiDAR control unit 12 may apply a specific analysis method to the result of the region-of-interest fine scan to determine the possibility that an obstacle candidate is an actual obstacle. For example, the LiDAR control unit 12 may derive, from the result of the fine scan, information on whether the obstacle candidate is in contact with the road surface, the size of the obstacle candidate, and its temporal behavior, and may determine from these derivation results the possibility that the obstacle candidate is an actual obstacle.
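The three cues named above (road contact, size, temporal behavior) can be combined into a simple plausibility filter. The following Python sketch is purely illustrative; the function name and every threshold are invented assumptions, not values from the disclosure:

```python
# Hypothetical sketch: filtering obstacle candidates by road contact,
# size, and temporal behavior.  All thresholds are illustrative.
def is_real_obstacle(height_above_road_m, size_m, upward_speed_m_s):
    on_road = height_above_road_m < 0.05    # in contact with the road surface
    big_enough = size_m > 0.10              # ignore tiny debris
    not_airborne = upward_speed_m_s < 0.5   # fluttering paper moves upward
    return on_road and big_enough and not_airborne

print(is_real_obstacle(0.0, 0.4, 0.0))   # → True  (e.g. a box on the road)
print(is_real_obstacle(1.2, 0.3, 2.0))   # → False (e.g. paper flying in the air)
```

This mirrors the example in the text: paper flying in the air is detected by the coarse scan but rejected as a practical obstacle after the fine scan.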
  • the LiDAR control unit 12 may generate a bird's-eye view outline distribution map based on the result of the wide-range coarse scan.
  • the bird's-eye view outline distribution map is map information showing the rough position (for example, direction and distance) of the obstacle candidate in the range including the planned traveling route of the own vehicle and the area near the planned traveling route.
  • as for the obstacle candidates reflected in the bird's-eye view outline distribution map, it does not matter whether or not they are likely to be actual obstacles from a practical point of view.
  • the LiDAR control unit 12 may perform a fine scan of the region of interest based on the bird's-eye view outline distribution map and the analysis result of the captured image D1. Based on the result of the region-of-interest fine scan, the LiDAR control unit 12 may determine whether or not to classify an obstacle candidate as a target that needs to be tracked over time (that is, a tracking target). For example, obstacle candidates recognized as vehicles (including motorcycles), bicycles, pedestrians, or animals as a result of the region-of-interest fine scan may be classified as tracking targets.
  • the LiDAR control unit 12 may perform a course prediction evaluation of the tracking target.
  • the LiDAR control unit 12 may perform the course prediction evaluation of a tracking target based on characteristics associated with its type. For example, if the tracking target is a vehicle, it is likely to follow the roadway; if it is a person, it is likely to follow the sidewalk; and if it is an animal, it is likely to move across the entire road (whether roadway or sidewalk).
  • the LiDAR control unit 12 may perform the determination of whether an obstacle candidate is classified as a tracking target, and the course prediction evaluation of the tracking target, taking into consideration not only the result of the fine scan of the region of interest but also the analysis result of the captured image D1. In this case, more detailed classification determination and course prediction evaluation are possible. For example, the LiDAR control unit 12 can obtain, from the analysis result of the captured image D1, information on the presence or absence of curbs on the roads of the planned travel route and the area near it, and information on the presence or absence of guardrails there.
  • FIG. 9 is a flowchart showing an example of a case where the distance measuring method shown in FIG. 8 is applied to a vehicle driving technique (particularly a technique for avoiding the vehicle coming into contact with an obstacle).
  • the LiDAR control unit 12 determines the possibility of contact between the vehicle and an obstacle candidate based on the peripheral information and analysis information (see S5 and S6 in FIG. 8) derived from the result of the region-of-interest fine scan (S11 in FIG. 9).
  • the contact possibility can be determined by comprehensively considering various information. For example, the possibility of contact between the vehicle and an obstacle candidate may be determined based on their relative position (for example, distance and direction).
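One common way to turn relative position into a contact-possibility judgment is a time-to-collision style check. The sketch below is a hedged illustration; the disclosure does not specify a formula, and the function name and the 3-second limit are assumptions:

```python
# Hypothetical sketch: a minimal time-to-collision (TTC) style contact
# check from relative distance and closing speed.  The TTC limit is an
# illustrative assumption, not a value from the disclosure.
def contact_likely(distance_m, closing_speed_m_s, ttc_limit_s=3.0):
    if closing_speed_m_s <= 0:          # candidate is not getting closer
        return False
    return distance_m / closing_speed_m_s < ttc_limit_s

print(contact_likely(20.0, 10.0))  # → True  (TTC = 2 s)
print(contact_likely(90.0, 10.0))  # → False (TTC = 9 s)
```

In practice such a check would be one input among many (direction, predicted loci, road geometry) to the comprehensive determination described above.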
  • the contact possibility determination is continuously repeated (S11).
  • when the LiDAR control unit 12 determines that the possibility of contact between the vehicle and the obstacle candidate is high (Y in S12), it determines, based on the peripheral information and analysis information, whether or not a contact avoidance operation by the vehicle is possible (S13).
  • the contact avoidance operation is not limited, but the operation of changing the vehicle course by the steering operation may be included in the contact avoidance operation.
  • when the LiDAR control unit 12 determines that the contact avoidance operation is possible (Y in S14), it causes the vehicle to execute the contact avoidance operation (S15).
  • when the contact avoidance operation is determined not to be possible (N in S14), the LiDAR control unit 12 determines whether or not a vehicle braking operation is possible based on the peripheral information and analysis information (S16).
  • when the LiDAR control unit 12 determines that the braking operation is possible (Y in S17), it causes the vehicle to execute the braking operation (S18).
  • when the LiDAR control unit 12 determines that the braking operation is not possible (N in S17), it causes the vehicle to perform an emergency operation.
  • the emergency operation is not limited, and for example, an operation of stopping the vehicle in an emergency and an operation of notifying the driver that an emergency situation has occurred may be included in the emergency operation.
  • the LiDAR control unit 12 determines whether or not the vehicle can continue to run, based on the peripheral information and analysis information (S20). When it determines that continued running is possible (Y in S21), it again determines the possibility of contact between the vehicle and the obstacle candidate based on the peripheral information and analysis information (S11).
  • when the LiDAR control unit 12 determines that continued running of the vehicle is not possible (N in S21), it issues a command signal for stopping the vehicle, and the vehicle is brought to a complete stop (S22).
  • the scan range of the wide-range coarse scan can be determined based on the lane on the planned driving route (that is, the driving lane) and the lane adjacent to the driving lane (that is, the adjacent lane).
  • obstacle candidates can be scanned at a resolution that allows them to be classified into falling objects, vehicles, motorcycles, and the like.
  • the above-mentioned distance measuring method can be performed when the vehicle travels on a city road where a building such as a house exists nearby and pedestrians may exist.
  • the above-mentioned distance measuring method can be performed when the vehicle travels in an accident-prone area with wild animals (for example, a country road at night) and the map information such as LDM is acquired in advance.
  • the scan range can be a three-dimensional space based on the position of the obstacle candidate identified from the result of the wide-range coarse scan.
  • the LiDAR control unit 12 can recognize the approximate spatial shape of the garage from the result of the wide-range coarse scan, and can also recognize the presence or absence and position of protrusions in the garage.
  • in the region-of-interest fine scan process, it is possible to scan at a resolution high enough to detect objects in the garage that the vehicle may come into contact with.
  • the time constraints are relatively loose, so it is possible to perform the fine scan of the region of interest at a very high resolution, taking more time.
  • since the above-mentioned LiDAR device 11 uses the reflective surface 32 having the shape of a part of an ellipse, it is advantageous for suppressing the degree of protrusion from an installation target such as a vehicle, and the LiDAR device 11 can even be installed so that it does not protrude from the installation target at all.
  • when the reflective surface 32 has the shape of a part of the outer peripheral surface of a spheroid (see FIG. 2), the reflective surface 32 can be easily formed and the space occupied by the reflective surface 32 can be kept small.
  • when the reflecting surface 32 has the shape of a part of the side surface of an elliptical column (see FIG. 4), the reflecting surface 32 can appropriately reflect the laser beam L traveling in essentially any direction and advance it toward the second focal point F2.
  • the LiDAR device 11 can perform adaptive and variable speed scanning.
  • by arranging the optical path adjustment unit 23 at the position of the first focal point F1, the laser beam L can appropriately pass through the first focal point F1.
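The focal property relied on here, namely that a ray passing through one focal point of an ellipse is reflected by the elliptical surface through the other focal point, can be checked numerically. The sketch below is illustrative only; the semi-axes and sample angles are arbitrary assumptions, not dimensions from the disclosure:

```python
import math

# Numerical check (illustrative) of the focal property used by the
# reflective surface: a ray leaving focus F1 and reflecting off an
# ellipse passes through focus F2.  Semi-axes a, b are arbitrary.
a, b = 2.0, 1.0
c = math.sqrt(a * a - b * b)            # distance from center to a focus
F1, F2 = (-c, 0.0), (c, 0.0)

def reflect_through_f2(theta, tol=1e-9):
    # Point P on the ellipse x^2/a^2 + y^2/b^2 = 1
    px, py = a * math.cos(theta), b * math.sin(theta)
    # Unit direction of the incoming ray, from F1 toward P
    dx, dy = px - F1[0], py - F1[1]
    n = math.hypot(dx, dy)
    dx, dy = dx / n, dy / n
    # Unit normal at P (gradient of the ellipse equation)
    nx, ny = px / (a * a), py / (b * b)
    m = math.hypot(nx, ny)
    nx, ny = nx / m, ny / m
    # Mirror reflection: r = d - 2 (d . n) n
    dot = dx * nx + dy * ny
    rx, ry = dx - 2 * dot * nx, dy - 2 * dot * ny
    # F2 lies on the reflected ray's line iff the cross product of
    # (F2 - P) with the reflected direction vanishes.
    cross = (F2[0] - px) * ry - (F2[1] - py) * rx
    return abs(cross) < tol

print(all(reflect_through_f2(t) for t in (0.3, 1.0, 2.0, 2.8, 4.0, 5.5)))
# → True
```

This is why placing the optical path adjustment unit at F1 and the exit passage at F2 works for any scan direction the mirror selects.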
  • the optical path adjustment unit 23 adjusts the traveling direction of the laser beam L outside the reflector 22 so that the laser beam L enters the inside of the reflector 22 via the entrance passage portion 34; this makes it possible to promote miniaturization of the reflector 22.
  • since the exit window portion 38 (see FIG. 6), through which the laser beam L reflected by the reflection surface 32 passes, has a diameter smaller than the minor axis of the ellipse formed by the reflection surface 32, the exit window portion 38 can be made small and inconspicuous.
  • in particular, by arranging the exit window portion 38 at the position of the second focal point F2, the size of the exit window portion 38 can be minimized.
  • the connection portion between the exit window portion 38 and the window support portion 39 can be configured to have a smooth outer surface without a protruding structure.
  • by providing the LiDAR device 11 with an optical element 26 as required, the laser beam L emitted from the LiDAR device 11 can be given desired optical characteristics.
  • by combining the LiDAR control unit 12 with the LiDAR device 11 provided with the reflecting surface 32 having the shape of a part of an ellipse, laser light scanning over a wide range of the surrounding environment can be performed while suppressing the degree of protrusion of the LiDAR system 10 from the installation target.
  • the technical categories that embody the above-mentioned technical ideas are not limited.
  • the above-mentioned technical idea may be embodied by a computer program for causing a computer to execute one or a plurality of procedures (steps) included in the above-mentioned distance measuring method.
  • the above-mentioned technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
  • the present disclosure may also have the following structure.
  • [Item 1] A LiDAR device comprising: a light emitting unit that emits laser light; a reflector including a reflective surface that has the shape of a part of an ellipse and reflects the laser light, and an exit passage portion through which the laser light reflected by the reflective surface passes; and an optical path adjusting unit that adjusts the traveling direction of the laser light so that the laser light is reflected by the reflective surface after passing through the position of one focal point of the ellipse.
  • [Item 2] The LiDAR device according to Item 1, wherein the reflective surface has the shape of a part of the outer peripheral surface of a spheroid.
  • [Item 3] The LiDAR device according to Item 1 or 2, wherein the reflective surface has the shape of a part of a side surface of an elliptical column.
  • [Item 4] The LiDAR device according to any one of Items 1 to 3, wherein the optical path adjusting unit has a MEMS mirror.
  • [Item 5] The LiDAR device according to any one of Items 1 to 4, wherein the optical path adjusting unit is arranged at the position of the one focal point.
  • [Item 6] The LiDAR device according to any one of Items 1 to 5, wherein the reflector has an entrance passage portion through which the laser light can pass, the optical path adjusting unit adjusts the traveling direction of the laser light outside the reflector, and the laser light passes through the entrance passage portion and enters the inside of the reflector before being reflected by the reflective surface.
  • [Item 7] The LiDAR device according to any one of Items 1 to 6, wherein the exit passage portion is arranged at the position of the other focal point of the ellipse.
  • [Item 8] The LiDAR device according to any one of Items 1 to 7, further comprising a cover body having an exit window portion through which the laser light reflected by the reflective surface passes, wherein the exit window portion has a diameter smaller than the minor axis of the ellipse.
  • [Item 9] The LiDAR device according to Item 8, wherein the exit window portion is arranged at the position of the other focal point of the ellipse.
  • [Item 10] The LiDAR device according to Item 8 or 9, wherein the cover body has a window support portion that supports the exit window portion, and the outer surface of the exit window portion and the outer surface of the window support portion are connected to each other without a step.
  • [Item 11] The LiDAR device according to any one of Items 1 to 10, comprising an optical element on which the laser light is incident.
  • [Item 12] A LiDAR system comprising: a LiDAR device including a light emitting unit that emits laser light, a reflector including a reflective surface that has the shape of a part of an ellipse and reflects the laser light and an exit passage portion through which the laser light reflected by the reflective surface passes, and an optical path adjusting unit that adjusts the traveling direction of the laser light so that the laser light is reflected by the reflective surface after passing through the position of one focal point of the ellipse; and a LiDAR control unit that controls at least the optical path adjusting unit.
  • [Item 13] The LiDAR system according to Item 12, further comprising an environment image acquisition unit that acquires a captured image of the surrounding environment, wherein the LiDAR control unit identifies a region of interest in the surrounding environment based on analysis of the captured image, and controls the optical path adjusting unit so that the laser light emitted from the LiDAR device travels toward the region of interest in the surrounding environment.
  • 10 LiDAR system, 11 LiDAR device, 12 LiDAR control unit, 21 light emitting unit, 22 reflector, 23 optical path adjustment unit, 26 optical element, 32 reflective surface, 33 exit passage portion, 34 entrance passage portion, 38 exit window portion, 39 window support portion, 50 photographing device, F1 first focal point, F2 second focal point, L laser light

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

[Problem] To provide a LiDAR device advantageous for suppressing the degree of protrusion from the object on which the LiDAR device is installed. [Solution] A LiDAR device comprises: a light emission unit for emitting laser light; a reflection body including a reflection surface that has the shape of a part of an ellipse and reflects the laser light, and an exit transmission part through which laser light reflected by the reflection surface passes; and an optical path adjustment unit for adjusting the travel direction of the laser light such that the laser light is reflected by the reflection surface after passing through the position of one of the focal points of the ellipse.

Description

LiDAR device, LiDAR system, distance measurement method, and program
The present disclosure relates to a LiDAR device, a LiDAR system, a distance measurement method, and a program.
Distance measurement technology using LiDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging) is attracting attention in various fields. For example, in the vehicle field, development of automated driving technology using LiDAR, which can detect the position and shape of surrounding objects with high accuracy, is underway.
Patent Document 1 discloses a scanning-type range sensor that calculates the distance to a surrounding object based on the reception result of a reflected electromagnetic wave. The scanning-type range sensor includes a cylindrical rotating body provided together with a motor; a light projector, a light receiver, and a light-receiving lens provided inside the cylindrical rotating body; and a combined projection/reception mirror installed on the rotation axis of the cylindrical rotating body. The motor, cylindrical rotating body, light projector, light receiver, light-receiving lens, and combined projection/reception mirror are installed in the apparatus while entirely housed inside a vertically elongated outer cover.
Japanese Unexamined Patent Publication No. 2005-221336
A LiDAR device is generally installed so as to protrude from an object such as a vehicle, with the entire mechanism including the optical system covered by a cover body for protection from the surrounding environment. In particular, when laser light is scanned over a wide range of the surrounding environment, the LiDAR device may be made to protrude substantially from the installation target in order to improve the accuracy of laser light emission and reception of the reflected wave.
The protruding structure of a LiDAR device limits installation conditions such as placement and, being visually conspicuous, can significantly impair the appearance of the overall apparatus. For example, in a LiDAR device using infrared-wavelength laser light, when the cover body includes a window that transmits only infrared-wavelength light, it is not always easy to harmonize such a window, with its characteristic hue, aesthetically with surrounding members.
Further, when a LiDAR device is mounted on a moving body such as a vehicle, the protruding structure increases the possibility of collision with surrounding objects and causes increased noise and air resistance due to the generation of turbulence. In addition, dirt easily adheres to a protruding LiDAR device, and a cleaning mechanism for removing such dirt also tends to become complicated in order to adapt to the protruding structure.
The present disclosure therefore provides a LiDAR device, a LiDAR system, a distance measurement method, and a program that are advantageous for suppressing the degree of protrusion from the installation target.
One aspect of the present disclosure relates to a LiDAR device including: a light emitting unit that emits laser light; a reflector including a reflecting surface that has the shape of a part of an ellipse and reflects the laser light, and an exit passage portion through which the laser light reflected by the reflecting surface passes; and an optical path adjusting unit that adjusts the traveling direction of the laser light so that the laser light is reflected by the reflecting surface after passing through the position of one focal point of the ellipse.

The reflecting surface may have the shape of a part of the outer peripheral surface of a spheroid.

The reflecting surface may have the shape of a part of the side surface of an elliptical column.

The optical path adjusting unit may have a MEMS mirror.

The optical path adjusting unit may be arranged at the position of the one focal point.

The reflector may have an entrance passage portion through which the laser light can pass, and the optical path adjusting unit may adjust the traveling direction of the laser light outside the reflector so that the laser light passes through the entrance passage portion, enters the inside of the reflector, and is then reflected by the reflecting surface.

The exit passage portion may be arranged at the position of the other focal point of the ellipse.

The LiDAR device may include a cover body having an exit window portion through which the laser light reflected by the reflecting surface passes, and the exit window portion may have a diameter smaller than the minor axis of the ellipse.

The exit window portion may be arranged at the position of the other focal point of the ellipse.

The cover body may have a window support portion that supports the exit window portion, and the outer surface of the exit window portion and the outer surface of the window support portion may be connected to each other without a step.

The LiDAR device may include an optical element on which the laser light is incident.

Another aspect of the present disclosure relates to a LiDAR system including: a LiDAR device having a light emitting unit that emits laser light, a reflector including a reflecting surface that has the shape of a part of an ellipse and reflects the laser light and an exit passage portion through which the laser light reflected by the reflecting surface passes, and an optical path adjusting unit that adjusts the traveling direction of the laser light so that the laser light is reflected by the reflecting surface after passing through the position of one focal point of the ellipse; and a LiDAR control unit that controls at least the optical path adjusting unit.

The LiDAR system may include an environment image acquisition unit that acquires a captured image of the surrounding environment, and the LiDAR control unit may identify a region of interest in the surrounding environment based on analysis of the captured image and control the optical path adjusting unit so that the laser light emitted from the LiDAR device travels toward the region of interest in the surrounding environment.

Another aspect of the present disclosure relates to a distance measurement method including: emitting laser light from a light emitting unit of a LiDAR device; and adjusting, by an optical path adjusting unit, the traveling direction of the laser light so that the laser light passes through the position of one focal point of an ellipse, is then reflected by a reflecting surface of a reflector having the shape of a part of the ellipse, and then passes through an exit passage portion of the reflector.

The method may include identifying a region of interest in the surrounding environment based on analysis of a captured image of the surrounding environment, and the traveling direction of the laser light may be adjusted by the optical path adjusting unit so that the laser light travels toward the region of interest in the surrounding environment.
FIG. 1 is a plan view showing a schematic configuration of an example of a LiDAR device.
FIG. 2 is a perspective view of the reflector shown in FIG. 1.
FIG. 3 is a side view showing an arrangement example of the light emitting unit and the optical path adjusting unit; the reflecting surface is hatched.
FIG. 4 is a diagram showing an example of a reflector having a spheroidal shape; the dotted line indicates the position corresponding to the minor axis of the ellipse, and the reflecting surface is hatched.
FIG. 5 is a plan view showing a schematic configuration of a modification of the LiDAR device.
FIG. 6 is a plan view showing a schematic configuration of another modification of the LiDAR device; the cover body is shown in cross section.
FIG. 7 is a functional block diagram showing an example of a control configuration of a LiDAR system.
FIG. 8 is a flowchart showing an example of a distance measurement method.
FIG. 9 is a flowchart showing an example in which the distance measurement method shown in FIG. 8 is applied to a vehicle driving technique (in particular, a technique for avoiding contact between a vehicle and an obstacle).
[LiDAR device]
 First, a typical example of the LiDAR device 11 will be described.
 FIG. 1 is a plan view showing a schematic configuration of an example of the LiDAR device 11. FIG. 2 is a perspective view of the reflector 22 shown in FIG. 1; the reflecting surface 32 is hatched.
 The LiDAR device 11 shown in FIG. 1 includes a light emitting unit 21 including a light source of the laser beam L, a reflector 22 having a reflecting surface 32, and an optical path adjusting unit 23 that adjusts the traveling direction of the laser beam L.
 The light emitting unit 21 emits the laser beam L. The light emitting unit 21 shown in FIG. 1 is provided inside the reflector 22 so as to face the reflecting surface 32, and emits the laser beam L toward the optical path adjusting unit 23. The traveling direction of the laser beam L emitted horizontally from the light emitting unit 21 is variably changed by the optical path adjusting unit 23, and the beam travels horizontally toward various points on the reflecting surface 32.
 The position of the light emitting unit 21 is not limited to the example shown in FIG. 1. For example, the light emitting unit 21 may be provided outside the reflector 22 at a position not facing the reflecting surface 32, and, as shown in FIG. 3, the light emitting unit 21 and the optical path adjusting unit 23 may be provided at different positions in the height direction. In the example shown in FIG. 3, the traveling direction of the laser beam L emitted upward (that is, directly above) from the light emitting unit 21 is changed by 90 degrees by the optical path adjusting unit 23. Also in this example, the traveling direction of the laser beam L is variably changed by the optical path adjusting unit 23, and the laser beam L travels horizontally toward various points on the reflecting surface 32.
 As shown in FIGS. 1 and 2, the reflector 22 has a reflecting surface 32 that reflects the laser beam L traveling via the optical path adjusting unit 23, and an exit passage portion 33 through which the laser beam L reflected by the reflecting surface 32 passes.
 The reflecting surface 32 of this embodiment has the shape of a part of an ellipse. That is, the shape formed when the reflecting surface 32 is cut perpendicular to its height direction is a partial shape of an ellipse. On the major axis A1 of this ellipse lie the first focal point F1 (that is, one focal point) and the second focal point F2 (that is, the other focal point). The second focal point F2 is closer to the exit passage portion 33 than the first focal point F1 is.
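As a numerical aside to the geometry above (not part of the disclosure): for an ellipse with semi-major axis a and semi-minor axis b, the two focal points lie on the major axis at distance c = sqrt(a² − b²) from the center. A minimal sketch in Python:

```python
import math

def focal_points(a, b):
    """Return the two focal points (on the major axis) of the ellipse
    x^2/a^2 + y^2/b^2 = 1, with semi-major axis a >= semi-minor axis b."""
    c = math.sqrt(a * a - b * b)  # center-to-focus distance
    return (-c, 0.0), (c, 0.0)

# Example: an ellipse with a = 5 and b = 3 has its foci at x = -4 and x = +4.
f1, f2 = focal_points(5.0, 3.0)
print(f1, f2)  # (-4.0, 0.0) (4.0, 0.0)
```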
 The reflecting surface 32 shown in FIGS. 1 and 2 has the shape of a part of the lateral surface of an elliptic cylinder, like a rectangular flat plate curved into an elliptical arc. The lateral surface of an elliptic cylinder forms an ellipse when cut in a direction perpendicular to the height of the cylinder.
 The shape of the reflecting surface 32 is not limited to the examples shown in FIGS. 1 and 2. For example, as shown in FIG. 4, the reflecting surface 32 may have the shape of a part of the outer peripheral surface of a spheroid. A spheroid is a solid of revolution obtained by rotating an ellipse about its major or minor axis, and has a three-dimensional ellipsoidal shape.
 The exit passage portion 33 can be provided at any position through which the laser beam L reflected by the reflecting surface 32 passes. The exit passage portion 33 shown in FIG. 1 is arranged at the position of the second focal point F2 of the ellipse formed by the reflecting surface 32.
 The optical path adjusting unit 23 adjusts the traveling direction of the laser beam L so that the laser beam L passes through the position of the first focal point F1 of the ellipse formed by the reflecting surface 32 and is then reflected by the reflecting surface 32.
 The optical path adjusting unit 23 can be installed at various positions on the optical path of the laser beam L from the light emitting unit 21. The optical path adjusting unit 23 shown in FIG. 1 is arranged at the position of the first focal point F1 of the ellipse formed by the reflecting surface 32, and is provided inside the reflector 22 so as to face the reflecting surface 32.
 The optical path adjusting unit 23 of this example has a MEMS mirror (that is, a MEMS-type mirror system) and can variably reflect the laser beam L in various directions. In general, a MEMS mirror is compact, can be driven at a low voltage, and can swing its mirror quickly and with high positional accuracy while keeping the inertial force acting on the mirror small. The specific configuration of the MEMS mirror is not limited. For example, the swing drive of the mirror may be performed via a fixed rotation axis or via a floating rotation axis.
 Note that the configuration and installation position of the optical path adjusting unit 23 are not limited to the above example. The optical path adjusting unit 23 may have a laser beam scanning mechanism other than a MEMS mirror and may be realized by, for example, a galvanometer (Galvano-type) mirror system. The optical path adjusting unit 23 may also be provided outside the reflector 22 so as not to face the reflecting surface 32.
 FIG. 5 is a plan view showing a schematic configuration of a modification of the LiDAR device 11. The reflector 22 of the LiDAR device 11 shown in FIG. 5 has, in addition to the reflecting surface 32 and the exit passage portion 33 described above, an entrance passage portion 34 that is provided at the position of the first focal point F1 and through which the laser beam L can pass. The light emitting unit 21 and the optical path adjusting unit 23 are provided outside the reflector 22. The light emitting unit 21 is fixed as a whole, but the emission position and emission direction of the laser beam L are variable. The optical path adjusting unit 23 is fixed and can be configured by, for example, a microprism or a diffraction grating.
 In the example shown in FIG. 5, the laser beam L, emitted from the light emitting unit 21 outside the reflector 22 and steered by the optical path adjusting unit 23, passes through the entrance passage portion 34 into the inside of the reflector 22, is then reflected by the reflecting surface 32, and exits through the exit passage portion 33. The optical path adjusting unit 23 basically directs almost all of the laser beam L toward the entrance passage portion 34 regardless of the incident position of the laser beam L from the light emitting unit 21. The specific shape and size of the entrance passage portion 34 are not limited. From the viewpoint of suppressing the entry of unwanted light into the reflector 22, the shape and size (for example, the aperture diameter) of the entrance passage portion 34 are preferably determined to match the desired beam shape and desired beam diameter of the laser beam L passing through it. In the example shown in FIG. 5, since the traveling direction of the laser beam L is adjusted by the combination of the light emitting unit 21 (in particular, its emission mode) and the optical path adjusting unit 23, the light emitting unit 21 also functionally forms a part of the optical path adjusting unit 23.
 The LiDAR device 11 may further include an optical element, such as a lens, on which the laser beam L emitted from the light emitting unit 21 is incident. The installation position of such an optical element is not limited, and one or more types of optical elements can be installed as appropriate on the optical path of the laser beam L. For example, an optical element may be provided at a position where the laser beam L is incident after being reflected by the reflecting surface 32, or at a position where it is incident before being reflected by the reflecting surface 32.
 For example, as shown in FIG. 1, the optical element 26 may be arranged at the position of the second focal point F2 of the ellipse formed by the reflecting surface 32. In this case, a single optical element 26 can optically correct all of the laser beam L reflected by the reflecting surface 32. An optical element may also be installed between the light emitting unit 21 and the optical path adjusting unit 23, so that the laser beam L enters the optical path adjusting unit 23 with its optical characteristics already adjusted. The specific configuration and function of the optical element are not limited. The optical element 26 can, for example, change the beam diameter of the laser beam L, collimate the laser beam L, adjust its traveling direction, or adjust (for example, widen) its traveling range (that is, its angular range).
 The LiDAR device 11 may include a cover body that covers the exit passage portion 33 of the reflector 22. The cover body 25 shown in FIG. 6 has an exit window portion 38 through which the laser beam L reflected by the reflecting surface 32 passes, and a window support portion 39 that supports the exit window portion 38. The outer surface of the exit window portion 38 (that is, the surface exposed to the outside) and the outer surface of the window support portion 39 are connected to each other without a step and lie on the same plane. The exit window portion 38 shown in FIG. 6 is arranged at the position of the second focal point F2 and has a diameter (the vertical length in FIG. 6) smaller than the minor axis of the ellipse formed by the reflecting surface 32. The exit window portion 38 can be installed at any position on the optical path of the laser beam L after reflection by the reflecting surface 32, and may be provided at a position other than the second focal point F2. For example, the optical element 26 may be installed at the position of the second focal point F2 (see FIG. 1) while the exit window portion 38 is arranged downstream of the second focal point F2 with respect to the traveling direction of the laser beam L.
 The LiDAR device 11 further includes a light receiving unit (see FIG. 7, described later) that receives the reflected light of the laser beam L from the surrounding environment (that is, the laser beam L scattered by the surrounding environment). By analyzing the reception result of the reflected light at the light receiving unit, information on the surrounding environment can be obtained. For example, information on whether a target such as an object or a person is present in the vicinity, on the direction and distance to such a target, and on the nature of the target can be derived from analysis of the reception result of the reflected light. The light receiving unit may be provided integrally with the light emitting unit 21 or separately from it; its installation position and mode are not limited.
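The distance derivation mentioned above can be illustrated with a standard pulsed time-of-flight calculation (the function name and pulse timing below are illustrative assumptions, not values taken from the disclosure):

```python
C = 299_792_458.0  # speed of light in vacuum [m/s]

def tof_distance(delta_t_s):
    """Distance to a target from the round-trip time of a laser pulse.
    The factor 1/2 accounts for the out-and-back path of the light."""
    return C * delta_t_s / 2.0

# A pulse returning after 200 ns corresponds to a target roughly 30 m away.
d = tof_distance(200e-9)
print(round(d, 2))  # 29.98
```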
 Next, the operation and effects of the LiDAR device 11 described above will be explained. The distance measurement method described below is performed by driving the LiDAR device 11 under the control of a control device (see FIG. 7, described later).
 The distance measurement method performed by the LiDAR device 11 includes a step of emitting the laser beam L from the light emitting unit 21, a step of adjusting the traveling direction of the laser beam L with the optical path adjusting unit 23, and a step of receiving the reflected light with the light receiving unit. In particular, in the traveling-direction adjustment step, the traveling direction of the laser beam L is adjusted by the optical path adjusting unit 23 so that the laser beam L passes through the first focal point F1, is then reflected by the reflecting surface 32, and then passes through the exit passage portion 33.
 Since the laser beam L emitted from the light emitting unit 21 passes through the first focal point F1 and is then reflected by the elliptical reflecting surface 32, it always passes through the second focal point F2 regardless of the reflection position on the reflecting surface 32. The second focal point F2 can therefore be regarded as a virtual scanning center of the laser beam L, and the laser beam L can be considered to be emitted from the second focal point F2 in every direction within the scanning range.
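The focal property relied on here (any ray launched from F1 and reflected off the ellipse passes through F2) can be checked numerically. The following sketch is an illustration only; the axis lengths and launch angles are arbitrary assumptions:

```python
import math

def f2_miss_distance(a, b, theta):
    """Fire a ray from focus F1 = (-c, 0) of the ellipse x^2/a^2 + y^2/b^2 = 1
    at angle theta, reflect it specularly at the ellipse wall, and return how
    far the reflected ray passes from the other focus F2 = (c, 0).
    By the focal property of the ellipse this is ~0 for every theta."""
    c = math.sqrt(a * a - b * b)
    dx, dy = math.cos(theta), math.sin(theta)
    # Ray-ellipse intersection: substituting the ray into the ellipse
    # equation gives a quadratic A*t^2 + B*t + C = 0 in the ray parameter t.
    A = dx * dx / (a * a) + dy * dy / (b * b)
    B = -2.0 * c * dx / (a * a)
    C = c * c / (a * a) - 1.0
    t = (-B + math.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)  # forward root, t > 0
    px, py = -c + t * dx, t * dy          # reflection point on the ellipse
    # The surface normal is the gradient of the implicit ellipse equation.
    nx, ny = px / (a * a), py / (b * b)
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm
    dot = dx * nx + dy * ny
    rx, ry = dx - 2.0 * dot * nx, dy - 2.0 * dot * ny  # reflected unit direction
    # Perpendicular distance from F2 to the reflected ray (2D cross product).
    return abs((c - px) * ry - (0.0 - py) * rx)

# Every launch angle yields a reflected ray through F2, the virtual scan center.
for deg in (10, 45, 90, 135, 170):
    assert f2_miss_distance(5.0, 3.0, math.radians(deg)) < 1e-9
print("all rays pass through F2")
```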
 Thus, according to the LiDAR device 11 described above, laser beam scanning can be performed substantially about the second focal point F2 without installing equipment such as the optical path adjusting unit 23 at the second focal point F2. The device configuration at the second focal point F2 (that is, the effective laser scanning center) can therefore be simplified; for example, no protruding body such as a wide-coverage cover enclosing the optical path adjusting unit 23 and other equipment needs to be placed at the second focal point F2.
 Therefore, when the LiDAR device 11 described above is installed on a target such as a vehicle, the protrusion of the LiDAR device 11 from the installation target can be suppressed by designing the externally exposed portion of the LiDAR device 11 with reference to the position of the second focal point F2.
 Further, according to the LiDAR device 11 described above, laser beam scanning can be performed while the laser beam L exits from a very limited range at the second focal point F2. Therefore, when the exit window portion 38 (see FIG. 6) through which the laser beam L passes is installed at or near the second focal point F2, wide-range laser scanning is possible while the exit window portion 38 is kept very small. The size of the exit window portion 38 can thus be determined according to the scanning range of the laser beam L. Even when the exit window portion 38 has a distinctive tint, making it sufficiently small can keep it inconspicuous from the outside.
 Many LiDAR devices currently on the market employ a rotary scanner for wide-angle coverage and use infrared light. In addition, to keep the rotation mechanism of the scanner from being exposed to the outside, and to remove visible light that becomes noise in infrared detection, the rotation mechanism and other equipment are often covered with a black cylindrical cover that transmits infrared light but blocks visible light. When such a LiDAR device is installed on a target such as a vehicle, the original design of the installation target is considerably impaired by the LiDAR device.
 By contrast, according to the LiDAR device 11 described above, suppressing the protrusion of the device from the installation target and keeping the externally exposed portion small improve the freedom of installation, allowing the LiDAR device 11 to be installed in various forms. In particular, the influence on the design of the installation target, such as a vehicle, can be kept small, and the LiDAR device 11 can be installed appropriately while harmonizing with the surrounding design at a high level. In general, when the externally exposed portion of the LiDAR device 11 is provided as part of a design, keeping the area of the exposed portion below 50% of the area of the entire design effectively suppresses its conspicuousness. The LiDAR device 11 described above can amply satisfy this sub-50% condition.
 Suppressing the protrusion of the LiDAR device 11 also improves safety and reduces noise and air resistance. Furthermore, the adhesion of dirt to the externally exposed portion of the LiDAR device 11 can be suppressed, and dirt that does adhere can be removed by a simple cleaning mechanism. For example, the externally exposed portion of the LiDAR device 11 can be cleaned appropriately by a cleaning mechanism combined with a water-droplet removal mechanism such as the high-speed rotating windows used on ships.
 Further, according to the LiDAR device 11 described above, the optical path adjusting unit 23 can be arranged in the inner space of the reflector 22 while being protected by the reflector 22. In particular, by configuring the optical path adjusting unit 23 as a small, lightweight MEMS mirror, it can be installed easily and appropriately even when the inner space of the reflector 22 is small. Moreover, a small, lightweight MEMS mirror enables higher-speed scanning of the laser beam L than a rotating-housing LiDAR device or a low-speed rotating LiDAR device with an integrated scanner head.
[LiDAR system]
 Next, a LiDAR system using the LiDAR device 11 described above will be explained.
 In the following, as an example, a case where the LiDAR device 11 is mounted on a vehicle will be described. However, the installation target of the LiDAR device 11 is not limited to vehicles, and the following description applies equally when the LiDAR device 11 is mounted on other targets (including moving bodies and stationary bodies).
 FIG. 7 is a functional block diagram showing an example of the control configuration of the LiDAR system 10.
 The LiDAR system 10 shown in FIG. 7 includes the LiDAR device 11, a control device 15, and a photographing device 50. In this embodiment, the control device 15 and the photographing device 50 are mounted on the target on which the LiDAR device 11 is installed (in this example, a vehicle (not shown)), but they need not be mounted on that target. For example, the control device 15 and the photographing device 50 may be provided at a position away from the target on which the LiDAR device 11 is installed, and may exchange data with other equipment via wireless signals.
 The photographing device 50 is an environment image acquisition unit that acquires a captured image D1 (that is, image data) of the surrounding environment, and includes a solid-state image sensor such as a CCD image sensor or a CMOS image sensor. The photographing device 50 performs photographing under the control of the control device 15 (particularly the photographing control unit 16) and transmits the captured image D1 to the control device 15 (particularly the photographing control unit 16). The photographing device 50 may have any configuration; it may be, for example, a monocular camera or a stereo camera.
 The control device 15 has a LiDAR control unit 12 and a photographing control unit 16.
 The LiDAR control unit 12 controls the LiDAR device 11 (at least the optical path adjusting unit 23). The LiDAR control unit 12 shown in FIG. 7 controls the light emitting unit 21 and the optical path adjusting unit 23. The light emitting unit 21 emits the laser beam L based on a control signal D2 sent from the LiDAR control unit 12. The optical path adjusting unit 23 adjusts the traveling direction of the laser beam L based on the control signal D2 sent from the LiDAR control unit 12.
 The LiDAR control unit 12 receives, from the light receiving unit 24 of the LiDAR device 11, a reception result D3 of the reflected light of the laser beam L from the surrounding environment. The LiDAR control unit 12 then analyzes the reception result D3 of the light receiving unit 24 and acquires information on the state of the surrounding environment.
 As described later, the LiDAR control unit 12 of this embodiment also functions as a region-of-interest identification unit that identifies a region of interest in the surrounding environment based on analysis of the captured image D1.
 The photographing control unit 16 controls the photographing device 50 and stores the captured image D1 sent from the photographing device 50 in a storage unit (not shown). The photographing control unit 16 shown in FIG. 7 also transmits the captured image D1 to the LiDAR control unit 12.
 The LiDAR system 10 having the above configuration can perform selective, priority-based detection according to the situation by combining a wide-range coarse scan and a fine scan of a region of interest. That is, the wide-range coarse scan performs fast, though relatively coarse-resolution, light detection and ranging over a wide area of the surrounding environment. Based on the result of this wide-range coarse scan, a region defined with reference to the positions of obstacle candidates (for example, objects and people) is identified as a region of interest (ROI). A fine scan is then performed only on the region of interest to identify detailed information on the obstacle candidates (for example, their specific type: person, animal, object, and so on). By performing the time-consuming fine scan only on the region of interest in this way, light detection and ranging over a wide area and acquisition of detailed information on obstacle candidates can both be performed quickly and with high accuracy.
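The coarse-scan → ROI → fine-scan flow described above can be sketched as follows; the point format, range threshold, angular margin, and step sizes are illustrative assumptions, not values from the disclosure:

```python
def select_rois(coarse_points, max_range_m=30.0, margin_deg=2.0):
    """From a coarse scan given as (azimuth_deg, distance_m) samples, return
    angular windows (ROIs) around returns closer than max_range_m.
    Each ROI is an (az_min, az_max) pair to be re-scanned at fine resolution."""
    rois = []
    for az, dist in coarse_points:
        if dist < max_range_m:
            rois.append((az - margin_deg, az + margin_deg))
    return rois

def fine_scan_plan(rois, step_deg=0.1):
    """Expand each ROI into the dense list of beam azimuths for the fine scan."""
    plan = []
    for az_min, az_max in rois:
        az = az_min
        while az <= az_max + 1e-9:
            plan.append(round(az, 3))
            az += step_deg
    return plan

# A coarse pass at 5-degree pitch finds one close return at azimuth 20 degrees:
coarse = [(0, 80.0), (5, 75.0), (10, 90.0), (15, 85.0), (20, 12.5), (25, 88.0)]
rois = select_rois(coarse)
print(rois)                        # [(18.0, 22.0)]
print(len(fine_scan_plan(rois)))   # 41 fine beams instead of a full dense sweep
```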
 FIG. 8 is a flowchart showing an example of the distance measurement method.
 First, under the control of the LiDAR control unit 12, the LiDAR device 11 performs a wide-range coarse scan (S1 in FIG. 8). That is, the LiDAR control unit 12 controls the optical path adjustment unit 23 so that the laser beam L emitted from the LiDAR device 11 travels toward the scan range of the wide-range coarse scan in the surrounding environment. The resulting light reception result D3 of the light receiving unit 24 is sent to the LiDAR control unit 12.
 The wide-range coarse scan performed here runs at high speed with a relatively coarse resolution; its main purposes are to search the scan range for obstacle candidates and to detect the position (for example, azimuth and distance) of each obstacle candidate. The wide-range coarse scan therefore does not detect specific detailed information about the obstacle candidates.
 The scan range of the wide-range coarse scan (for example, its azimuth and viewing angle) is not limited; it may be a predetermined fixed range or a range determined variably according to the situation. For example, while the vehicle is traveling, the LiDAR control unit 12 may adaptively determine the scan range of the wide-range coarse scan according to map information such as an LDM (Local Dynamic Map) and the vehicle speed. The LiDAR control unit 12 can acquire the map information and vehicle speed information in any manner; for example, it may acquire them from another control unit included in the control device 15 (for example, a speed monitoring unit) or from memory.
 The search granularity of the wide-range coarse scan (for example, its scan resolution) is likewise not limited; it may be fixed in advance, or the LiDAR control unit 12 may determine it variably according to the situation (for example, the search application).
 The wide-range coarse scan resembles the way a person takes in an overall view of the surroundings, but differs from human observation in that it requires high-accuracy detection of the position (for example, azimuth and distance) of each detection target (that is, each obstacle candidate).
 Meanwhile, under the control of the imaging control unit 16, the imaging device 50 captures a wide area of the surrounding environment to acquire the captured image D1 (S2).
 The imaging range of the imaging device 50 is not limited; it may be a predetermined fixed range or a range determined variably by the imaging control unit 16 or the LiDAR control unit 12 according to the situation. The imaging range may be larger than, smaller than, or the same as the scan range of the wide-range coarse scan, and may be determined adaptively according to the map information and vehicle speed. However, the imaging range of the imaging device 50 at least partially overlaps the scan range of the wide-range coarse scan.
 Acquisition of the captured image of the surrounding environment by the imaging device 50 may be performed after the wide-range coarse scan as shown in FIG. 8, but it can be performed at any timing. That is, it may be performed before the wide-range coarse scan or simultaneously with it.
 The LiDAR control unit 12 then identifies one or more regions of interest based on the result of the wide-range coarse scan (that is, the light reception result D3 of the light receiving unit 24) (S3). A region of interest is a limited region determined with reference to the position of a detection target (that is, an obstacle candidate) derived from the light reception result D3, and is smaller than both the scan range of the wide-range coarse scan and the imaging range of the imaging device 50.
 The LiDAR control unit 12 of the present embodiment identifies the region of interest based on a combination of the result of the wide-range coarse scan and the analysis result of the captured image D1. For example, the LiDAR control unit 12 may analyze the captured image D1 to acquire region information (for example, heat map information) indicating where a detection target (that is, an obstacle candidate) is likely to exist, and determine the region of interest by referring to that region information. In general, captured images acquired by a visible-light image sensor offer excellent azimuth resolution and depth resolution. By combining the region information obtained from analysis of the captured image D1, which has these characteristics, with the result of the wide-range coarse scan, the LiDAR control unit 12 can determine the region of interest with high accuracy.
 The specific method of deriving the heat map information and its information format (for example, the information elements included in the heat map information) are not limited. For example, the LiDAR control unit 12 can derive heat map information indicating where a detection target (that is, an obstacle candidate) is likely to exist within the imaging range by applying a semantic segmentation technique to the captured image D1.
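The fusion of the two sources described above can be illustrated with a toy example. Note that the heat map values, the per-cell representation, and the simple thresholding rule below are assumptions made for this sketch; the description deliberately leaves the derivation method and fusion rule open.

```python
# Hedged sketch of combining coarse-scan returns with image-derived heat map
# information to select regions of interest. All values are illustrative.

def select_rois(coarse_hits, heatmap, p_min=0.5):
    """Keep a coarse-scan hit as an ROI seed only if the image heat map also
    indicates a likely detection target at that cell."""
    return [pos for pos in coarse_hits if heatmap.get(pos, 0.0) >= p_min]

# The coarse scan found returns at three cells; a (hypothetical) semantic
# segmentation of the camera image rates only two of them as likely targets.
coarse_hits = [(2, 3), (5, 1), (7, 7)]
heatmap = {(2, 3): 0.9,   # pedestrian-like region in the image
           (5, 1): 0.1,   # e.g. debris blowing in the wind
           (7, 7): 0.6}

rois = select_rois(coarse_hits, heatmap)
```

The design intent is that neither sensor alone decides: the LiDAR supplies accurate range returns, while the camera's azimuth and depth cues suppress returns that are unlikely to be real detection targets.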
 The analysis of the captured image D1 and the acquisition of region information such as heat map information described above may be performed by a device element other than the LiDAR control unit 12 (for example, the imaging control unit 16). In that case, the region information indicating where a detection target (that is, an obstacle candidate) is likely to exist is sent from that device element to the LiDAR control unit 12.
 Then, under the control of the LiDAR control unit 12, the LiDAR device 11 performs a region-of-interest fine scan (S4). That is, the LiDAR control unit 12 controls the optical path adjustment unit 23 so that the laser beam L emitted from the LiDAR device 11 travels toward the region of interest in the surrounding environment. The resulting light reception result D3 of the light receiving unit 24 is sent to the LiDAR control unit 12.
 The region-of-interest fine scan (S4) is performed at a higher resolution than the wide-range coarse scan (S1) and provides scan data from which detailed information about obstacle candidates can be detected. In general, the higher the scan resolution, the longer the scan takes; however, because the scan range of the region-of-interest fine scan is limited to the region of interest, the fine scan can be completed in a short time.
 The scanning motion of the region-of-interest fine scan resembles the target-fixation motion of the human eye in so-called saccadic behavior (that is, rapid movement of the central visual field). In general, the human eye can search for visual information extremely quickly compared with the movement speed of the neck or body.
 In particular, when a MEMS mirror with high-speed response performance is used as the optical path adjustment unit 23, high-speed scanning that mimics the saccadic behavior of the human eye is possible. A typical scanner-type LiDAR device scans at a constant speed. In contrast, the above-described LiDAR device 11, which uses a MEMS mirror with an extremely small inertial load, can perform adaptive, variable-speed, high-speed scanning, realizing a region-of-interest fine scan that resembles the high-speed information-seeking behavior of the human eye.
 In this way, the region-of-interest fine scan process can perform a high-resolution scan of the selected region (that is, the region of interest), determined based on the wide-range coarse scan result and region information such as heat map information, on demand, actively, and in a short time. The search granularity of the region-of-interest fine scan (for example, its scan resolution) is not limited; it may be fixed in advance, or the LiDAR control unit 12 may determine it variably according to the situation (for example, the search application).
 The LiDAR control unit 12 then acquires peripheral information based on the result of the region-of-interest fine scan (that is, the light reception result D3 of the light receiving unit 24) (S5). That is, the LiDAR control unit 12 acquires specific detailed information about the obstacle candidate based on the fine scan result. For example, whether an obstacle candidate is classified as a person, an animal, a building, or another kind of object is determined based on the result of the region-of-interest fine scan.
 The LiDAR control unit 12 then acquires analysis information based on the peripheral information (S6). For example, the LiDAR control unit 12 may analyze the peripheral information (for example, information on the type of obstacle candidate) and acquire, as analysis information, assessments such as whether the obstacle candidate is likely to become an actual obstacle, or whether the vehicle is likely to come into contact with it.
 Note that the process of acquiring peripheral information (S5) and the process of acquiring analysis information (S6) are not necessarily clearly distinguishable from each other and may in practice be performed simultaneously. Likewise, the peripheral information and the analysis information themselves are not always clearly distinguished from each other.
[Specific examples of acquiring peripheral information and analysis information]
 Typical specific examples of acquiring peripheral information and analysis information are given below.
[Acquisition of peripheral object information]
 By performing the above-described distance measurement method (see FIG. 8) while the vehicle is traveling, the presence or absence of obstacle candidates can be detected by the wide-range coarse scan, and the specific type of each obstacle candidate can then be detected by the subsequent region-of-interest fine scan. The types of obstacle candidates that can be detected are not limited; for example, obstacle candidates may simply be classified as people, animals, vehicles, buildings, road objects, and so on, or more finely as adults, children, small animals, large animals, and so on.
 The series of processes shown in FIG. 8 may be repeated over time. This allows the LiDAR control unit 12 to acquire time-series search information showing how the state of an obstacle candidate changes over time. The LiDAR control unit 12 may acquire peripheral information and analysis information based on the time-series search information acquired in this way. For example, the LiDAR control unit 12 may derive a predicted trajectory of the obstacle candidate based on the time-series search information, and derive, based on that predicted trajectory, a path of travel that avoids a collision between the vehicle and the obstacle candidate. Information derived from the time-series search information in this way can be reported to the driver via a notification device (for example, a display or voice guide, not shown) or supplied as basic information for driving-assist operations.
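One simple way to derive a predicted trajectory from time-series search information, as described above, is constant-velocity extrapolation of the tracked positions. The sketch below assumes this model and a planned route given as a position list; the description itself does not fix any particular prediction model, so treat both as illustrative.

```python
# Illustrative sketch: predicted trajectory of a tracked obstacle candidate by
# constant-velocity extrapolation, checked against the vehicle's planned path.

def predict_positions(track, steps):
    """Extrapolate the last observed velocity of a tracked obstacle candidate."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = x1 - x0, y1 - y0
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, steps + 1)]

def crosses_path(predicted, planned, tol=1.0):
    """True if any predicted obstacle position comes within tol of the route
    at the same time step."""
    return any(abs(px - qx) <= tol and abs(py - qy) <= tol
               for (px, py), (qx, qy) in zip(predicted, planned))

track = [(0.0, 5.0), (1.0, 4.0), (2.0, 3.0)]    # obstacle drifting toward the lane
planned = [(3.0, 2.0), (4.0, 1.0), (5.0, 0.0)]  # vehicle's planned positions

predicted = predict_positions(track, 3)
collision_risk = crosses_path(predicted, planned)
```

A positive result here would be the trigger for deriving an alternative path of travel or for reporting via the notification device.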
[Determining whether an obstacle candidate is likely to be an actual obstacle]
 The LiDAR control unit 12 may determine whether a detected obstacle candidate is, from a practical standpoint, likely to actually become an obstacle.
 With the wide-range coarse scan described above, objects such as a sheet of paper fluttering in the air (that is, objects unlikely to obstruct the vehicle's travel) are also detected as obstacle candidates.
 The LiDAR control unit 12 may therefore apply a dedicated analysis method to the result of the region-of-interest fine scan to determine how likely the obstacle candidate is to become an actual obstacle. For example, based on the fine scan result, the LiDAR control unit 12 may derive information on whether the obstacle candidate is in contact with the road surface, the size of the obstacle candidate, and the behavior of the obstacle candidate over time, and determine from these results how likely the obstacle candidate is to actually become an obstacle.
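The kind of judgment just described can be sketched as a small rule combining the three cues named above (ground contact, size, temporal behavior). The cue names and thresholds below are assumptions for the example only; the description does not specify the analysis method.

```python
# Illustrative rule: filter out obstacle candidates (e.g. airborne paper) that
# are unlikely to obstruct the vehicle. Thresholds are hypothetical.

def is_actual_obstacle(grounded, size_m, speed_mps):
    # An airborne, small object (a fluttering sheet of paper): not a real obstacle.
    if not grounded and size_m < 0.5:
        return False
    # Anything grounded, substantial, or effectively stationary is treated
    # as a potential obstacle and kept for further evaluation.
    return grounded or size_m >= 0.5 or speed_mps < 0.1

paper = dict(grounded=False, size_m=0.3, speed_mps=4.0)
deer  = dict(grounded=True,  size_m=1.2, speed_mps=1.5)
```

In the system, `grounded`, `size_m`, and `speed_mps` would come from the fine scan result and the time-series search information rather than being given directly.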
[Determining the possibility of contact between the vehicle's planned travel route, projected into a bird's-eye projection space, and an obstacle candidate]
 The LiDAR control unit 12 may generate a bird's-eye overview distribution map based on the result of the wide-range coarse scan. The bird's-eye overview distribution map is map information showing the rough position (for example, azimuth and distance) of each obstacle candidate within a range that includes the vehicle's planned travel route and the area near that route. For the obstacle candidates reflected in the bird's-eye overview distribution map, it need not matter whether they are, from a practical standpoint, likely to become obstacles.
 The LiDAR control unit 12 may then perform the region-of-interest fine scan based on the bird's-eye overview distribution map and the analysis result of the captured image D1. Based on the fine scan result, the LiDAR control unit 12 may determine whether to classify an obstacle candidate as a target that needs to be tracked over time (that is, a tracking target). For example, obstacle candidates recognized by the fine scan as vehicles (including motorcycles), bicycles, pedestrians, or animals may be classified as tracking targets.
 The LiDAR control unit 12 may also perform a predicted-course evaluation of each tracking target, for example based on characteristics associated with the target's type. For instance, a tracking target that is a vehicle is likely to travel on the roadway, a person is likely to travel on the sidewalk, and an animal may travel on any part of the road (roadway or sidewalk).
 Note that the LiDAR control unit 12 may take into account not only the result of the region-of-interest fine scan but also the analysis result of the captured image D1 when determining whether an obstacle candidate should be classified as a tracking target and when evaluating the predicted course of a tracking target. In that case, more detailed classification and course prediction are possible. For example, from the analysis result of the captured image D1, the LiDAR control unit 12 can obtain information on the presence or absence of roads with and without curbs, and on the presence or absence of guardrails, along the planned travel route and in its vicinity.
 Next, an example in which the above-described distance measurement method is applied to vehicle driving technology (in particular, technology for avoiding contact between the vehicle and an obstacle) will be described.
 FIG. 9 is a flowchart showing an example in which the distance measurement method shown in FIG. 8 is applied to vehicle driving technology (in particular, technology for avoiding contact between the vehicle and an obstacle).
 The LiDAR control unit 12 determines the possibility of contact between the vehicle and an obstacle candidate based on the peripheral information and analysis information derived from the region-of-interest fine scan result (see S5 and S6 in FIG. 8) (S11 in FIG. 9). This contact possibility can be determined by comprehensively considering various information; for example, it may be determined based on the relative position (for example, distance and azimuth) between the vehicle and the obstacle candidate. When the LiDAR control unit 12 determines that the possibility of contact between the vehicle and the obstacle candidate is not high (N in S12), the contact-possibility determination continues to be repeated (S11).
 On the other hand, when the LiDAR control unit 12 determines that the possibility of contact between the vehicle and the obstacle candidate is high (Y in S12), the LiDAR control unit 12 determines, based on the peripheral information and analysis information, whether a contact avoidance maneuver by the vehicle is possible (S13). The contact avoidance maneuver is not limited, but may include changing the vehicle's course by a steering operation. When the LiDAR control unit 12 determines that a contact avoidance maneuver is possible (Y in S14), it causes the vehicle to execute the contact avoidance maneuver (S15).
 On the other hand, when the LiDAR control unit 12 determines that a contact avoidance maneuver is not possible (N in S14), it determines, based on the peripheral information and analysis information, whether a braking operation by the vehicle is possible (S16). When the LiDAR control unit 12 determines that a braking operation is possible (Y in S17), it causes the vehicle to execute the braking operation (S18).
 On the other hand, when the LiDAR control unit 12 determines that a braking operation is not possible (N in S17), it causes the vehicle to execute an emergency operation. The emergency operation is not limited; it may include, for example, bringing the vehicle to an emergency stop or notifying the driver that an emergency has occurred.
 The LiDAR control unit 12 then determines, based on the peripheral information and analysis information, whether the vehicle can continue traveling (S20). When the LiDAR control unit 12 determines that continued travel is possible (Y in S21), it again determines the possibility of contact between the vehicle and an obstacle candidate based on the peripheral information and analysis information (S11).
 On the other hand, when the LiDAR control unit 12 determines that continued travel is not possible (N in S21), it issues a command signal to stop the vehicle, and the vehicle is brought to a complete stop (S22).
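The decision cascade of FIG. 9 for one high-risk iteration (Y in S12) can be expressed as a short function. This is one possible reading of the flow: the flowchart text leaves the exact branch wiring after S15 and S18 implicit, and here the continued-travel check (S20/S21) is applied after whichever maneuver was chosen. The capability-query callables are stand-ins for judgments the LiDAR control unit 12 would make from the peripheral and analysis information.

```python
# Sketch of the FIG. 9 decision cascade for one iteration with high contact risk.
# Branch wiring after S15/S18 is an assumption; see the lead-in above.

def respond_to_contact_risk(can_avoid, can_brake, can_continue):
    """Return the action sequence chosen for one high-risk iteration (Y in S12)."""
    actions = []
    if can_avoid():                    # S13/S14: steering-based avoidance possible?
        actions.append("avoid")        # S15
    elif can_brake():                  # S16/S17: braking possible?
        actions.append("brake")        # S18
    else:
        actions.append("emergency")    # emergency stop / driver alert
    if can_continue():                 # S20/S21: continued travel possible?
        actions.append("resume_monitoring")   # back to S11
    else:
        actions.append("full_stop")           # S22
    return actions
```

For example, a situation where steering avoidance is feasible and travel can continue yields `["avoid", "resume_monitoring"]`, while a situation where nothing is feasible yields `["emergency", "full_stop"]`.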
 Other application examples of the above-described distance measurement method are illustrated below.
[Application example 1]
 When a vehicle travels on a road (for example, an expressway), the above-described distance measurement method can be applied to detecting the relative positional relationship between the vehicle and obstacle candidates located on the planned travel route (for example, other vehicles or objects on the road) as well as obstacle candidates near the planned travel route.
 For example, the scan range of the wide-range coarse scan can be determined with reference to the lane on the planned travel route (that is, the travel lane) and the lanes adjacent to it (that is, the adjacent lanes).
 In the region-of-interest fine scan process, scanning can be performed at a resolution sufficient to classify obstacle candidates as fallen objects, vehicles, motorcycles, and so on.
[Application example 2]
 The above-described distance measurement method can be performed when the vehicle travels on an urban road where buildings such as houses stand nearby and pedestrians may be present.
 For example, the scan range of the wide-range coarse scan can be determined with reference not only to the travel lane and adjacent lanes on the planned travel route but also to roads near the travel lane (including roadways and sidewalks) where people or animals are likely to walk.
 In the region-of-interest fine scan process, scanning can be performed at a resolution sufficient to classify obstacle candidates into many types (including interfering objects such as street stalls). Time-series search information can also be acquired at a resolution sufficient to track the dynamic behavior of pedestrians. In this way, detailed azimuth information and vehicle trajectory information for avoiding contact can be acquired within the region of interest.
[Application example 3]
 The above-described distance measurement method can be performed when the vehicle travels through an area where accidents with wild animals are frequent (for example, a country road at night) and map information such as an LDM has been acquired in advance.
 For example, the scan range of the wide-range coarse scan can be determined with reference not only to the travel lane and adjacent lanes on the planned travel route but also to roads near the travel lane (including roadways and sidewalks) where people or animals are likely to walk.
 In the region-of-interest fine scan process, a three-dimensional space defined with reference to the position of an obstacle candidate identified from the wide-range coarse scan result can be used as the scan range.
[Application example 4]
 The above-described distance measurement method can be performed when parking the vehicle in a garage in which unspecified objects are placed.
 For example, from the result of the wide-range coarse scan, the LiDAR control unit 12 can recognize the approximate shape of the garage space and the presence and position of any protruding objects in the garage.
 In the region-of-interest fine scan process, scanning can be performed at a resolution high enough to detect objects in the garage that might be contacted. In the case of a home garage, for example, the time constraints are relatively loose, so the region-of-interest fine scan can be performed over a longer time at a very high resolution.
 As described above, because the LiDAR device 11 uses the reflecting surface 32, which has the shape of part of an ellipse, it is advantageous for limiting the degree to which the device protrudes from the installation target, such as a vehicle, and it can even be installed so as not to protrude from the installation target at all.
 By using a reflecting surface 32 having the shape of part of the outer surface of a spheroid (see FIG. 2), the reflecting surface 32 is easy to form, and the space it occupies can be kept small.
 By using a reflecting surface 32 having the shape of part of the side surface of an elliptic cylinder (see FIG. 4), the reflecting surface 32 can, in principle, appropriately reflect the laser beam L traveling in any direction and direct it toward the second focal point F2.
 Because the optical path adjustment unit 23 has a MEMS mirror, the LiDAR device 11 can perform adaptive, variable-speed, high-speed scanning.
 By arranging the optical path adjustment unit 23 at the first focal point F1, the laser beam L can appropriately pass through the first focal point F1.
 As shown in FIG. 5, by having the optical path adjustment unit 23 adjust the traveling direction of the laser beam L outside the reflector 22 so that the laser beam L enters the inside of the reflector 22 through the entrance passage portion 34, the reflector 22 can be made more compact.
 また出口通過部33を第2焦点F2の位置に配置することによって、実質的なレーザー光走査中心となる第2焦点F2を基準とした装置設計が容易である。 Further, by arranging the exit passage portion 33 at the position of the second focal point F2, device design referenced to the second focal point F2, which serves as the effective center of laser beam scanning, becomes easy.
 また反射面32で反射後のレーザー光Lが通過する出射窓部38(図6参照)が、反射面32が成す楕円の短軸よりも小さい径を有することで、出射窓部38を小さく目立たないように構成することができる。 Further, since the exit window portion 38 (see FIG. 6) through which the laser beam L passes after reflection at the reflective surface 32 has a diameter smaller than the minor axis of the ellipse formed by the reflective surface 32, the exit window portion 38 can be made small and inconspicuous.
 また出射窓部38を第2焦点F2の位置に配置することで、出射窓部38の大きさを最小化することが可能である。 Further, by arranging the exit window portion 38 at the position of the second focal point F2, it is possible to minimize the size of the exit window portion 38.
 また出射窓部38の外面及び窓支持部39の外面を段差無くお互いに接続することで、出射窓部38及び窓支持部39の接続部は、突出構造を持つことなく、滑らかな外面を有することができる。 Further, by connecting the outer surface of the exit window portion 38 and the outer surface of the window support portion 39 to each other without a step, the joint between the exit window portion 38 and the window support portion 39 can have a smooth outer surface with no protruding structure.
 またLiDAR装置11が必要に応じた光学素子26を具備することで、所望の光学特性を持ったレーザー光LをLiDAR装置11から出射させることが可能である。 Further, by providing the LiDAR device 11 with an optical element 26 as required, it is possible to emit the laser beam L having desired optical characteristics from the LiDAR device 11.
 また楕円の一部の形状を持つ反射面32を具備するLiDAR装置11に対してLiDAR制御部12を組み合わせることにより、設置対象からのLiDARシステム10の突出の程度を抑えつつ、周辺環境の広範囲にわたるレーザー光走査を行うことができる。同様に、楕円の一部の形状を持つ反射面32で反射させたレーザー光Lを使った測距方法を実施することで、設置対象からのLiDARシステム10の突出の程度を抑えつつ、周辺環境の広範囲にわたるレーザー光走査を行うことができる。 Further, by combining the LiDAR control unit 12 with the LiDAR device 11 having the reflective surface 32 shaped as a part of an ellipse, laser beam scanning over a wide range of the surrounding environment can be performed while suppressing the degree of protrusion of the LiDAR system 10 from the installation target. Similarly, by carrying out a distance measurement method that uses the laser beam L reflected by the reflective surface 32 shaped as a part of an ellipse, laser beam scanning over a wide range of the surrounding environment can likewise be performed while suppressing the degree of protrusion of the LiDAR system 10 from the installation target.
 また図8に示すように、広範囲コアーススキャン及び関心領域ファインスキャンを組み合わせることによって、広域範囲に対する光検出及び測距の実施と、障害候補の詳細情報の取得とを、短時間且つ高精度に行うことができる。同様に、広範囲コアーススキャン及び関心領域ファインスキャンが行われる測距方法によって、広域範囲に対する光検出及び測距の実施と、障害候補の詳細情報の取得とを、短時間且つ高精度に行うことができる。特に、撮影画像D1の解析に基づいて周辺環境における関心領域を特定することで、関心領域の特定を精度良く行うことができる。 Further, as shown in FIG. 8, by combining a wide-range coarse scan with a region-of-interest fine scan, light detection and distance measurement over a wide range and acquisition of detailed information on obstacle candidates can both be performed in a short time and with high accuracy. Similarly, a distance measurement method in which a wide-range coarse scan and a region-of-interest fine scan are performed enables light detection and distance measurement over a wide range and acquisition of detailed information on obstacle candidates in a short time and with high accuracy. In particular, by identifying the region of interest in the surrounding environment based on analysis of the captured image D1, the region of interest can be identified with high accuracy.
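The combination of a wide-range coarse scan with a region-of-interest fine scan can be pictured as a simple scan-point scheduler. The following Python sketch is illustrative only: the angular ranges, step sizes, and the `grid` helper are hypothetical assumptions, not values from the disclosure.

```python
def grid(h_range, v_range, step):
    """Regular grid of (azimuth, elevation) scan angles, in degrees."""
    pts = []
    v = v_range[0]
    while v <= v_range[1] + 1e-9:
        h = h_range[0]
        while h <= h_range[1] + 1e-9:
            pts.append((round(h, 6), round(v, 6)))
            h += step
        v += step
    return pts

# Wide-range coarse scan: the whole (hypothetical) field of view at a
# large angular step.
coarse = grid((-60.0, 60.0), (-10.0, 10.0), step=5.0)

# Region-of-interest fine scan: only a small window (e.g. around an
# obstacle candidate found in the captured image) at a fine step.
fine = grid((10.0, 20.0), (0.0, 5.0), step=0.5)

# The combined plan is what would be handed to the optical path
# adjusting unit (MEMS mirror) as the next sequence of beam directions.
scan_plan = coarse + fine
```

With these hypothetical numbers the combined plan needs only 125 + 231 = 356 beam directions, whereas scanning the entire field at the fine 0.5° step would need 241 × 41 ≈ 9,900 — which is why the coarse-plus-fine combination can cover a wide range and still resolve obstacle candidates quickly.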
 本明細書で開示されている実施形態及び変形例はすべての点で例示に過ぎず限定的には解釈されないことに留意されるべきである。上述の実施形態及び変形例は、添付の特許請求の範囲及びその趣旨を逸脱することなく、様々な形態での省略、置換及び変更が可能である。例えば上述の実施形態及び変形例が組み合わされてもよく、また上述以外の実施形態が上述の実施形態又は変形例と組み合わされてもよい。また、本明細書に記載された本開示の効果は例示に過ぎず、その他の効果があってもよい。 It should be noted that the embodiments and modifications disclosed herein are merely illustrative in all respects and are not to be construed as limiting. The above-described embodiments and modifications may be omitted, replaced, or modified in various forms without departing from the scope and spirit of the appended claims. For example, the above-described embodiments and modifications may be combined with each other, and embodiments other than those described above may be combined with the above-described embodiments or modifications. In addition, the effects of the present disclosure described herein are merely illustrative, and other effects may be obtained.
 また上述の技術的思想を具現化する技術的カテゴリーは限定されない。例えば上述の測距方法に含まれる1又は複数の手順(ステップ)をコンピュータに実行させるためのコンピュータプログラムによって、上述の技術的思想が具現化されてもよい。またそのようなコンピュータプログラムが記録されたコンピュータが読み取り可能な非一時的(non-transitory)な記録媒体によって、上述の技術的思想が具現化されてもよい。 Also, the technical categories that embody the above-mentioned technical ideas are not limited. For example, the above-mentioned technical idea may be embodied by a computer program for causing a computer to execute one or a plurality of procedures (steps) included in the above-mentioned distance measuring method. Further, the above-mentioned technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
 なお、本開示は以下のような構成を取ることもできる。
[項目1]
 レーザー光を発する発光部と、
 楕円の一部の形状を有し前記レーザー光を反射する反射面と、前記反射面で反射された前記レーザー光が通過する出口通過部と、を含む反射体と、
 前記レーザー光が前記楕円の一方の焦点の位置を通過した後に前記反射面で反射されるように、前記レーザー光の進行方向を調整する光路調整部と、
 を備えるLiDAR装置。
[項目2]
 前記反射面は、回転楕円体の外周面の一部の形状を有する項目1に記載のLiDAR装置。
[項目3]
 前記反射面は、楕円柱の側面の一部の形状を有する項目1又は2に記載のLiDAR装置。
[項目4]
 前記光路調整部は、MEMSミラーを有する項目1~3のいずれかに記載のLiDAR装置。
[項目5]
 前記光路調整部は、前記一方の焦点の位置に配置される項目1~4のいずれかに記載のLiDAR装置。
[項目6]
 前記反射体は、前記レーザー光が通過可能な入口通過部を有し、
 前記光路調整部は、前記反射体の外側で前記レーザー光の進行方向を調整し、前記レーザー光が前記入口通過部を通過して前記反射体の内側に入射した後に前記反射面で反射される項目1~5のいずれかに記載のLiDAR装置。
[項目7]
 前記出口通過部は、前記楕円の他方の焦点の位置に配置される項目1~6のいずれかに記載のLiDAR装置。
[項目8]
 前記反射面で反射後の前記レーザー光が通過する出射窓部を有するカバー体を備え、
 前記出射窓部は、前記楕円の短軸よりも小さい径を有する項目1~7のいずれかに記載のLiDAR装置。
[項目9]
 前記出射窓部は、前記楕円の他方の焦点の位置に配置される項目8に記載のLiDAR装置。
[項目10]
 前記カバー体は、前記出射窓部を支持する窓支持部を有し、
 前記出射窓部の外面及び前記窓支持部の外面は、段差無く、お互いに接続されている項目8又は9に記載のLiDAR装置。
[項目11]
 前記レーザー光が入射する光学素子を備える項目1~10のいずれかに記載のLiDAR装置。
[項目12]
 レーザー光を発する発光部と、楕円の一部の形状を有し前記レーザー光を反射する反射面と前記反射面で反射された前記レーザー光が通過する出口通過部とを含む反射体と、前記レーザー光が前記楕円の一方の焦点の位置を通過した後に前記反射面で反射されるように前記レーザー光の進行方向を調整する光路調整部と、を具備するLiDAR装置と、
 少なくとも前記光路調整部を制御するLiDAR制御部と、を備えるLiDARシステム。
[項目13]
 周辺環境の撮影画像を取得する環境画像取得部を備え、
 前記LiDAR制御部は、
 前記撮影画像の解析に基づいて周辺環境における関心領域を特定し、
 前記LiDAR装置から出射される前記レーザー光を、周辺環境のうち前記関心領域に向けて進行させるように、前記光路調整部を制御する項目12に記載のLiDARシステム。
[項目14]
 LiDAR装置の発光部からレーザー光を発する工程と、
 前記レーザー光が、楕円の一方の焦点の位置を通過し、その後前記楕円の一部の形状を有する反射体の反射面で反射され、その後前記反射体の出口通過部を通過するように、光路調整部によって前記レーザー光の進行方向を調整する工程と、
 を含む測距方法。
[項目15]
 周辺環境の撮影画像の解析に基づいて周辺環境における関心領域を特定する工程を含み、
 周辺環境のうち前記関心領域に向けて前記レーザー光を進行させるように、前記光路調整部によって前記レーザー光の進行方向が調整される項目14に記載の測距方法。
[項目16]
 LiDAR装置の発光部からレーザー光を発するステップと、
 前記レーザー光が、楕円の一方の焦点の位置を通過し、その後前記楕円の一部の形状を有する反射体の反射面で反射され、その後前記反射体の出口通過部を通過するように、光路調整部によって前記レーザー光の進行方向を調整するステップと、
 をコンピュータに実行させるためのプログラム。
[項目17]
 周辺環境の撮影画像の解析に基づいて周辺環境における関心領域を特定するステップをコンピュータに実行させ、
 周辺環境のうち前記関心領域に向けて前記レーザー光を進行させるように、前記光路調整部によって前記レーザー光の進行方向が調整される項目16に記載のプログラム。
The present disclosure may also adopt the following configurations.
[Item 1]
A LiDAR device comprising:
a light emitting unit that emits a laser beam;
a reflector including a reflective surface that has the shape of a part of an ellipse and reflects the laser beam, and an exit passage portion through which the laser beam reflected by the reflective surface passes; and
an optical path adjusting unit that adjusts the traveling direction of the laser beam so that the laser beam is reflected by the reflective surface after passing through the position of one focal point of the ellipse.
[Item 2]
The LiDAR device according to item 1, wherein the reflective surface has the shape of a part of the outer peripheral surface of a spheroid.
[Item 3]
The LiDAR device according to item 1 or 2, wherein the reflective surface has the shape of a part of the side surface of an elliptic cylinder.
[Item 4]
The LiDAR device according to any one of items 1 to 3, wherein the optical path adjusting unit has a MEMS mirror.
[Item 5]
The LiDAR device according to any one of items 1 to 4, wherein the optical path adjusting unit is arranged at the position of one of the focal points.
[Item 6]
The LiDAR device according to any one of items 1 to 5, wherein the reflector has an inlet passage portion through which the laser beam can pass, and the optical path adjusting unit adjusts the traveling direction of the laser beam outside the reflector so that the laser beam passes through the inlet passage portion, enters the inside of the reflector, and is then reflected by the reflective surface.
[Item 7]
The LiDAR device according to any one of items 1 to 6, wherein the exit passage portion is arranged at the position of the other focal point of the ellipse.
[Item 8]
The LiDAR device according to any one of items 1 to 7, further comprising a cover body having an exit window portion through which the laser beam passes after being reflected by the reflective surface, wherein the exit window portion has a diameter smaller than the minor axis of the ellipse.
[Item 9]
The LiDAR device according to item 8, wherein the exit window portion is arranged at the position of the other focal point of the ellipse.
[Item 10]
The LiDAR device according to item 8 or 9, wherein the cover body has a window support portion that supports the exit window portion, and the outer surface of the exit window portion and the outer surface of the window support portion are connected to each other without a step.
[Item 11]
The LiDAR device according to any one of items 1 to 10, further comprising an optical element on which the laser beam is incident.
[Item 12]
A LiDAR system comprising:
a LiDAR device including a light emitting unit that emits a laser beam, a reflector including a reflective surface that has the shape of a part of an ellipse and reflects the laser beam and an exit passage portion through which the laser beam reflected by the reflective surface passes, and an optical path adjusting unit that adjusts the traveling direction of the laser beam so that the laser beam is reflected by the reflective surface after passing through the position of one focal point of the ellipse; and
a LiDAR control unit that controls at least the optical path adjusting unit.
[Item 13]
The LiDAR system according to item 12, further comprising an environment image acquisition unit that acquires a captured image of the surrounding environment, wherein the LiDAR control unit identifies a region of interest in the surrounding environment based on analysis of the captured image, and controls the optical path adjusting unit so that the laser beam emitted from the LiDAR device travels toward the region of interest in the surrounding environment.
[Item 14]
A distance measurement method comprising:
emitting a laser beam from a light emitting unit of a LiDAR device; and
adjusting, with an optical path adjusting unit, the traveling direction of the laser beam so that the laser beam passes through the position of one focal point of an ellipse, is then reflected by a reflective surface of a reflector having the shape of a part of the ellipse, and then passes through an exit passage portion of the reflector.
[Item 15]
The distance measurement method according to item 14, comprising identifying a region of interest in the surrounding environment based on analysis of a captured image of the surrounding environment, wherein the traveling direction of the laser beam is adjusted by the optical path adjusting unit so that the laser beam travels toward the region of interest in the surrounding environment.
[Item 16]
A program for causing a computer to execute:
a step of emitting a laser beam from a light emitting unit of a LiDAR device; and
a step of adjusting, with an optical path adjusting unit, the traveling direction of the laser beam so that the laser beam passes through the position of one focal point of an ellipse, is then reflected by a reflective surface of a reflector having the shape of a part of the ellipse, and then passes through an exit passage portion of the reflector.
[Item 17]
The program according to item 16, causing the computer to further execute a step of identifying a region of interest in the surrounding environment based on analysis of a captured image of the surrounding environment, wherein the traveling direction of the laser beam is adjusted by the optical path adjusting unit so that the laser beam travels toward the region of interest in the surrounding environment.
10 LiDARシステム
11 LiDAR装置
12 LiDAR制御部
21 発光部
22 反射体
23 光路調整部
26 光学素子
32 反射面
33 出口通過部
34 入口通過部
38 出射窓部
39 窓支持部
50 撮影装置
F1 第1焦点
F2 第2焦点
L レーザー光
10 LiDAR system 11 LiDAR device 12 LiDAR control unit 21 Light emitting unit 22 Reflector 23 Optical path adjustment unit 26 Optical element 32 Reflective surface 33 Exit passage 34 Entrance passage 38 Exit window 39 Window support 50 Imaging device F1 First focus F2 Second focus L laser light

Claims (17)

  1.  レーザー光を発する発光部と、
     楕円の一部の形状を有し前記レーザー光を反射する反射面と、前記反射面で反射された前記レーザー光が通過する出口通過部と、を含む反射体と、
     前記レーザー光が前記楕円の一方の焦点の位置を通過した後に前記反射面で反射されるように、前記レーザー光の進行方向を調整する光路調整部と、
     を備えるLiDAR装置。
    A LiDAR device comprising:
    a light emitting unit that emits a laser beam;
    a reflector including a reflective surface that has the shape of a part of an ellipse and reflects the laser beam, and an exit passage portion through which the laser beam reflected by the reflective surface passes; and
    an optical path adjusting unit that adjusts the traveling direction of the laser beam so that the laser beam is reflected by the reflective surface after passing through the position of one focal point of the ellipse.
  2.  前記反射面は、回転楕円体の外周面の一部の形状を有する請求項1に記載のLiDAR装置。 The LiDAR device according to claim 1, wherein the reflective surface has a shape of a part of the outer peripheral surface of a spheroid.
  3.  前記反射面は、楕円柱の側面の一部の形状を有する請求項1に記載のLiDAR装置。 The LiDAR device according to claim 1, wherein the reflective surface has the shape of a part of the side surface of an elliptic cylinder.
  4.  前記光路調整部は、MEMSミラーを有する請求項1に記載のLiDAR装置。 The LiDAR device according to claim 1, wherein the optical path adjusting unit has a MEMS mirror.
  5.  前記光路調整部は、前記一方の焦点の位置に配置される請求項1に記載のLiDAR装置。 The LiDAR device according to claim 1, wherein the optical path adjusting unit is arranged at the position of one of the focal points.
  6.  前記反射体は、前記レーザー光が通過可能な入口通過部を有し、
     前記光路調整部は、前記反射体の外側で前記レーザー光の進行方向を調整し、前記レーザー光が前記入口通過部を通過して前記反射体の内側に入射した後に前記反射面で反射される請求項1に記載のLiDAR装置。
    The LiDAR device according to claim 1, wherein the reflector has an inlet passage portion through which the laser beam can pass, and the optical path adjusting unit adjusts the traveling direction of the laser beam outside the reflector so that the laser beam passes through the inlet passage portion, enters the inside of the reflector, and is then reflected by the reflective surface.
  7.  前記出口通過部は、前記楕円の他方の焦点の位置に配置される請求項1に記載のLiDAR装置。 The LiDAR device according to claim 1, wherein the exit passage portion is arranged at the position of the other focal point of the ellipse.
  8.  前記反射面で反射後の前記レーザー光が通過する出射窓部を有するカバー体を備え、
     前記出射窓部は、前記楕円の短軸よりも小さい径を有する請求項1に記載のLiDAR装置。
    The LiDAR device according to claim 1, further comprising a cover body having an exit window portion through which the laser beam passes after being reflected by the reflective surface, wherein the exit window portion has a diameter smaller than the minor axis of the ellipse.
  9.  前記出射窓部は、前記楕円の他方の焦点の位置に配置される請求項8に記載のLiDAR装置。 The LiDAR device according to claim 8, wherein the exit window portion is arranged at the position of the other focal point of the ellipse.
  10.  前記カバー体は、前記出射窓部を支持する窓支持部を有し、
     前記出射窓部の外面及び前記窓支持部の外面は、段差無く、お互いに接続されている請求項8に記載のLiDAR装置。
    The LiDAR device according to claim 8, wherein the cover body has a window support portion that supports the exit window portion, and the outer surface of the exit window portion and the outer surface of the window support portion are connected to each other without a step.
  11.  前記レーザー光が入射する光学素子を備える請求項1に記載のLiDAR装置。 The LiDAR device according to claim 1, further comprising an optical element to which the laser beam is incident.
  12.  レーザー光を発する発光部と、楕円の一部の形状を有し前記レーザー光を反射する反射面と前記反射面で反射された前記レーザー光が通過する出口通過部とを含む反射体と、前記レーザー光が前記楕円の一方の焦点の位置を通過した後に前記反射面で反射されるように前記レーザー光の進行方向を調整する光路調整部と、を具備するLiDAR装置と、
     少なくとも前記光路調整部を制御するLiDAR制御部と、を備えるLiDARシステム。
    A LiDAR system comprising:
    a LiDAR device including a light emitting unit that emits a laser beam, a reflector including a reflective surface that has the shape of a part of an ellipse and reflects the laser beam and an exit passage portion through which the laser beam reflected by the reflective surface passes, and an optical path adjusting unit that adjusts the traveling direction of the laser beam so that the laser beam is reflected by the reflective surface after passing through the position of one focal point of the ellipse; and
    a LiDAR control unit that controls at least the optical path adjusting unit.
  13.  周辺環境の撮影画像を取得する環境画像取得部を備え、
     前記LiDAR制御部は、
     前記撮影画像の解析に基づいて周辺環境における関心領域を特定し、
     前記LiDAR装置から出射される前記レーザー光を、周辺環境のうち前記関心領域に向けて進行させるように、前記光路調整部を制御する請求項12に記載のLiDARシステム。
    The LiDAR system according to claim 12, further comprising an environment image acquisition unit that acquires a captured image of the surrounding environment, wherein the LiDAR control unit identifies a region of interest in the surrounding environment based on analysis of the captured image, and controls the optical path adjusting unit so that the laser beam emitted from the LiDAR device travels toward the region of interest in the surrounding environment.
  14.  LiDAR装置の発光部からレーザー光を発する工程と、
     前記レーザー光が、楕円の一方の焦点の位置を通過し、その後前記楕円の一部の形状を有する反射体の反射面で反射され、その後前記反射体の出口通過部を通過するように、光路調整部によって前記レーザー光の進行方向を調整する工程と、
     を含む測距方法。
    A distance measurement method comprising:
    emitting a laser beam from a light emitting unit of a LiDAR device; and
    adjusting, with an optical path adjusting unit, the traveling direction of the laser beam so that the laser beam passes through the position of one focal point of an ellipse, is then reflected by a reflective surface of a reflector having the shape of a part of the ellipse, and then passes through an exit passage portion of the reflector.
  15.  周辺環境の撮影画像の解析に基づいて周辺環境における関心領域を特定する工程を含み、
     周辺環境のうち前記関心領域に向けて前記レーザー光を進行させるように、前記光路調整部によって前記レーザー光の進行方向が調整される請求項14に記載の測距方法。
    The distance measurement method according to claim 14, comprising identifying a region of interest in the surrounding environment based on analysis of a captured image of the surrounding environment, wherein the traveling direction of the laser beam is adjusted by the optical path adjusting unit so that the laser beam travels toward the region of interest in the surrounding environment.
  16.  LiDAR装置の発光部からレーザー光を発するステップと、
     前記レーザー光が、楕円の一方の焦点の位置を通過し、その後前記楕円の一部の形状を有する反射体の反射面で反射され、その後前記反射体の出口通過部を通過するように、光路調整部によって前記レーザー光の進行方向を調整するステップと、
     をコンピュータに実行させるためのプログラム。
    A program for causing a computer to execute:
    a step of emitting a laser beam from a light emitting unit of a LiDAR device; and
    a step of adjusting, with an optical path adjusting unit, the traveling direction of the laser beam so that the laser beam passes through the position of one focal point of an ellipse, is then reflected by a reflective surface of a reflector having the shape of a part of the ellipse, and then passes through an exit passage portion of the reflector.
  17.  周辺環境の撮影画像の解析に基づいて周辺環境における関心領域を特定するステップをコンピュータに実行させ、
     周辺環境のうち前記関心領域に向けて前記レーザー光を進行させるように、前記光路調整部によって前記レーザー光の進行方向が調整される請求項16に記載のプログラム。
    The program according to claim 16, causing the computer to further execute a step of identifying a region of interest in the surrounding environment based on analysis of a captured image of the surrounding environment, wherein the traveling direction of the laser beam is adjusted by the optical path adjusting unit so that the laser beam travels toward the region of interest in the surrounding environment.
PCT/JP2021/030160 2020-09-07 2021-08-18 Lidar device, lidar system, distance measurement method, and program WO2022050047A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-150000 2020-09-07
JP2020150000A JP2023165039A (en) 2020-09-07 2020-09-07 Lidar device, lidar system, distance measuring method, and program

Publications (1)

Publication Number Publication Date
WO2022050047A1 true WO2022050047A1 (en) 2022-03-10

Family

ID=80490772

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/030160 WO2022050047A1 (en) 2020-09-07 2021-08-18 Lidar device, lidar system, distance measurement method, and program

Country Status (2)

Country Link
JP (1) JP2023165039A (en)
WO (1) WO2022050047A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03223712A (en) * 1990-01-30 1991-10-02 Citizen Watch Co Ltd Laser light scanner
JPH08262437A (en) * 1995-03-22 1996-10-11 Mitsubishi Electric Corp Lighting device
JPH1152286A (en) * 1997-08-05 1999-02-26 Kawaguchi Kogaku Sangyo:Kk Ring illuminator
JP2000066104A (en) * 1998-08-25 2000-03-03 Kawasaki Heavy Ind Ltd Optical system for compensating beam
WO2011013627A1 (en) * 2009-07-27 2011-02-03 Suzuki Chihiro Optical unit
JP2012252068A (en) * 2011-06-01 2012-12-20 Nippon Signal Co Ltd:The Optical scanner
WO2019010425A1 (en) * 2017-07-07 2019-01-10 Aeye, Inc. Ladar transmitter with reimager


Also Published As

Publication number Publication date
JP2023165039A (en) 2023-11-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21864105

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21864105

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP