WO2023079944A1 - Control device, control method, and control program - Google Patents

Control device, control method, and control program Download PDF

Info

Publication number
WO2023079944A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
intensity
main
receiving
sub
Prior art date
Application number
PCT/JP2022/038730
Other languages
English (en)
Japanese (ja)
Inventor
謙一 柳井 (Kenichi Yanai)
Original Assignee
株式会社デンソー (DENSO Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO Corporation)
Publication of WO2023079944A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers

Definitions

  • the present disclosure relates to a technique for controlling an optical sensor that receives reflected light with respect to irradiation light emitted to a detection area of a vehicle.
  • In the technology disclosed in Patent Document 1, two types of light-receiving pixels with different sensitivities coexist so that the optical sensor can receive the reflected light from the target with respect to the irradiation light. As a result, even when the reflection intensity from the target is high, erroneous detection of the reflection point distance due to saturation, in which the received light intensity reaches the upper limit intensity, can be suppressed in the light-receiving pixels on the low-sensitivity side.
  • An object of the present disclosure is to provide a control device that enhances the dynamic range of optical sensors. Another object of the present disclosure is to provide a control method that enhances the dynamic range of optical sensors. Yet another object of the present disclosure is to provide a control program that enhances the dynamic range of optical sensors.
  • A first aspect of the present disclosure is a control device that comprises a processor and controls an optical sensor that receives, with a plurality of light-receiving pixels, reflected light from a target with respect to irradiation light irradiated onto a detection area of a vehicle.
  • The processor is configured to: acquire, for each light-receiving pixel, as the main light-receiving intensity, the received intensity of the reflected light received from the detection area by the optical sensor with respect to irradiation light of a main irradiation intensity emitted from the optical sensor; acquire, for each light-receiving pixel, as the sub light-receiving intensity, the received intensity of the reflected light received from the detection area by the optical sensor with respect to irradiation light of a sub irradiation intensity, lower than the main irradiation intensity, emitted from the optical sensor; and complement, among the main light-receiving intensities acquired for the light-receiving pixels, the main light-receiving intensity of a saturated pixel that has reached the upper limit intensity, based on the sub light-receiving intensity of that saturated pixel.
  • A second aspect of the present disclosure is a control method executed by a processor for controlling an optical sensor in which a plurality of light-receiving pixels receive reflected light from a target with respect to irradiation light irradiated onto a detection area of a vehicle, the control method comprising: acquiring, for each light-receiving pixel, as the main light-receiving intensity, the received intensity of the reflected light received from the detection area by the optical sensor with respect to irradiation light of a main irradiation intensity emitted from the optical sensor; acquiring, for each light-receiving pixel, as the sub light-receiving intensity, the received intensity of the reflected light received from the detection area by the optical sensor with respect to irradiation light of a sub irradiation intensity, lower than the main irradiation intensity, emitted from the optical sensor; and complementing, among the main light-receiving intensities acquired for the light-receiving pixels, the main light-receiving intensity of a saturated pixel that has reached the upper limit intensity, based on the sub light-receiving intensity of that saturated pixel.
  • A third aspect of the present disclosure is a control program stored in a storage medium and comprising instructions to be executed by a processor in order to control an optical sensor that receives, with a plurality of light-receiving pixels, light reflected from a target with respect to irradiation light irradiated onto a detection area of a vehicle. The instructions comprise: acquiring, for each light-receiving pixel, as the main light-receiving intensity, the received intensity of the reflected light received from the detection area by the optical sensor with respect to irradiation light of a main irradiation intensity emitted from the optical sensor; acquiring, for each light-receiving pixel, as the sub light-receiving intensity, the received intensity of the reflected light received from the detection area by the optical sensor with respect to irradiation light of a sub irradiation intensity, lower than the main irradiation intensity, emitted from the optical sensor; and complementing, among the main light-receiving intensities acquired for the light-receiving pixels, the main light-receiving intensity of a saturated pixel that has reached the upper limit intensity, based on the sub light-receiving intensity of that saturated pixel.
  • According to the first to third aspects, the main light-receiving intensity of the reflected light received from the detection area by the optical sensor is acquired for each light-receiving pixel with respect to the irradiation light of the main irradiation intensity emitted from the optical sensor. In addition, the sub light-receiving intensity of the reflected light received from the detection area by the optical sensor with respect to the irradiation light of the lower sub irradiation intensity is also acquired for each light-receiving pixel.
  • Among the main light-receiving intensities acquired for the light-receiving pixels, the main light-receiving intensity of a saturated pixel that has reached the upper limit intensity can be complemented, because the sub light-receiving intensity of that pixel, obtained under the low sub irradiation intensity, does not reach the upper limit intensity; the difference from the true value exceeding the upper limit intensity is thus made up based on the sub light-receiving intensity. Therefore, by detecting the main light-receiving intensity complemented in this way as the reflection intensity from the target, the dynamic range of the optical sensor can be increased.
  • FIG. 1 is a schematic diagram showing the overall configuration of a sensing system according to one embodiment.
  • FIG. 2 is a schematic diagram showing the detailed configuration of an optical sensor according to one embodiment.
  • FIG. 3 is a block diagram showing the functional configuration of a control device according to one embodiment.
  • FIG. 4 is a schematic diagram illustrating a light projector according to one embodiment.
  • FIG. 5 is a schematic diagram showing characteristics of a laser diode according to one embodiment.
  • FIG. 6 is a time chart showing detection frames according to one embodiment.
  • FIG. 7 is a schematic diagram showing a light receiver according to one embodiment.
  • FIG. 8 is a schematic diagram showing main image data according to one embodiment.
  • FIG. 9 is a schematic diagram showing sub-image data according to one embodiment.
  • FIG. 10 is a schematic diagram showing background image data according to one embodiment.
  • FIG. 11 is a graph for explaining complementation according to one embodiment.
  • FIG. 12 is a flowchart showing the control flow according to one embodiment.
  • one embodiment of the present disclosure relates to a sensing system 2 comprising an optical sensor 10 and a control device 1.
  • the sensing system 2 is mounted on the vehicle 5 .
  • the vehicle 5 is a moving object such as an automobile that can travel on a road while a passenger is on board.
  • The vehicle 5 is capable of steady or temporary automated driving in an automatic driving control mode.
  • The automatic driving control mode may be realized by autonomous driving control, such as conditional driving automation, advanced driving automation, or full driving automation, in which the system, when activated, performs all driving tasks.
  • The automatic driving control mode may be realized by advanced driving assistance control, such as driving assistance or partial driving automation, in which the occupant performs some or all of the driving tasks.
  • The automatic driving control mode may be realized by either one of, a combination of, or switching between the autonomous driving control and the advanced driving assistance control.
  • the front, rear, up, down, left, and right directions are defined with respect to the vehicle 5 on the horizontal plane.
  • The horizontal direction means a direction parallel to the horizontal plane serving as the directional reference of the vehicle 5.
  • The vertical direction means the direction perpendicular to the horizontal plane serving as the directional reference of the vehicle 5.
  • the optical sensor 10 is a so-called LiDAR (Light Detection and Ranging/Laser Imaging Detection and Ranging) for acquiring image data that can be used for driving control of the vehicle 5 including the automatic driving mode.
  • the optical sensor 10 is arranged in at least one portion of the vehicle 5, for example, the front portion, the left and right side portions, the rear portion, the upper roof, or the like.
  • a three-dimensional orthogonal coordinate system is defined by the X-axis direction, the Y-axis direction, and the Z-axis direction as three mutually orthogonal axial directions.
  • the X-axis and Z-axis are set along different horizontal directions of the vehicle 5
  • the Y-axis is set along the vertical direction of the vehicle 5 .
  • In FIG. 2, the left side of the dashed line along the Y-axis direction (the side of the optical window 12 described later) shows a cross-section that is actually in a plane perpendicular to that of the right side of the dashed line (the side of the units 21 and 41 described later).
  • the optical sensor 10 emits light toward the detection area Ad corresponding to the location and viewing angle in the external space of the vehicle 5 .
  • the optical sensor 10 receives reflected light that is incident when the irradiated light is reflected from the detection area Ad.
  • the reflected light received at this time also includes background light (that is, external light) incident from the detection area Ad.
  • the optical sensor 10 can receive only such background light when light irradiation is stopped.
  • the optical sensor 10 senses a target Tr that reflects light within the detection area Ad.
  • sensing in this embodiment means detecting the reflection intensity from the target Tr and the reflection point distance from the optical sensor 10 to the target Tr.
  • A representative sensing target of the optical sensor 10 applied to the vehicle 5 may be at least one of moving objects such as pedestrians, cyclists, non-human animals, and other vehicles.
  • A representative sensing target of the optical sensor 10 applied to the vehicle 5 may also be at least one of stationary objects such as guardrails, road signs, roadside structures, and fallen objects on the road.
  • the optical sensor 10 includes a housing 11, a light projecting unit 21, a scanning unit 31, and a light receiving unit 41.
  • the housing 11 is formed in a box shape and has a light shielding property.
  • the housing 11 accommodates the light projecting unit 21, the scanning unit 31, and the light receiving unit 41 inside.
  • the housing 11 has a translucent optical window 12 .
  • the light projecting unit 21 includes a light projector 22 and a light projecting lens system 26 .
  • the projector 22 is constructed by arranging a plurality of laser diodes 24 in an array on a substrate.
  • the laser diodes 24 of this embodiment are arranged in a single row along the Y-axis direction.
  • Each laser diode 24 has a resonator structure capable of resonating light oscillated in the PN junction layer, and a mirror layer structure capable of repeatedly reflecting light across the PN junction layer.
  • Each laser diode 24 emits light corresponding to the application of current in accordance with the control signal from control device 1 .
  • each laser diode 24 of the present embodiment emits light in the near-infrared region that is difficult for humans existing in the external space including the detection area Ad of the vehicle 5 to visually recognize.
  • each laser diode 24 emits pulsed light by entering an oscillating LD (Laser Diode) mode when a current higher than the switching current value Cth is applied.
  • The light emitted from each laser diode 24 by such pulse emission in the LD mode constitutes irradiation light in the near-infrared region having the main irradiation intensity Iim, as shown in FIG. 5.
  • Each laser diode 24 enters a non-oscillating LED (Light Emitting Diode) mode when a current lower than the switching current value Cth is applied, and thereby emits light continuously (DC: Direct Current).
  • The light emitted from each laser diode 24 by such continuous emission in the LED mode constitutes irradiation light in the near-infrared region having a sub irradiation intensity Iis lower than the main irradiation intensity Iim, as shown in FIG. 5.
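  • To make the two emission modes concrete, the following is a minimal sketch of the drive logic described above, in Python. The class name, the threshold current Cth, and the intensity values are illustrative assumptions, not values taken from the disclosure.

```python
# A minimal sketch of the dual-mode drive of each laser diode 24:
# above the switching current value Cth the diode oscillates (LD mode,
# pulsed emission at the main irradiation intensity Iim); below Cth it
# emits continuously without oscillating (LED mode, sub irradiation
# intensity Iis). All numeric values are assumed for illustration.
from dataclasses import dataclass


@dataclass
class LaserDiodeModel:
    cth: float = 1.0    # switching current value Cth [A] (assumed)
    iim: float = 100.0  # main irradiation intensity Iim (arbitrary units)
    iis: float = 5.0    # sub irradiation intensity Iis (arbitrary units)

    def emit(self, current: float) -> tuple[str, float]:
        """Return (mode, irradiation intensity) for a given drive current."""
        if current > self.cth:
            return "LD", self.iim   # oscillating: pulse emission at Iim
        if current > 0.0:
            return "LED", self.iis  # non-oscillating: continuous emission at Iis
        return "stop", 0.0          # no current applied: irradiation stopped
```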
  • the light projector 22 has a light projection window 25 formed on one side of the substrate, which is quasi-defined by a rectangular contour long in the Y-axis direction and short in the X-axis direction.
  • the projection window 25 is configured as a collection of projection apertures in each laser diode 24 .
  • The light emitted from the projection aperture of each laser diode 24 is projected from the light projection window 25 into the detection area Ad as irradiation light arranged in a pseudo line along the Y-axis direction.
  • Here, the irradiation light may include non-light-emitting portions corresponding to the arrangement intervals of the laser diodes 24 in the Y-axis direction.
  • Even so, in the detection area Ad the irradiation light forms a line shape in which the non-light-emitting portions are macroscopically eliminated by diffraction.
  • the projection lens system 26 projects the irradiation light from the light projector 22 toward the scanning mirror 32 of the scanning unit 31 .
  • the light projecting lens system 26 exhibits at least one type of optical action among, for example, condensing, collimating, and shaping.
  • Projection lens system 26 forms a projection optical axis along the Z-axis.
  • the projection lens system 26 has at least one projection lens 27 having a lens shape corresponding to the optical action to be exerted on the projection optical axis.
  • the light projector 22 is positioned on the light projection optical axis of the light projection lens system 26 .
  • the irradiation light emitted from the light projection window 25 in the light projector 22 is guided along the light projection optical axis of the light projection lens system 26 .
  • the scanning unit 31 has a scanning mirror 32 and a scanning motor 35 .
  • The scanning mirror 32 is formed in a plate shape, with a reflective film vapor-deposited on one surface of a substrate to form the reflecting surface 33.
  • the scanning mirror 32 is supported by the housing 11 so as to be rotatable around the rotation centerline along the Y-axis direction.
  • the scanning mirror 32 oscillates within a drive section limited by a mechanical or electrical stopper.
  • the scanning mirror 32 is commonly provided for the light projecting unit 21 and the light receiving unit 41 .
  • The scanning mirror 32 reflects, on the reflecting surface 33, the irradiation light incident from the light projecting lens system 26 of the light projecting unit 21 and projects it onto the detection area Ad through the optical window 12, thereby scanning the detection area Ad according to the rotation angle.
  • The mechanical scanning of the detection area Ad by the scanning mirror 32 is substantially limited to scanning in the horizontal direction. Simultaneously with such scanning, the scanning mirror 32 also reflects, on the reflecting surface 33, the reflected light incident from the detection area Ad through the optical window 12 toward the light receiving unit 41 according to the rotation angle.
  • the velocities of the irradiation light and the reflected light are sufficiently high with respect to the rotational motion speed of the scanning mirror 32 .
  • the reflected light with respect to the irradiation light is guided to the light receiving unit 41 side so as to go in the opposite direction to the irradiation light at the scanning mirror 32 having substantially the same rotation angle as that of the irradiation light.
  • the scanning motor 35 is, for example, a voice coil motor, a brushed DC motor, a stepping motor, or the like.
  • the scanning motor 35 rotationally drives (that is, swings) the scanning mirror 32 within a finite drive section according to a control signal from the control device 1 . At this time, the rotation angle of the scanning mirror 32 is sequentially changed in synchronization with the light emission timing of each laser diode 24 .
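  • As a sketch of this synchronization, one scanning pass can be modeled as stepping the mirror angle and firing a pulse at each step. The angular range, the number of lines, and the callback interfaces are hypothetical, introduced only for illustration.

```python
import numpy as np


def scan_one_frame(fire_pulse, set_angle, n_lines: int = 64,
                   angle_min: float = -60.0, angle_max: float = 60.0):
    """Drive the mirror through its drive section, firing one pulse per
    scanning line in sync with the rotation angle (all values assumed)."""
    for angle in np.linspace(angle_min, angle_max, n_lines):
        set_angle(angle)   # command the scanning motor 35 (hypothetical API)
        fire_pulse()       # trigger LD-mode pulse emission (hypothetical API)


# Usage with dummy callbacks, e.g.:
# scan_one_frame(fire_pulse=lambda: None, set_angle=lambda a: None)
```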
  • the light receiving unit 41 is displaced from the light projecting unit 21 in the Y-axis direction.
  • the light receiving unit 41 of this embodiment is positioned below the light projecting unit 21 in the Y-axis direction.
  • the light receiving unit 41 includes a light receiving lens system 42 and a light receiver 45 .
  • the light-receiving lens system 42 exerts an optical action so as to form an image of the incident light on the light receiver 45 .
  • the light-receiving lens system 42 forms a light-receiving optical axis along the Z-axis.
  • the light-receiving lens system 42 has at least one light-receiving lens 43 on the light-receiving optical axis, which has a lens shape corresponding to the optical action to be exerted.
  • the light incident on the scanning mirror 32 from the reflecting surface 33 is guided along the light receiving optical axis of the light receiving lens system 42 within the driving section of the scanning mirror 32 .
  • the light receiver 45 is positioned on the light receiving optical axis of the light receiving lens system 42 . As shown in FIG. 7, the light receiver 45 is constructed by arranging a plurality of light receiving pixels 46 in an array on the substrate. In particular, the light-receiving pixels 46 of this embodiment are arranged in a two-dimensional array along the Y-axis direction and the X-axis direction. Each light-receiving pixel 46 is further composed of a plurality of light-receiving elements. A light-receiving element of each light-receiving pixel 46 is mainly formed of a photodiode such as a single photon avalanche diode (SPAD).
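  • A light-receiving pixel built from multiple SPAD elements typically yields an intensity proportional to the number of avalanche events counted in a sampling window, which naturally imposes an upper limit. The following toy model illustrates that idea; the pixel size and counting behavior are assumptions, not details from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(42)


def pixel_intensity(photon_rate: float, n_spads: int = 16) -> int:
    """Toy model of one light-receiving pixel 46: each SPAD fires at most
    once per sampling window, so the summed count (the pixel's received
    intensity) saturates at n_spads, i.e. the upper limit intensity."""
    fired = rng.poisson(photon_rate, n_spads) > 0
    return int(fired.sum())
```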
  • the light receiver 45 has a light-receiving surface 47 defined by a rectangular outline that is long in the Y-axis direction and short in the X-axis direction on one side of the substrate.
  • the light-receiving surface 47 is configured as a collection of incident surfaces of the respective light-receiving pixels 46 .
  • Each light-receiving pixel 46 receives reflected light incident on the light-receiving surface 47 from the detection area Ad through the light-receiving lens system 42 with each light-receiving element. At this time, the reflected light of the linear irradiation light in the detection area Ad is received as a beam that spreads linearly along the Y-axis direction.
  • The light receiver 45 integrally includes an output circuit 48.
  • The output circuit 48 performs sampling processing according to the control signal from the control device 1 in the detection frame Fd shown in FIG. 6.
  • The detection frame Fd is repeated at predetermined time intervals while the vehicle 5 is running. In the sampling process, the output circuit 48 generates a detection signal for each scanning line based on the output signal from the light receiver 45 for each detection frame Fd.
  • the detection signal thus generated is output from the output circuit 48 to the control device 1 as shown in FIG.
  • the control device 1 shown in FIG. 1 is connected to the optical sensor 10 via at least one of, for example, a LAN (Local Area Network), a wire harness, an internal bus, and the like.
  • the control device 1 includes at least one dedicated computer.
  • the dedicated computer that constitutes the control device 1 may be a sensor ECU (Electronic Control Unit) specialized for controlling the optical sensor 10. In this case, the sensor ECU is housed inside the housing 11.
  • a dedicated computer that constitutes the control device 1 may be an operation control ECU that controls the operation of the vehicle 5 .
  • a dedicated computer that configures the control device 1 may be a navigation ECU that navigates the travel route of the vehicle 5 .
  • a dedicated computer that constitutes the control device 1 may be a locator ECU that estimates the self-state quantity of the vehicle 5 .
  • the dedicated computer that constitutes the control device 1 has at least one memory 1a and one processor 1b.
  • The memory 1a is at least one type of non-transitory tangible storage medium, such as a semiconductor memory, a magnetic medium, or an optical medium, that non-temporarily stores computer-readable programs and data.
  • The processor 1b includes, as its core, at least one of, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a RISC (Reduced Instruction Set Computer) CPU, a DFP (Data Flow Processor), and a GSP (Graph Streaming Processor).
  • the processor 1b executes a plurality of instructions included in the control program stored in the memory 1a. Thereby, the control device 1 constructs a plurality of functional blocks for controlling the optical sensor 10 .
  • the control program stored in the memory 1a for controlling the optical sensor 10 causes the processor 1b to execute a plurality of instructions, thereby constructing a plurality of functional blocks.
  • a plurality of functional blocks constructed by the control device 1 include a main acquisition block 100, a sub-acquisition block 110, a background acquisition block 120, an intensity interpolation block 130, and a distance detection block 140, as shown in FIG.
  • The main acquisition block 100 controls the optical sensor 10 to the LD mode, in which each laser diode 24 in the oscillating state emits pulses during the main period Pm set in the detection frame Fd, as shown in FIG. 6.
  • the detection area Ad is irradiated with intermittent pulses of irradiation light having the main irradiation intensity Iim from the optical sensor 10 during the main period Pm.
  • The main acquisition block 100 shown in FIG. 3 acquires, for each light-receiving pixel 46, the received intensity of the reflected light received from the detection area Ad by the optical sensor 10 with respect to the irradiation light of the main irradiation intensity Iim emitted by pulse emission in the main period Pm.
  • The received intensity of each light-receiving pixel 46 acquired in this way, as shown in FIGS. 3 and 8, is defined as the main light-receiving intensity Irm.
  • the main acquisition block 100 generates main image data Dm by converting the main received light intensity Irm for each light receiving pixel 46 into data.
  • The main acquisition block 100 controls the rotational drive of the scanning mirror 32 by the scanning motor 35 over the entire drive section in synchronization with the emission timing tm of the pulse emission shown in FIG. 6. The main acquisition block 100 thus acquires the main light-receiving intensity Irm for each light-receiving pixel 46 for each of a plurality of scanning lines according to the rotation angle of the scanning mirror 32, and generates the main image data Dm for each scanning line as shown in FIG. 8.
  • The sub-acquisition block 110 shown in FIG. 3 controls the optical sensor 10 to the LED mode, in which each laser diode 24 in the non-oscillating state emits light continuously during the emission period Psl of the sub period Ps set in the detection frame Fd.
  • the sub-period Ps is defined as a period before or after the main period Pm in the same detection frame Fd.
  • the sub-period Ps in this embodiment is defined as a period continuously following the main period Pm.
  • The emission period Psl in this embodiment is defined as a part of the sub period Ps biased toward its front side.
  • the detection area Ad is continuously irradiated with the irradiation light of the sub irradiation intensity Iis lower than the main irradiation intensity Iim from the optical sensor 10 controlled to the LED mode.
  • The sub-acquisition block 110 shown in FIG. 3 acquires, for each light-receiving pixel 46, the received intensity of the reflected light received from the detection area Ad by the optical sensor 10 with respect to the irradiation light of the sub irradiation intensity Iis emitted by continuous emission in the emission period Psl. The received intensity of each light-receiving pixel 46 acquired in this way, as shown in FIGS. 3 and 9, is defined as the sub light-receiving intensity Irs. Under this definition, the sub-acquisition block 110 generates the sub-image data Ds by converting the sub light-receiving intensity Irs for each light-receiving pixel 46 into data.
  • The sub-acquisition block 110 controls the rotational drive of the scanning mirror 32 by the scanning motor 35 over the entire drive section in parallel with the continuous emission shown in FIG. 6. The sub-acquisition block 110 thus acquires the sub light-receiving intensity Irs for each light-receiving pixel 46 for each of a plurality of scanning lines according to the rotation angle of the scanning mirror 32, and generates the sub-image data Ds for each scanning line as shown in FIG. 9.
  • The background acquisition block 120 shown in FIG. 3 controls the optical sensor 10 to a stop mode, in which the application of current to each laser diode 24 is stopped during the stop period Pss of the sub period Ps set in the detection frame Fd, as shown in FIG. 6.
  • The stop period Pss in this embodiment is defined as a part of the sub period Ps behind the emission period Psl.
  • The stop period Pss of the present embodiment is set to have substantially the same time length Δt as the emission period Psl of continuous emission. During the stop period Pss, irradiation of the detection area Ad from the optical sensor 10 controlled to the stop mode remains stopped.
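  • The timing relationships described above (a main period Pm for pulsed LD-mode emission, followed by a sub period Ps split into an emission period Psl and a stop period Pss of substantially the same length Δt) can be summarized in a small sketch. The durations below are placeholders, since the disclosure gives no concrete values.

```python
# Sketch of one detection frame Fd: Pm for LD-mode pulses at Iim,
# then Ps = Psl (LED-mode continuous emission at Iis) + Pss (emission
# stopped, background light only). Durations in ms are assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class DetectionFrame:
    pm: float = 40.0   # main period Pm (assumed duration)
    psl: float = 5.0   # emission period Psl of the sub period Ps
    pss: float = 5.0   # stop period Pss, substantially equal to Psl

    @property
    def ps(self) -> float:
        """Sub period Ps, directly following the main period Pm."""
        return self.psl + self.pss

    @property
    def total(self) -> float:
        """Length of the whole detection frame Fd."""
        return self.pm + self.ps
```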
  • the background acquisition block 120 shown in FIG. 3 acquires the received light intensity of the background light received by the optical sensor 10 from the detection area Ad during the stop period Pss for each light receiving pixel 46 .
  • the received light intensity of each light receiving pixel 46 acquired as shown in FIGS. 3 and 10 is defined as the background received light intensity Irb that is lower than the corresponding sub received light intensity Irs.
  • the background acquisition block 120 generates background image data Db by digitizing the background received light intensity Irb for each light receiving pixel 46 .
  • The background acquisition block 120 controls the rotational drive of the scanning mirror 32 by the scanning motor 35 over the entire drive section in parallel with the stoppage of light emission of each laser diode 24 shown in FIG. 6. The background acquisition block 120 thus acquires the background light-receiving intensity Irb for each light-receiving pixel 46 for each of a plurality of scanning lines according to the rotation angle of the scanning mirror 32, and generates the background image data Db for each scanning line as shown in FIG. 10.
  • The intensity complementing block 130 shown in FIG. 3 judges whether the main light-receiving intensity Irm of each light-receiving pixel 46 in the main image data Dm for each scanning line has reached the upper limit intensity Iru of that light-receiving pixel 46.
  • For the unsaturated pixels 460 whose main light-receiving intensity Irm does not reach the upper limit intensity Iru, as shown in FIG. 11, the intensity complementing block 130 maintains the value acquired by the main acquisition block 100 as the final value of the main light-receiving intensity Irm.
  • For the saturated pixels 461 whose main light-receiving intensity Irm reaches the upper limit intensity Iru, as shown in FIG. 11, the intensity complementing block 130 complements the value acquired by the main acquisition block 100.
  • The complementing of the main light-receiving intensity Irm of a saturated pixel 461 is based on the sub light-receiving intensity Irs of that saturated pixel 461 in the sub-image data Ds of the scanning line containing it.
  • Specifically, the intensity complementing block 130 corrects the sub light-receiving intensity Irs of the saturated pixel 461 by subtracting from it the background light-receiving intensity Irb of the saturated pixel 461 in the background image data Db of the same scanning line.
  • the intensity complementing block 130 converts the sub-light receiving intensity Irs of the saturated pixel 461 thus corrected into a complementing intensity Irc for the main received light intensity Irm of the saturated pixel 461 .
  • The corrected value, that is, the difference between the sub light-receiving intensity Irs in the emission period Psl and the background light-receiving intensity Irb in the stop period Pss having substantially the same time length as the emission period Psl, is expressed by Equation 1.
  • The complementing intensity Irc is estimated as the difference between the true value Irt of the main light-receiving intensity Irm, which represents the actual reflection intensity, and the upper limit intensity Iru at which the saturated pixel 461 is clipped, as shown in FIG. 11. The intensity complementing block 130 therefore complements the main light-receiving intensity Irm based on the sub light-receiving intensity Irs by adding the complementing intensity Irc, as expressed by Equation 2, to the main light-receiving intensity Irm of the saturated pixel 461 that has reached the upper limit intensity Iru.
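  • Since Equations 1 and 2 are referenced but not reproduced in this text, the following Python sketch shows one plausible reading of the complementation: the sub light-receiving intensity Irs is corrected by subtracting the background light-receiving intensity Irb (corresponding to Equation 1), converted into the complementing intensity Irc, and added to the upper-limited main light-receiving intensity Irm (corresponding to Equation 2). The linear conversion via the ratio of the main to the sub irradiation intensity is an assumption made for illustration, not a relation stated in the disclosure.

```python
import numpy as np


def complement_main_intensity(irm, irs, irb, iru, ratio):
    """Complement the main light-receiving intensity Irm of saturated pixels.

    irm, irs, irb, iru -- per-pixel arrays for one scanning line: main, sub,
                          and background light-receiving intensities and the
                          upper limit intensity Iru
    ratio              -- assumed linear conversion from the corrected sub
                          intensity to an estimate of the true value Irt,
                          e.g. Iim / Iis (an illustrative assumption)
    """
    irm = np.asarray(irm, dtype=float).copy()
    irs = np.asarray(irs, dtype=float)
    irb = np.asarray(irb, dtype=float)
    iru = np.asarray(iru, dtype=float)

    saturated = irm >= iru                 # saturated pixels 461
    irs_corrected = irs - irb              # Equation 1: background correction
    irt_estimate = ratio * irs_corrected   # estimated true value Irt
    irc = irt_estimate - iru               # complementing intensity Irc = Irt - Iru
    # Equation 2: add Irc to the main intensity clipped at Iru.
    irm[saturated] = (iru + irc)[saturated]
    return irm
```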
  • The intensity complementing block 130 synthesizes the main image data Dm as intensity image data in which, for each light-receiving pixel 46, the main light-receiving intensity Irm is maintained where the upper limit intensity Iru was not reached and complemented where it was reached.
  • the main image data Dm synthesized in this manner is data obtained by detecting the main light receiving intensity Irm for each light receiving pixel 46 for each scanning line as the reflection intensity from the target Tr in the detection area Ad.
  • the synthesized main image data Dm is used in the automatic driving control mode of the vehicle 5, so that the multiple targets Tr can be accurately separated and identified in the recognition process, for example.
  • the intensity complementing block 130 may store the combined main image data Dm in the memory 1a in association with at least one type of, for example, the time stamp and the driving environment information of the vehicle 5.
  • The intensity complementing block 130 may also associate the synthesized main image data Dm with at least one of, for example, a time stamp and driving environment information of the vehicle 5, and transmit the data to an external center so that it is accumulated in the storage medium of the external center.
  • The distance detection block 140 shown in FIG. 3 detects the reflection point distance of the reflected light received from the detection area Ad by the optical sensor 10, with respect to the irradiation light of the main irradiation intensity Iim in the main period Pm, based on the main image data Dm synthesized by the intensity complementing block 130. The detection of the reflection point distance is realized by dTOF (direct Time Of Flight), based on the flight time of the irradiation light from the pulse emission timing tm until the peak intensity of the main light-receiving intensity Irm is acquired. Errors in the reflection point distance caused by flare or the like from the reflected light of a high-reflectance target Tr may be corrected based on the synthesized main image data Dm.
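  • As a concrete illustration of the dTOF relation, the distance follows from the round-trip flight time and the speed of light; this is generic time-of-flight physics rather than anything specific to the disclosure.

```python
C = 299_792_458.0  # speed of light in vacuum [m/s]


def reflection_point_distance(t_emit: float, t_peak: float) -> float:
    """dTOF distance from the pulse emission timing tm to the time at which
    the peak of the main light-receiving intensity Irm is acquired.
    The flight time covers the round trip, hence the division by 2."""
    return C * (t_peak - t_emit) / 2.0


# Example: a peak acquired 200 ns after emission corresponds to about 30 m.
print(reflection_point_distance(0.0, 200e-9))  # ~29.98 m
```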
  • the distance detection block 140 generates distance image data Dd as shown in FIG. 3 by synthesizing the reflection point distance data obtained for each light receiving pixel 46 for each scanning line.
  • The generated distance image data Dd is used together with the main image data Dm in the automatic driving control mode of the vehicle 5; for example, in recognition processing, the distances to multiple targets Tr can be accurately separated and grasped.
  • The distance detection block 140 may store the distance image data Dd in the memory 1a, or transmit it to an external center for accumulation, in the same manner as the main image data Dm handled by the intensity complementing block 130.
  • The control method by which the control device 1 controls the optical sensor 10 of the vehicle 5 is executed according to the control flow shown in FIG. 12. This control flow is repeatedly executed for each detection frame Fd while the vehicle 5 is running.
  • Each "S" in the control flow means a plurality of steps executed by a plurality of instructions included in the control program.
  • In S10, the main acquisition block 100 acquires, for each light-receiving pixel 46 of each scanning line, the main light-receiving intensity Irm of the reflected light received from the detection area Ad by the optical sensor 10 with respect to the irradiation light of the main irradiation intensity Iim emitted by pulse emission in the main period Pm. The main acquisition block 100 then generates the main image data Dm for each scanning line by converting the acquired main light-receiving intensity Irm for each light-receiving pixel 46 into data.
  • In S11, the sub-acquisition block 110 acquires, for each light-receiving pixel 46 of each scanning line, the sub light-receiving intensity Irs of the reflected light received from the detection area Ad by the optical sensor 10 with respect to the irradiation light of the sub irradiation intensity Iis emitted by continuous emission in the emission period Psl of the sub period Ps. The sub-acquisition block 110 then generates the sub-image data Ds for each scanning line by converting the acquired sub light-receiving intensity Irs for each light-receiving pixel 46 into data.
  • In S12, the background acquisition block 120 acquires, for each light-receiving pixel 46 of each scanning line, the background light-receiving intensity Irb of the background light received from the detection area Ad by the optical sensor 10 during the stop period Pss of the sub period Ps. The background acquisition block 120 then generates the background image data Db for each scanning line by converting the acquired background light-receiving intensity Irb for each light-receiving pixel 46 into data.
  • In S13, the intensity complementing block 130 judges whether the main light-receiving intensity Irm of each light-receiving pixel 46 in the main image data Dm generated for each scanning line in S10 has reached the upper limit intensity Iru of that light-receiving pixel 46. For the unsaturated pixels 460 whose main light-receiving intensity Irm has not reached the upper limit intensity Iru, the intensity complementing block 130 maintains, in S14, the value acquired in S10 as the final value of the main light-receiving intensity Irm.
  • For the saturated pixels 461 whose main light-receiving intensity Irm has reached the upper limit intensity Iru, the intensity complementing block 130 complements, in S15, the value acquired in S10.
  • Specifically, the intensity complementing block 130 corrects the sub light-receiving intensity Irs of the saturated pixel 461 acquired in S11 using the background light-receiving intensity Irb of the saturated pixel 461 acquired in S12, and then complements the main light-receiving intensity Irm of the saturated pixel 461 acquired in S10 based on the corrected sub light-receiving intensity Irs.
  • After S14 or S15, the control flow moves to S16.
  • In S16, the intensity complementing block 130 synthesizes the main image data Dm in which, for each light-receiving pixel 46, the main light-receiving intensity Irm is maintained where the upper limit intensity Iru was not reached and complemented where it was reached.
  • In S17, the distance detection block 140 generates the distance image data Dd by detecting the reflection point distance of the reflected light received from the detection area Ad by the optical sensor 10, with respect to the irradiation light of the main irradiation intensity Iim in the main period Pm, based on the main image data Dm synthesized in S16.
  • At least one of the image data Dm and Dd from S16 and S17 may be stored in the memory 1a, or may be transmitted to an external center and accumulated there.
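  • Putting S10 to S16 together, the following sketch runs complement_main_intensity from the earlier example on synthetic single-line data. The upper limit of 255 and the intensity ratio of 10 are invented for the demonstration, and the sub intensities are generated from an idealized linear model.

```python
import numpy as np

iru = np.full(6, 255.0)                                   # upper limit Iru
irt = np.array([40.0, 120.0, 240.0, 300.0, 600.0, 90.0])  # true reflection
irb = np.full(6, 2.0)                                     # background (S12)
ratio = 10.0                                              # assumed Iim/Iis

irm = np.minimum(irt, iru)  # S10: main intensities clip at the upper limit
irs = irt / ratio + irb     # S11: idealized sub intensities plus background

# S13 to S16: unsaturated pixels keep their values, saturated ones are
# complemented; dm is the synthesized intensity image data for the line.
dm = complement_main_intensity(irm, irs, irb, iru, ratio)
print(dm)  # [ 40. 120. 240. 300. 600.  90.] - the clipped 300/600 recovered
```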
  • According to the present embodiment described above, the main light-receiving intensity Irm of the reflected light received from the detection area Ad by the optical sensor 10 is acquired for each light-receiving pixel 46 with respect to the irradiation light of the main irradiation intensity Iim emitted from the optical sensor 10. In addition, the sub light-receiving intensity Irs of the reflected light received from the detection area Ad by the optical sensor 10, with respect to the irradiation light of the sub irradiation intensity Iis lower than the main irradiation intensity Iim, is also acquired for each light-receiving pixel 46.
  • For a saturated pixel 461 whose main light-receiving intensity Irm has reached the upper limit intensity Iru, the difference from the true value Irt exceeding the upper limit intensity Iru can be complemented by using the sub light-receiving intensity Irs, which does not reach the upper limit intensity Iru. Therefore, detecting the main light-receiving intensity Irm complemented in this way as the reflection intensity from the target Tr can increase the dynamic range of the optical sensor 10.
  • According to the present embodiment, the main light-receiving intensity Irm that has reached the upper limit intensity Iru can be complemented accurately by adding the complementing intensity Irc, which is estimated from the sub light-receiving intensity Irs as the difference between the true value Irt of the main light-receiving intensity Irm and the upper limit intensity Iru. Therefore, the dynamic range of the optical sensor 10 can be increased appropriately.
  • In the present embodiment, the main light-receiving intensity Irm for each light-receiving pixel 46 is acquired with respect to the irradiation light of the main irradiation intensity Iim emitted by pulse emission, while the sub light-receiving intensity Irs for each light-receiving pixel 46 is acquired with respect to the irradiation light of the sub irradiation intensity Iis emitted by continuous emission.
  • In the present embodiment, the main light-receiving intensity Irm for each light-receiving pixel 46 is acquired with respect to the irradiation light of the main irradiation intensity Iim from the multiple laser diodes 24 controlled to the oscillating state in the optical sensor 10, while the sub light-receiving intensity Irs for each light-receiving pixel 46 is acquired with respect to the irradiation light of the sub irradiation intensity Iis from the same laser diodes 24 controlled to the non-oscillating state.
  • According to this, the dynamic range can be increased by complementing the main light-receiving intensity Irm based on the sub light-receiving intensity Irs obtained with the common laser diodes 24.
  • In the present embodiment, the main light-receiving intensity Irm for each light-receiving pixel 46 is acquired with respect to the irradiation light of the main irradiation intensity Iim emitted in the main period Pm, while the sub light-receiving intensity Irs for each light-receiving pixel 46 is acquired with respect to the irradiation light of the sub irradiation intensity Iis emitted in the sub period Ps before or after the main period Pm. According to this, the light-receiving pixels 46 that receive the main light-receiving intensity Irm and the sub light-receiving intensity Irs from the same target Tr in the detection area Ad of the moving vehicle 5 are easily made common. It is therefore possible to appropriately increase the dynamic range of the optical sensor 10.
  • the background light reception intensity Irb of the background light received from the detection area Ad by the optical sensor 10 is obtained for each light receiving pixel 46 during the stop period Pss during which the irradiation light is stopped. Therefore, based on the sub-light-receiving intensity Irs corrected by the background light-receiving intensity Irb, the main light-receiving intensity Irm that has reached the upper limit intensity Iru can be complemented with high accuracy while excluding the influence of the background light. Therefore, it is possible to appropriately increase the dynamic range of the optical sensor 10 as well as the detection accuracy of the reflection intensity.
  • In the present embodiment, the reflection point distance of the reflected light received from the detection area Ad by the optical sensor 10, with respect to the irradiation light of the main irradiation intensity Iim, is detected based on the main image data Dm including the complemented main light-receiving intensity Irm.
  • According to this, the reflection point distance can also be detected with high accuracy by utilizing the highly accurate reflection intensity resulting from the high dynamic range. That is, the dynamic range of the optical sensor 10 can be increased appropriately together with the detection accuracy of both the reflection intensity and the reflection point distance.
  • In a modified example, the dedicated computer constituting the control device 1 may be a computer outside the vehicle 5 that constitutes an external center or a mobile terminal capable of communicating with the vehicle 5.
  • the dedicated computer that constitutes the control device 1 may have at least one of digital circuits and analog circuits as a processor.
  • Digital circuits here include at least one of, for example, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), an SOC (System on a Chip), a PGA (Programmable Gate Array), and a CPLD (Complex Programmable Logic Device). Such digital circuits may also have a memory that stores the program.
  • the mechanical scanning of the detection area Ad by the scanning mirror 32 may be substantially limited to scanning in the vertical direction.
  • the longitudinal sides of the rectangular contours of the light projecting window 25 and the light receiving surface 47 may be defined along the X-axis direction.
  • the mechanical scanning of the detection area Ad by the scanning mirror 32 may be both scanning in the horizontal direction and scanning in the vertical direction.
  • the optical sensor 10 of the modified example may adopt various two-dimensional or three-dimensional scanning methods such as a rotary type, a MEMS (Micro Electro Mechanical Systems) type, or a Lissajous type.
  • the light projector 22 that emits the irradiation light with the main irradiation intensity Iim and the light projector 22 that emits the irradiation light with the sub-irradiation intensity Iis may be provided separately.
  • a light emitting diode (LED) may be used as the light projector 22 that emits the irradiation light of the sub irradiation intensity Iis.
  • the main period Pm may be defined as a period continuously following the sub-period Ps.
  • a light emission period Psl biased toward the rear side of the stop period Pss may be defined in a sub-period Ps before or after the main period Pm.
  • the stop period Pss may be omitted from the sub-period Ps.
  • the background received light intensity Irb may be estimated using, for example, an algorithm based on at least one of the main received light intensity Irm and the sub received light intensity Irs.
  • the correction of the sub received light intensity Irs using the background received light intensity Irb may be omitted.
  • the distance detection block 140 and S17 may be omitted.
  • the vehicle to which the control device 1 is applied may be, for example, a traveling robot or the like whose traveling on the traveling road can be remotely controlled from an external center.
  • the above-described embodiments and modifications may be implemented as a semiconductor device (for example, a semiconductor chip or the like) having at least one processor 1b and at least one memory 1a.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

This control device controls an optical sensor that receives, via a plurality of light-receiving pixels, light reflected by a target with respect to irradiation light projected onto a detection area of a vehicle. A processor of the control device is configured to: acquire, with respect to the irradiation light of a main irradiation intensity (Iim) emitted from the optical sensor, the light-receiving intensity of the reflected light received from the detection area by the optical sensor, that light-receiving intensity being acquired as a main light-receiving intensity (Irm) for each light-receiving pixel; acquire, with respect to the irradiation light of a sub irradiation intensity (Iis), lower than the main irradiation intensity (Iim), emitted from the optical sensor, the light-receiving intensity of the reflected light received from the detection area by the optical sensor, that light-receiving intensity being acquired as a sub light-receiving intensity (Irs) for each light-receiving pixel; and complement, among the main light-receiving intensities (Irm) acquired for the light-receiving pixels, the main light-receiving intensity (Irm) of a saturated pixel (461) that has reached an upper limit intensity (Iru), based on the sub light-receiving intensity (Irs) of that saturated pixel.
PCT/JP2022/038730 2021-11-02 2022-10-18 Control device, control method, and control program WO2023079944A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-179617 2021-11-02
JP2021179617A JP2023068471A (ja) 2021-11-02 2021-11-02 Control device, control method, and control program

Publications (1)

Publication Number Publication Date
WO2023079944A1 true WO2023079944A1 (fr) 2023-05-11

Family

ID=86241481

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/038730 WO2023079944A1 (fr) 2021-11-02 2022-10-18 Control device, control method, and control program

Country Status (2)

Country Link
JP (1) JP2023068471A (fr)
WO (1) WO2023079944A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030117412A1 (en) * 2001-12-21 2003-06-26 General Electric Company Method for high dynamic range image construction based on multiple images with multiple illumination intensities
US20080319690A1 (en) * 2007-06-20 2008-12-25 Usa As Represented By The Administrator Of The National Aeronautics & Space Administration Forward Voltage Short-Pulse Technique for Measuring High Power Laser Diode Array Junction Temperature
WO2015025497A1 (fr) * 2013-08-23 2015-02-26 パナソニックIpマネジメント株式会社 Système de mesure de la distance et dispositif de génération de signaux
WO2017183114A1 (fr) * 2016-04-19 2017-10-26 株式会社日立エルジーデータストレージ Dispositif et procédé de génération d'images de distance


Also Published As

Publication number Publication date
JP2023068471A (ja) 2023-05-17

Similar Documents

Publication Publication Date Title
US11609329B2 (en) Camera-gated lidar system
JP2020003236A (ja) Distance measuring device, mobile object, distance measuring method, and distance measuring system
US20150204977A1 (en) Object detection device and sensing apparatus
CN105723239A (zh) Distance measurement imaging system
CN111157977B (zh) LiDAR peak detection using time-to-digital converters and multi-pixel photon counters for autonomous driving vehicles
WO2020189339A1 (fr) Distance measurement device and distance measurement method
WO2020075525A1 (fr) Sensor fusion system, synchronization control device, and synchronization control method
US10877134B2 (en) LIDAR peak detection using splines for autonomous driving vehicles
CN113423620A (zh) Dirt detection system, LiDAR unit, vehicle sensing system, and vehicle
US11520019B2 (en) Light signal detection device, range finding device, and detection method
CN111587381A (zh) Method for adjusting the movement speed of a scanning element, distance measuring device, and mobile platform
JP7331083B2 (ja) Vehicle sensing system and vehicle
US20060092004A1 (en) Optical sensor
WO2023079944A1 (fr) Dispositif de commande, procédé de commande et programme de commande
EP3540467B1 (fr) Système de recherche de portée, procédé de recherche de portée, dispositif embarqué et véhicule
JP7338455B2 (ja) Object detection device
WO2023074207A1 (fr) Control device, control method, and control program
WO2023042637A1 (fr) Control device, control method, and control program
WO2023047928A1 (fr) Control device, method, and program
WO2023074208A1 (fr) Control device, method, and program
WO2024111239A1 (fr) Control device, optical detection system, control method, and control program
WO2023074206A1 (fr) Control device, method, and program
WO2022249838A1 (fr) Sensor control device, sensor control method, and sensor control program
JP2023048113A (ja) Control device, control method, control program
WO2023149335A1 (fr) Ranging device and ranging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22889765

Country of ref document: EP

Kind code of ref document: A1