WO2019107550A1 - Control device, detection device, control method, program and storage medium - Google Patents

Control device, detection device, control method, program and storage medium

Info

Publication number
WO2019107550A1
Authority
WO
WIPO (PCT)
Prior art keywords
control device
information
current position
slope
detection
Prior art date
Application number
PCT/JP2018/044213
Other languages
English (en)
Japanese (ja)
Inventor
野中 慶也
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation (パイオニア株式会社)
Publication of WO2019107550A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40: Means for monitoring or calibrating

Definitions

  • The present invention relates to technology for adjusting the detection range of a detection device used for object detection.
  • Patent Document 1 discloses a laser radar control device comprising: an irradiation detection unit that emits laser light from a vehicle to its surroundings and detects the reflected light; a gradient detection unit that detects the presence or absence of a gradient change in the road ahead of the vehicle; and an output control unit that suppresses the irradiation output of the laser light when a gradient change is detected by the gradient detection unit.
  • When a vehicle travels near a slope, an object on the slope may not be included in the irradiation range of the laser light or the like emitted to detect objects.
  • In that case, there is a risk that the object to be detected cannot be detected accurately and that object detection accuracy decreases.
  • The present invention has been made to solve the above problems, and its main object is to provide a control device and a detection device capable of suitably adjusting the detection range of a detection device for detecting an object.
  • The invention described in the claims is a control device for controlling a detection device capable of detecting an object present in the periphery of a moving body, the control device including: an acquisition unit that acquires first information on an altitude difference between the current position of the moving body and a position separated by a predetermined distance from the current position; and a control unit that controls the detection device, based on the first information, to change the direction of the detection range of the detection device with respect to the moving body.
  • The invention described in the claims is also a detection device including: an irradiation unit that irradiates an electromagnetic wave toward the periphery of a moving body; a receiving unit that receives the electromagnetic wave reflected by an object; and a control unit that controls the irradiation unit so that the direction of the irradiation range with respect to the moving body is changed.
  • The invention described in the claims is also a control method executed by a control device that controls a detection device capable of detecting an object present in the periphery of a moving body, the method including: an acquisition step of acquiring first information on an altitude difference between the current position of the moving body and a position separated by a predetermined distance from the current position; and a control step of controlling the detection device, based on the first information, to change the direction of the detection range of the detection device with respect to the moving body.
  • The invention recited in the claims is also a program executed by a computer that controls a detection device capable of detecting an object present in the periphery of a moving body, the program causing the computer to function as: an acquisition unit that acquires first information on an altitude difference between the current position of the moving body and a position separated by a predetermined distance from the current position; and a control unit that controls the detection device, based on the first information, to change the direction of the detection range of the detection device with respect to the moving body.
  • FIG. 2 shows a block configuration of a control device.
  • A schematic configuration example of the lidar is shown.
  • A cross-sectional view of the road, taken along the traveling direction of the vehicle, is shown for the case where the vehicle travels near an uphill.
  • A cross-sectional view of the road, taken along the traveling direction of the vehicle, is shown for the case where the vehicle travels near a downhill.
  • FIG. 6 shows an example of the electronic adjustment of the actual scan range of the lidar.
  • A cross-sectional view of the road, taken along the traveling direction of the vehicle, is shown for the case where the vehicle travels near an uphill.
  • A cross-sectional view of the road, taken along the traveling direction of the vehicle, is shown for the case where the vehicle travels near a downhill.
  • One aspect of the present invention is a control device for controlling a detection device capable of detecting an object present in the periphery of a moving body, the control device including: an acquisition unit that acquires first information on a height difference between the current position of the moving body and a position separated by a predetermined distance from the current position; and a control unit that controls the detection device, based on the first information, to change the direction of the detection range of the detection device with respect to the moving body.
  • the control device can suitably control the detection range of the detection device so as to be able to detect an object at a position separated by a predetermined distance even when the mobile object is present near a slope.
  • In one mode of the control device, when the position separated by the predetermined distance is lower than the current position, the control unit lowers the direction of the detection range compared with the case where the position separated by the predetermined distance is at the same height as the current position.
  • According to this mode, the control device can suitably control the detection range of the detection device so that an object at the position separated by the predetermined distance can be detected even when the detection direction of the detection device is directed toward a slope.
  • In another mode, the control device further includes an attitude information acquisition unit that acquires attitude information indicating the attitude of the moving body, and the control unit controls the detection device to change the direction based on the first information and the attitude information.
  • the control device can determine the detection range of the detection device by suitably considering the influence of the posture of the mobile body itself on the detection range of the detection device.
  • the posture information acquisition unit may acquire, as posture information, an output signal of a 3-axis acceleration sensor or the like installed on a moving body. Further, the posture information acquisition unit may be configured by a three-axis acceleration sensor provided in the control device.
  • In another mode of the control device, the control unit controls the detection device to change the direction based on the first information and a gradient at the position separated by the predetermined distance.
  • According to this mode, the control device can accurately determine the detection range of the detection device even when the position separated by the predetermined distance is on a slope or the like.
  • In another mode, the acquisition unit acquires, as the first information, information on the elevation difference between the current position and a position separated from the current position by a predetermined distance in the direction of the detection range.
  • In another mode, the control unit controls the detection device to change the elevation angle of the detection range based on the first information.
  • According to this mode, the control device can suitably determine the elevation angle of the detection range of the detection device so that an object at a position separated by the predetermined distance in the direction in which the detection range is directed can be detected.
  • Another aspect of the present invention is a detection device including: an irradiation unit that irradiates an electromagnetic wave or an ultrasonic wave toward the periphery of a moving body; a receiving unit that receives the electromagnetic wave or ultrasonic wave reflected by an object; an acquisition unit that acquires first information on the slope of an incline existing in a direction in which the moving body may travel; and a control unit that controls the irradiation unit, based on the first information, to change the direction, with respect to the moving body, of the irradiation range irradiated with the electromagnetic wave or ultrasonic wave.
  • Here, “controlling the irradiation unit to change the direction, with respect to the moving body, of the irradiation range irradiated with electromagnetic waves or ultrasonic waves” may include control that changes the actual irradiation range within the range in which the irradiation unit can emit electromagnetic waves or ultrasonic waves, control that mechanically (physically) changes the arrangement position of the irradiation unit, and control that mechanically (physically) changes the arrangement angle of the irradiation unit (its installation angle with respect to the moving body).
  • the detection device can suitably control the detection range so that an object at a position separated by a predetermined distance can be detected even when the moving object is present near a slope.
  • Another aspect of the present invention is a control method executed by a control device that controls a detection device capable of detecting an object present in the periphery of a moving body, the method including: a first acquisition step of acquiring first information on the slope of an incline existing in a direction in which the moving body may travel; and a control step of controlling the detection device, based on the first information, to change the direction of the detection range of the detection device with respect to the moving body. By executing this control method, the control device can suitably control the detection range so that an object at a position separated by a predetermined distance can be detected even when the moving body is present near a slope.
  • Another aspect of the present invention is a program executed by a computer that controls a detection device capable of detecting an object present in the periphery of a moving body, the program causing the computer to function as: an acquisition unit that acquires first information on the slope of an incline existing in a direction in which the moving body may travel; and a control unit that controls the detection device, based on the first information, to change the direction of the detection range of the detection device with respect to the moving body.
  • the computer can suitably control the detection range so that an object at a position separated by a predetermined distance can be detected even when the mobile object is near a slope.
  • the program is stored in a storage medium.
  • FIG. 1 is a schematic configuration of a measurement system according to a first embodiment.
  • The measurement system is a system that performs measurement for automatic driving of the vehicle 5, and mainly includes the control device 1 and a sensor group including a lidar (Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 3.
  • The control device 1 is connected to the sensor group including the lidar 3 by wire or wirelessly, performs high-accuracy position estimation of the vehicle 5 based on the output data of the sensor group, and controls the automatic driving of the vehicle 5. Further, in the present embodiment, the control device 1 controls the object detection range of the lidar 3 by controlling the laser irradiation direction of the lidar 3.
  • The control device 1 may be incorporated in the vehicle 5 as an electronic control unit (ECU) that automatically controls the operation of the vehicle 5, or may be an in-vehicle device or the like that transmits control signals related to automatic driving to the vehicle 5. In another example, the control device 1 may be configured as part of the lidar 3.
  • The lidar 3 discretely measures the distance to objects in the external world by emitting a pulsed laser, which is an electromagnetic wave, over a predetermined angle range in the horizontal and vertical directions, and generates three-dimensional point cloud information indicating the positions of those objects.
  • the lidar 3 includes an irradiation unit that emits a laser beam while changing the irradiation direction, and a reception unit that receives the reflected light (scattered light) of the irradiated laser beam.
  • the point cloud information is generated based on the irradiation direction corresponding to the laser beam received by the light receiving unit and the response delay time of the laser beam specified based on the above-described light receiving signal.
  • The lidar 3 includes at least the area ahead of the vehicle (in the traveling direction) within the irradiation range of the laser light.
  • a range in which the laser light from the lidar 3 is irradiated and which is within the maximum ranging distance of the lidar 3 is also referred to as a “lidar detection range RL”.
  • The lidar 3 is provided with an angle adjustment mechanism (not shown), and is configured to adjust the vertical angle (i.e., the elevation/depression angle) of the irradiation unit based on a control signal supplied from the control device 1.
  • The lidar 3 is provided on the roof of the vehicle 5 as an example, but is not limited to this and may be provided on any part of the vehicle 5.
  • For example, the lidar 3 may be fitted near the midpoint between two headlights (not shown) provided on the vehicle 5.
  • the lidar 3 functions as a detection device.
  • FIG. 2 shows a block configuration of the control device 1.
  • the control device 1 includes an input unit 11, a storage unit 12, an interface 13, a notification unit 14, a communication unit 15, and a control unit 16.
  • the control unit 16 and the other elements are configured to be able to transmit and receive signals via a bus or the like.
  • the input unit 11 is a button operated by the user, a touch panel, a remote controller, a voice input device, or the like, and receives various inputs such as switching between automatic operation and manual operation.
  • the storage unit 12 stores a program executed by the control unit 16 and information necessary for the control unit 16 to execute predetermined processing.
  • the storage unit 12 stores the map DB 20.
  • the map DB 20 may be used for navigation at the time of manual operation of the vehicle, or may be used for operation control of the vehicle at the time of automatic operation of the vehicle.
  • The map DB 20 includes road data represented by links and nodes, facility information, feature information on features to be detected by the lidar 3, and the like.
  • the interface 13 supplies output data from various sensor groups provided in the vehicle 5 to the control unit 16 and supplies a control signal from the control unit 16 to a specific sensor of the sensor group.
  • the sensor group includes the attitude sensor 4 and the GPS receiver 7 in addition to the lidar 3 described above.
  • the attitude sensor 4 is, for example, a three-axis acceleration sensor, and is provided as a sensor for detecting the attitude of the vehicle 5.
  • A sensor for detecting the attitude of the lidar 3 may be further provided.
  • the notification unit 14 is, for example, a display, a speaker, or the like that performs output based on control of the control unit 16.
  • the communication unit 15 performs data communication with an external device based on the control of the control unit 16.
  • The control unit 16 includes a CPU or the like that executes a program, and controls the entire measurement system. In the present embodiment, the control unit 16 controls the irradiation angle of the lidar 3 in the vertical direction (that is, the elevation angle of the laser light) so that, even when the vehicle 5 passes near a slope, the lidar detection range RL includes a point on the road separated from the vehicle 5 by a predetermined distance (also referred to as the “detection target distance Dt”).
  • The detection target distance Dt is, for example, the maximum distance at which the lidar 3 can detect an object in the traveling direction of the vehicle 5, and is set to, for example, 100 m.
  • The detection target distance Dt may be longer or shorter than the maximum distance at which the lidar 3 can detect an object. Further, the detection target distance Dt may be a predetermined fixed value or may be varied according to the behavior of the vehicle 5.
  • the control unit 16 functions as an acquisition unit, a first acquisition unit, a second acquisition unit, a control unit, a computer that executes a program, and the like.
  • FIG. 3 shows a schematic configuration example of the lidar 3.
  • the lidar 3 is a lidar of a TOF (Time Of Flight) system, and mainly includes an optical transmission / reception unit 18 and a signal processing unit 19 as shown in FIG.
  • The optical transmission/reception unit 18 mainly includes a synchronization control unit 21, an LD driver 22, a laser diode 23, a drive driver 25, a light receiving element 26, a current/voltage conversion circuit (transimpedance amplifier) 27, an A/D converter 28, a segmentor 29, and a crystal oscillator 30.
  • The crystal oscillator 30 outputs a pulse-like clock signal “S1” to the synchronization control unit 21 and the A/D converter 28.
  • the synchronization control unit 21 outputs a pulse-like trigger signal “S2” to the LD driver 22.
  • the synchronization control unit 21 also outputs a segment extraction signal “S3”, which determines the timing at which the segmenter 29 described later extracts the output of the A / D converter 28, to the segmentator 29.
  • the LD driver 22 supplies a pulse current to the laser diode 23 in synchronization with the trigger signal S2 input from the synchronization control unit 21.
  • the laser diode 23 is, for example, an infrared pulse laser, and emits a light pulse based on the pulse current supplied from the LD driver 22.
  • The scanning unit L is configured as a scanner including, for example, transmission and reception optical systems; it scans the light pulses emitted from the laser diode 23 over a predetermined range of horizontal and vertical angles, and guides the return light reflected by objects irradiated with the emitted light pulses to the light receiving element 26.
  • the scanning unit L emits a light pulse for each segment obtained by dividing the above-mentioned horizontal angle by an equal angle.
  • the scanning unit L adjusts the emission angle or the like of the light pulse based on the signal supplied from the drive driver 25.
  • the scanning unit L may be a mirror driven by a motor or a mirror of an electrostatic drive system.
  • the scanning unit L functions as an irradiation unit that irradiates an electromagnetic wave.
  • the light receiving element 26 is, for example, an avalanche photodiode, and generates a weak current according to the light amount of the reflected light from the object guided by the scanning unit L.
  • the light receiving element 26 supplies the generated weak current to the current voltage conversion circuit 27.
  • the light receiving element 26 functions as a receiving unit that receives the reflected electromagnetic wave.
  • the current-voltage conversion circuit 27 amplifies the weak current supplied from the light receiving element 26 and converts it into a voltage signal, and inputs the converted voltage signal to the A / D converter 28.
  • the A / D converter 28 converts the voltage signal supplied from the current voltage conversion circuit 27 into a digital signal based on the clock signal S1 supplied from the crystal oscillator 30, and supplies the converted digital signal to the segmentor 29.
  • The segmentor 29 generates, as a signal related to the received light intensity for each segment (also referred to as the “segment signal Sseg”), the digital signal output from the A/D converter 28 during the period in which the segment extraction signal S3 is asserted.
  • the segmentor 29 supplies the generated segment signal Sseg to the signal processor 19.
  • The signal processing unit 19 generates point cloud information indicating the distance and angle of objects based on the segment signal Sseg transmitted from each of the light transmitting/receiving units TR. Specifically, the signal processing unit 19 detects peaks in the waveform of the segment signal Sseg and estimates the amplitude and delay time corresponding to each detected peak. Then, among the peaks of the waveform indicated by the segment signal Sseg, the signal processing unit 19 generates, as the information of each point constituting the point cloud information, the distance information corresponding to the delay time of each peak whose estimated amplitude is equal to or greater than a predetermined threshold.
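  • As a rough illustration of the point generation described above, the following sketch converts a detected echo peak into a range using the time-of-flight relation; the function name and the amplitude-threshold handling are hypothetical and only mirror the processing described in the text, not the API of an actual lidar.

```python
from typing import Optional

C = 299_792_458.0  # speed of light [m/s]

def peak_to_range(delay_s: float, amplitude: float, threshold: float) -> Optional[float]:
    """Convert an echo peak into a one-way distance: peaks whose estimated
    amplitude is below the threshold are discarded, and the round-trip delay
    time is halved to obtain the distance to the reflecting object."""
    if amplitude < threshold:
        return None  # too weak to be used as a point of the point cloud
    return C * delay_s / 2.0

# Example: a 0.6 microsecond round-trip delay corresponds to roughly 90 m.
print(peak_to_range(0.6e-6, amplitude=1.2, threshold=0.5))
```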
  • In the following, the difference in altitude between the current position and the detection target point Pt is also referred to as the “height difference H”.
  • The central angle in the vertical direction of the angle range over which the lidar 3 scans the pulsed laser is also referred to as the “central irradiation angle θL”.
  • The central irradiation angle θL is not an absolute angle based on the horizontal plane but a relative angle based on the vehicle 5 (with the traveling direction of the vehicle 5 taken as 0°).
  • The “standard angle” refers to the normal central irradiation angle θL that is set when traveling on a flat road with no slope around the current position.
  • FIGS. 4A to 4C are cross-sectional views of the road taken along the traveling direction of the vehicle 5 when the vehicle 5 travels in the vicinity of an uphill of gradient “θB”.
  • FIG. 4A shows a state in which the uphill ahead is separated from the vehicle by the detection target distance Dt or more,
  • FIG. 4B shows a state in which the uphill ahead exists within the detection target distance Dt, and
  • FIG. 4C shows a state immediately before the vehicle 5 enters the uphill.
  • Here, the detection target point Pt is determined by regarding the detection target distance Dt as a horizontal distance.
  • The solid line “LM” indicates the center line in the vertical direction of the irradiation range of the lidar 3, and the solid lines “LT” and “LL” indicate the boundary lines in the vertical direction of the irradiation range of the lidar 3.
  • The broken line “J” indicates the latitude and longitude corresponding to the current position of the vehicle 5 estimated by the control device 1, and the broken line “S” indicates the latitude and longitude corresponding to the detection target point Pt estimated by the control device 1.
  • First, the control device 1 calculates the height difference H between the current position and the detection target point Pt. For example, the control device 1 specifies the height of the current position by extracting, from the road data of the map DB 20, the height information of the point on the road corresponding to the current position estimated based on the output of the sensor group. Similarly, the control device 1 specifies the height of the detection target point Pt by extracting, from the road data of the map DB 20, the height information of the point on the road separated from the current position by the detection target distance Dt, measured as a horizontal distance in the traveling direction of the vehicle 5.
  • Note that the detection target distance Dt may instead be a distance along the route. Then, the control device 1 calculates the height difference H by subtracting the height of the current position from the height of the detection target point Pt.
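  • A minimal sketch of this height-difference calculation, assuming a hypothetical elevation_at(lat, lon) lookup into the road data of the map DB 20 and a simple flat-earth projection of the point located the detection target distance Dt ahead along the traveling direction; neither helper is defined in the patent text.

```python
import math

DT = 100.0  # detection target distance Dt [m], example value from the text

def project_ahead(lat: float, lon: float, heading_deg: float, dist_m: float):
    """Approximate latitude/longitude of the point dist_m ahead of the current
    position along the heading (small-distance flat-earth model)."""
    r_earth = 6_378_137.0
    d_lat = dist_m * math.cos(math.radians(heading_deg)) / r_earth
    d_lon = dist_m * math.sin(math.radians(heading_deg)) / (r_earth * math.cos(math.radians(lat)))
    return lat + math.degrees(d_lat), lon + math.degrees(d_lon)

def height_difference(elevation_at, lat, lon, heading_deg, dist_m=DT):
    """Height difference H = elevation(detection target point Pt) - elevation(current position)."""
    pt_lat, pt_lon = project_ahead(lat, lon, heading_deg, dist_m)
    return elevation_at(pt_lat, pt_lon) - elevation_at(lat, lon)
```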
  • In the example of FIG. 4A, the start point of the slope existing ahead of the vehicle 5 (a change point of the gradient or altitude, hereinafter also referred to as the “slope start point Ps”) is separated from the current position by the detection target distance Dt or more, and the height difference H is zero. Therefore, in this case, the control device 1 determines that it is not necessary to change the lidar detection range RL, and sets the central irradiation angle θL to a predetermined standard angle (here, 0° as an example). Then, in the example of FIG. 4A, the preceding vehicle existing near the detection target point Pt is included in the lidar detection range RL, and the control device 1 can detect the presence of the preceding vehicle based on the output of the lidar 3.
  • In the example of FIG. 4B, the slope start point Ps exists between the current position of the vehicle 5 and the detection target point Pt, and the detection target point Pt exists on the slope.
  • In this case, the height difference H calculated by the control device 1 is a positive value larger than zero. Therefore, the control device 1 sets the central irradiation angle θL according to the calculated height difference H.
  • For example, the control device 1 stores in advance a formula or map that defines the central irradiation angle θL suitable for each height difference H, and sets the central irradiation angle θL by referring to that formula or map.
  • In another example, the control device 1 calculates the gradient angle between the current position and the detection target point Pt (that is, tan⁻¹(H/Dt)) and increases the central irradiation angle θL by this gradient angle.
  • In either case, the central irradiation angle θL increases as the vehicle 5 approaches the slope start point Ps.
  • In the example of FIG. 4B, the central irradiation angle θL is an angle “θ1” (0 < θ1 < θB) smaller than the gradient θB of the slope, and in the example of FIG. 4C, where the vehicle 5 is just before the slope start point Ps, the central irradiation angle θL is approximately equal to the gradient θB of the slope.
  • In both of the examples of FIGS. 4B and 4C, the preceding vehicle present at the detection target point Pt on the slope is suitably included in the lidar detection range RL.
  • In this way, the control device 1 increases the central irradiation angle θL as the height difference H increases, so that an object at the detection target point Pt, which is at a higher altitude than the current position, can be suitably included in the lidar detection range RL.
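  • The angle update described for FIGS. 4B and 4C can be sketched as follows. It simply adds the gradient angle tan⁻¹(H/Dt) to the standard angle, which is one of the example methods named above (the pre-stored formula/map variant is not shown); the default values are illustrative.

```python
import math

def central_irradiation_angle(height_diff_h: float, dt: float = 100.0,
                              standard_angle_deg: float = 0.0) -> float:
    """Central irradiation angle thetaL [deg]: the standard angle plus the
    gradient angle between the current position and the detection target
    point Pt. A positive H (uphill ahead) raises the angle; a negative H
    (downhill ahead) lowers it."""
    gradient_deg = math.degrees(math.atan2(height_diff_h, dt))
    return standard_angle_deg + gradient_deg

# Example: a 5 m rise over Dt = 100 m yields an angle of about +2.9 degrees.
print(central_irradiation_angle(5.0))
```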
  • FIGS. 5A and 5B show cross-sectional views of the road taken along the traveling direction of the vehicle 5 when the vehicle 5 travels in the vicinity of a downhill of gradient “−θB”.
  • In the example of FIG. 5A, the slope start point Ps of the downhill of gradient −θB exists between the current position of the vehicle 5 and the detection target point Pt, and the detection target point Pt exists on the slope.
  • In the example of FIG. 5B, the vehicle 5 is present at a point immediately before the slope start point Ps.
  • In these cases, the height difference H calculated by the control device 1 is a negative value smaller than zero.
  • The control device 1 calculates the height difference H and then sets the central irradiation angle θL based on the calculated height difference H. For example, using the height difference H and the detection target distance Dt, the control device 1 calculates the gradient angle (here, a negative value) between the two points of the current position and the detection target point Pt, and adds this gradient angle to the central irradiation angle θL. As a result, as the vehicle 5 approaches the slope start point Ps, the central irradiation angle θL becomes smaller (that is, its absolute value as a negative value becomes larger) and approaches the gradient −θB of the slope.
  • In the example of FIG. 5A, the central irradiation angle θL is an angle “θ2” (−θB < θ2 < 0), and in the example of FIG. 5B, where the vehicle 5 is immediately before the slope start point Ps, the central irradiation angle θL is approximately equal to the angle −θB. In both of the examples of FIGS. 5A and 5B, the preceding vehicle present at the detection target point Pt on the slope is suitably included in the lidar detection range RL.
  • Note that the control device 1 may maintain the central irradiation angle θL at the standard angle, without changing it, when the absolute value of the height difference H is equal to or less than a predetermined value.
  • The above-mentioned predetermined value is set, in consideration of, for example, the vertical scanning angle range of the lidar 3 (that is, the vertical viewing angle), to a height difference at which an object at the detection target point Pt is still included in the lidar detection range RL even if the central irradiation angle θL is maintained at the standard angle.
  • FIG. 5C shows a cross-sectional view of the road when the central irradiation angle θL is temporarily maintained at the standard angle while approaching an uphill.
  • The control device 1 may also set the central irradiation angle θL in consideration of the attitude of the vehicle 5 at the current position, in addition to the height difference H. For example, when the road corresponding to the current position has a gradient, the lidar 3 is inclined by that gradient, so the control device 1 corrects the central irradiation angle θL by that gradient. In this case, the control device 1 detects, for example, the inclination of the vehicle 5 in the traveling direction based on the output of the attitude sensor 4 and corrects the central irradiation angle θL by the detected inclination. In another example, the control device 1 refers to the map DB 20 to acquire gradient information of the road at the current position and corrects the central irradiation angle θL by the gradient indicated by the acquired gradient information.
  • The control device 1 may further set the central irradiation angle θL in consideration of the road gradient at the detection target point Pt, as in the following sketch. For example, when the downward road gradient indicated by the gradient information at the detection target point Pt acquired from the map DB 20 is equal to or greater than a predetermined angle (that is, when the detection target point Pt exists on a steep downward slope), the control device 1 determines that the laser light would be blocked by the road before the slope start point Ps if the central irradiation angle θL were set based on the height difference H, and instead sets the central irradiation angle θL to an angle at which the laser light is not blocked by the road (for example, the standard angle). Similarly, when the upward road gradient indicated by the gradient information at the detection target point Pt acquired from the map DB 20 is equal to or greater than a predetermined angle, the control device 1 sets the central irradiation angle θL not based on the height difference H but to an angle (for example, the standard angle) at which the lidar detection range RL can detect a preceding vehicle or the like near the slope start point Ps.
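  • The refinements above (keeping the standard angle for small |H|, compensating for the pitch of the vehicle itself, and falling back to the standard angle when the detection target point Pt lies on a steep grade) might be combined as in the following sketch; the thresholds are illustrative placeholders, not values taken from the patent.

```python
import math

def adjusted_angle(height_diff_h: float, vehicle_pitch_deg: float,
                   target_grade_deg: float, dt: float = 100.0,
                   standard_angle_deg: float = 0.0,
                   small_h_m: float = 1.0, steep_grade_deg: float = 15.0) -> float:
    """Set thetaL from the height difference H, then apply the corrections
    discussed in the text."""
    # Steep grade at Pt: fall back to the standard angle so that the beam is
    # still usable near the slope start point Ps.
    if abs(target_grade_deg) >= steep_grade_deg:
        return standard_angle_deg
    # Small height difference: keep the standard angle, since Pt stays inside
    # the vertical viewing angle of the lidar.
    if abs(height_diff_h) <= small_h_m:
        angle = standard_angle_deg
    else:
        angle = standard_angle_deg + math.degrees(math.atan2(height_diff_h, dt))
    # Compensate for the pitch of the vehicle itself (attitude sensor or map
    # gradient at the current position); thetaL is relative to the vehicle.
    return angle - vehicle_pitch_deg
```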
  • When adjusting the central irradiation angle θL, the control device 1 may perform mechanical (physical) adjustment of the vertical orientation of the lidar 3 using the angle adjustment mechanism provided in the lidar 3, or may perform electronic adjustment that changes the actual scanning range described later.
  • In the mechanical adjustment, the central irradiation angle θL is adjusted by physically moving the entire lidar 3 with respect to the vehicle 5 by a mechanical adjustment unit configured by, for example, a motor (not shown).
  • A specific example of the above-mentioned electronic adjustment will be described with reference to FIG. 6.
  • FIG. 6A shows the scannable range “SR” of the lidar 3 and the actual scanning range “FOV” when the central irradiation angle θL is the standard angle, on a virtual irradiation plane perpendicular to the emission direction of the lidar 3.
  • the scannable range SR refers to a range in which scanning with a pulse laser is possible
  • the actual scan range FOV refers to a range in which scanning with a pulse laser is actually performed.
  • the arrows in the actual scanning range FOV indicate an example of the scanning direction.
  • The actual scanning range FOV is set to a range smaller than the scannable range SR, and the lidar 3 can adjust the position of the actual scanning range FOV within the scannable range SR by changing parameters of the internal signals of the optical transmission/reception unit 18 in accordance with a control signal from the control device 1.
  • The upper side and the lower side of the actual scanning range FOV correspond to the boundary lines LT and LL shown in FIGS. 4 and 5, and the center line of the actual scanning range FOV along its long-side direction corresponds to the center line LM shown in FIGS. 4 and 5.
  • FIG. 6B shows the correspondence between the scannable range SR and the actual scanning range FOV when the actual scanning range FOV is moved upward within the scannable range SR based on the control signal supplied from the control device 1. In this case, as the actual scanning range FOV moves upward within the scannable range SR, the central irradiation angle θL is increased from the standard angle by a predetermined value, and the lidar detection range RL moves upward compared with the case of FIG. 6A.
  • FIG. 6C shows the correspondence between the scannable range SR and the actual scanning range FOV when the actual scanning range FOV is moved downward within the scannable range SR based on the control signal supplied from the control device 1.
  • In this case, the central irradiation angle θL is decreased from the standard angle by a predetermined value, and the lidar detection range RL moves downward compared with the case of FIG. 6A.
  • In both FIGS. 6B and 6C, the actual scanning range FOV does not move in the lateral direction, so the direction of the lidar detection range RL with respect to the vehicle 5 is not changed in the horizontal direction.
  • In this way, the control device 1 can suitably adjust the central irradiation angle θL by the electronic adjustment that moves the actual scanning range FOV up and down within the scannable range SR. Accordingly, the control device 1 can suitably adjust the central irradiation angle θL of the lidar 3 by executing at least one of the mechanical adjustment and the electronic adjustment with respect to the lidar 3. For example, when it is determined that the central irradiation angle θL cannot be set to the target angle by the above-described electronic adjustment alone, the control device 1 performs the mechanical adjustment of the lidar 3 using the angle adjustment mechanism. In this case, the control device 1 may store in advance information on the angular range of the central irradiation angle θL that can be adjusted by the electronic adjustment. Further, in the present embodiment, the center line LM (the central irradiation angle θL) is adjusted when adjusting the actual scanning range FOV, but the boundary lines LT and LL may instead be adjusted.
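  • A sketch of the choice between the electronic adjustment (shifting the actual scanning range FOV inside the scannable range SR) and the mechanical adjustment (tilting the whole unit); the electronically adjustable limit and the two callback functions are hypothetical stand-ins for the control signals described above.

```python
def set_central_angle(target_deg: float, standard_deg: float,
                      electronic_limit_deg: float,
                      shift_fov, tilt_unit) -> None:
    """Apply as much of the requested angle change as possible electronically
    (moving the FOV up or down within SR) and hand any remainder to the
    mechanical angle adjustment mechanism."""
    offset = target_deg - standard_deg
    electronic = max(-electronic_limit_deg, min(electronic_limit_deg, offset))
    shift_fov(electronic)            # move the FOV within SR (no lateral shift)
    remainder = offset - electronic
    if abs(remainder) > 1e-6:
        tilt_unit(remainder)         # physically tilt the lidar for the rest
```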
  • FIG. 7 is a flow chart showing a processing procedure regarding control of the lidar detection range RL of the control device 1 in the first embodiment.
  • the control device 1 repeatedly executes the process of the flowchart of FIG. 7.
  • First, the control device 1 calculates the height difference H between the estimated current position and the detection target point Pt by referring to the map DB 20 (step S101).
  • For example, the control device 1 refers to the map DB 20, extracts the height of the point on the road corresponding to the estimated current position and the height of the point ahead of it by the detection target distance Dt, and calculates the height difference H from these heights.
  • Next, the control device 1 adjusts the central irradiation angle θL based on the height difference H calculated in step S101 (step S102).
  • For example, the control device 1 sets the central irradiation angle θL larger as the height difference H is larger.
  • Thereby, the control device 1 can detect a preceding vehicle or the like near the detection target point Pt based on the output of the lidar 3 even when the detection target point Pt is present on a slope.
  • Note that, as described above, the control device 1 may correct the central irradiation angle θL by the detected inclination of the vehicle 5.
  • As described above, the control device 1 according to the first embodiment controls the lidar 3, which is capable of detecting objects present around the vehicle 5, and acquires information (first information) on the height difference H between the current position of the vehicle 5 and the detection target point Pt separated from the current position by the detection target distance Dt.
  • Based on this information, the control device 1 controls the lidar 3 to change the direction of the lidar detection range RL of the lidar 3 with respect to the vehicle 5.
  • Thereby, the control device 1 can suitably detect an object present in the vicinity of the detection target point Pt based on the output of the lidar 3 even when the detection target point Pt exists on a slope.
  • In the second embodiment, when the vehicle approaches within a predetermined distance (also referred to as the “threshold distance Lth”) of the slope start point Ps of a slope existing in the traveling direction, the control device 1 sets the central irradiation angle θL based on the gradient θB of that slope.
  • FIGS. 8A to 8C show cross-sectional views of the road on which the vehicle 5 travels when the vehicle 5 travels near an uphill of gradient θB.
  • FIG. 8A shows a state where the slope start point Ps is farther than the threshold distance Lth from the current position of the vehicle 5,
  • FIG. 8B shows a state where the slope start point Ps is within the threshold distance Lth from the current position of the vehicle 5, and
  • FIG. 8C shows a state after the vehicle 5 has passed the slope start point Ps of the uphill.
  • the control device 1 determines the presence or absence of the slope start point Ps within the threshold distance Lth by acquiring, for example, gradient information or altitude information of the traveling road of the vehicle 5 from the map DB 20.
  • the threshold distance Lth is set, for example, to be shorter than the detection target distance Dt used in the first embodiment.
  • For example, the control device 1 calculates the gradient of each link based on the height difference between the start point and the end point of the link and the horizontal distance between them derived from latitude and longitude, and regards the start point of a link whose gradient has an absolute value equal to or greater than a predetermined angle as the slope start point Ps.
  • In another example, when the map DB 20 contains link data including gradient information for each road section having a different gradient, the control device 1 regards the start point of a link whose gradient information indicates a gradient with an absolute value equal to or greater than a predetermined angle as the slope start point Ps. A sketch of this link-based search is shown below.
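  • The link-based search for the slope start point Ps might look like the following sketch, assuming that each link carries start/end elevations, a horizontal length derived from its node coordinates, and its distance from the current position; this data layout is an assumption, not the structure of the map DB 20.

```python
from dataclasses import dataclass
import math

@dataclass
class Link:
    start_elev_m: float    # elevation at the link start node [m]
    end_elev_m: float      # elevation at the link end node [m]
    length_m: float        # horizontal length of the link [m]
    start_offset_m: float  # distance from the current position to the link start [m]

def slope_start_point(links, min_grade_deg: float = 3.0, lth_m: float = 50.0):
    """Return the distance to the first link start whose gradient magnitude is
    at least min_grade_deg and which lies within the threshold distance Lth,
    or None if no slope start point Ps is found ahead."""
    for link in links:
        grade_deg = math.degrees(math.atan2(link.end_elev_m - link.start_elev_m,
                                            link.length_m))
        if abs(grade_deg) >= min_grade_deg and link.start_offset_m <= lth_m:
            return link.start_offset_m
    return None
```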
  • In the example of FIG. 8A, the control device 1 determines that it is not necessary to adjust the central irradiation angle θL, and maintains the central irradiation angle θL at the standard angle.
  • In the example of FIG. 8B, the control device 1 determines that the slope start point Ps is within the threshold distance Lth from the current position, and determines the central irradiation angle θL based on the gradient θB of the uphill. Specifically, the control device 1 increases the central irradiation angle θL from the standard angle (here, 0 degrees) by the angle θB.
  • Thereby, the center line LM passing through the center of the lidar detection range RL becomes parallel to the road surface of the slope, so that a preceding vehicle or the like on the slope can be included in the lidar detection range RL. Therefore, in this case, the control device 1 can suitably detect, based on the output of the lidar 3, a preceding vehicle or the like existing on the slope that is about to be traveled.
  • In the example of FIG. 8C, the control device 1 recognizes that the vehicle has passed the slope start point Ps based on, for example, the position information of the slope start point Ps and the current position information of the vehicle 5 estimated from the output of the sensor group. In this case, the control device 1 returns the central irradiation angle θL to the standard angle (here, 0 degrees).
  • In other words, after passing the slope start point Ps, the control device 1 returns the central irradiation angle θL from the angle θB to the standard angle, so that the center line LM passing through the center of the irradiation range of the lidar 3 continues to be parallel to the road surface of the slope. Thereby, the control device 1 can include a preceding vehicle or the like on the slope in the lidar detection range RL and appropriately detect it based on the output of the lidar 3.
  • FIGS. 9A and 9B show cross-sectional views of the road taken along the traveling direction of the vehicle when the vehicle travels in the vicinity of a downhill of gradient −θB.
  • FIG. 9A shows a state where the current position of the vehicle 5 is within the threshold distance Lth from the slope start point Ps of the downhill, and
  • FIG. 9B shows a state after the vehicle 5 has passed the slope start point Ps of the downhill.
  • In the example of FIG. 9A, the control device 1 determines that the slope start point Ps of the downhill to be passed next is within the threshold distance Lth, and determines the central irradiation angle θL based on the gradient −θB of the downhill. Specifically, the control device 1 decreases the central irradiation angle θL from the standard angle by the angle θB. In this case, since the center line LM passing through the center of the lidar detection range RL is parallel to the road surface of the slope, a preceding vehicle or the like on the downhill can be included in the lidar detection range RL. Therefore, based on the output of the lidar 3, the control device 1 can suitably detect a preceding vehicle or the like present on the downhill to be traveled.
  • In the example of FIG. 9B, the control device 1 recognizes that the vehicle has passed the slope start point Ps based on, for example, the position information of the slope start point Ps and the current position information of the vehicle 5 estimated from the output of the sensor group. In this case, the control device 1 returns the central irradiation angle θL to the standard angle. Thereby, the control device 1 can make the center line LM passing through the center of the irradiation range of the lidar 3 parallel to the road surface of the slope and include a preceding vehicle or the like on the slope in the lidar detection range RL.
  • FIG. 10 is a flowchart showing the processing procedure for controlling the lidar detection range RL of the control device 1 in the second embodiment.
  • The control device 1 repeatedly executes the process of the flowchart of FIG. 10.
  • First, the control device 1 determines whether the vehicle 5 has approached a slope (step S201). Specifically, the control device 1 refers to the map DB 20 and determines whether or not a slope start point Ps exists within the threshold distance Lth from the current position estimated based on the output of the sensor group or the like. When the control device 1 determines that the vehicle has not approached a slope (step S201; No), that is, when no slope start point Ps exists within the threshold distance Lth from the current position, it sets the central irradiation angle θL to the standard angle (step S202).
  • On the other hand, when the control device 1 determines that the vehicle has approached a slope, it acquires information on the gradient θB of the approaching slope from the map DB 20 and sets the central irradiation angle θL based on the gradient θB (step S203).
  • For example, the control device 1 adds the angle θB (a negative value in the case of a downhill) to the standard central irradiation angle θL.
  • In this case, the control device 1 may further correct the central irradiation angle θL based on the attitude of the vehicle 5 at the current position (that is, the road gradient at the current position).
  • Next, the control device 1 determines whether the vehicle 5 has passed the slope start point Ps (step S204). When the vehicle 5 has not passed the slope start point Ps (step S204; No), the control device 1 continues to execute step S203 and sets the central irradiation angle θL to the value based on the gradient θB.
  • When the control device 1 determines that the vehicle 5 has passed the slope start point Ps (step S204; Yes), it returns the central irradiation angle θL to the standard angle (step S205).
  • Thereby, the control device 1 can suitably include a preceding vehicle or the like on the slope in the lidar detection range RL.
  • The direction of the lidar detection range RL based on the central irradiation angle θL set in steps S202 and S205 functions as a first direction, and the direction of the lidar detection range RL based on the central irradiation angle θL set in step S203 functions as a second direction.
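  • The per-cycle procedure of FIG. 10 (steps S201 to S205) can be summarised as below; slope_ahead() is assumed to return the gradient θB of a slope whose start point Ps lies within the threshold distance Lth (or None), and passed_ps() to report that the start point has already been passed. Both are hypothetical helpers built, for example, on the link search sketched earlier.

```python
def update_central_angle(slope_ahead, passed_ps, standard_deg: float = 0.0) -> float:
    """One control cycle of the second embodiment: standard angle on a flat
    road (S202) or after the slope start point has been passed (S205), and a
    gradient-based angle while approaching the slope (S203)."""
    grade_deg = slope_ahead()            # thetaB of the approaching slope, or None
    if grade_deg is None or passed_ps():
        return standard_deg              # S202 / S205: keep or restore the standard angle
    return standard_deg + grade_deg      # S203: add thetaB (negative for a downhill)
```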
  • As described above, the control device 1 according to the second embodiment controls the lidar 3, which is capable of detecting objects present around the vehicle 5, and acquires from the map DB 20 gradient information (first information) of a slope existing in a direction in which the vehicle 5 may travel.
  • Based on the acquired gradient information, the control device 1 controls the lidar 3 to change the direction of the lidar detection range RL of the lidar 3 with respect to the vehicle 5.
  • Thereby, the control device 1 can accurately detect objects based on the output of the lidar 3 even when approaching a slope.
  • In the third embodiment, the control device 1 sets the central irradiation angle θL based on the gradient derived from the height difference between the current position and the detection target point Pt (also referred to as the “point-to-point gradient θSJ”) and the gradient at the current position (also referred to as the “vehicle position gradient θJ”).
  • the same components of the measurement system as those of the first and second embodiments are given the same reference numerals as appropriate, and the description thereof will be omitted.
  • FIG. 11 is a cross-sectional view of a road schematically showing the method of determining the central irradiation angle θL in the third embodiment.
  • In FIG. 11, “h” denotes the mounting height of the lidar 3 from the road surface, “Ht” denotes the target height at which an object is to be detected at the detection target point Pt, and “θS” denotes the gradient at the detection target point Pt.
  • In the example of FIG. 11, a slope of gradient θB exists ahead of the vehicle 5, and a preceding vehicle exists near the detection target point Pt on the road beyond that slope.
  • The control device 1 obtains the point-to-point gradient θSJ and the vehicle position gradient θJ, and calculates the central irradiation angle θL based on the following equation (1).
  • The control device 1 geometrically calculates the point-to-point gradient θSJ used in equation (1) from the height difference H and the detection target distance Dt, further taking into account the mounting height h and the target height Ht, based on the following equation (2).
  • By using the mounting height h and the target height Ht, the control device 1 can determine the central irradiation angle θL more accurately.
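  • The bodies of equations (1) and (2) are not reproduced in this extract. From the surrounding description (θL obtained by subtracting the vehicle position gradient from the point-to-point gradient, and θSJ obtained geometrically from H, Dt, h and Ht), a plausible reconstruction is the following; it is offered as an assumption, not as the patent's literal formulas.

```latex
\theta_L = \theta_{SJ} - \theta_J \qquad (1)

\theta_{SJ} = \tan^{-1}\!\left(\frac{H + H_t - h}{D_t}\right) \qquad (2)
```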
  • As in the first embodiment, the control device 1 calculates the height difference H by extracting the height information of the current position and the height information at the detection target point Pt from the map DB 20. Information on the mounting height h and the target height Ht is, for example, stored in the storage unit 12 in advance.
  • The control device 1 acquires the vehicle position gradient θJ based on, for example, the road gradient information at the current position included in the map DB 20 or the output of the attitude sensor 4.
  • In the example of FIG. 11, the vehicle 5 is on a flat road, so the vehicle position gradient θJ is 0 degrees. Therefore, in this case, the control device 1 sets the central irradiation angle θL equal to the point-to-point gradient θSJ based on equation (1).
  • Thereby, the control device 1 can suitably include the preceding vehicle present in the vicinity of the detection target point Pt in the lidar detection range RL.
  • FIG. 12 shows a cross-sectional view of the road when the vehicle 5 enters the slope from the state of FIG. 11.
  • In this case, the control device 1 detects, based on the map DB 20 or the output of the attitude sensor 4, that the vehicle position gradient θJ is equal to the gradient θB. Further, as in the case of FIG. 11, the control device 1 calculates the point-to-point gradient θSJ from the height difference H, the detection target distance Dt, the mounting height h, and the target height Ht using equation (2). Then, based on equation (1), the control device 1 sets the central irradiation angle θL to the value obtained by subtracting the vehicle position gradient θJ (that is, the gradient θB) from the point-to-point gradient θSJ.
  • Note that the control device 1 may geometrically calculate the mounting height h, taking the inclination of the vehicle 5 into account, from the mounting height on a flat road by using the gradient θB of the slope. Similarly, when the detection target point Pt exists on a slope, the control device 1 may geometrically calculate the target height Ht, taking the road gradient at the detection target point Pt into account, from the target height on a flat road stored in the storage unit 12 by using the gradient θB.
  • FIG. 13 shows a cross-sectional view of a road when the vehicle 5 is traveling downhill.
  • In the example of FIG. 13, the vehicle position gradient θJ is equal to the gradient −θB.
  • In this case, the control device 1 detects, based on the map DB 20 or the output of the attitude sensor 4, that the vehicle position gradient θJ is equal to the gradient −θB. Further, as in the cases of FIGS. 11 and 12, the control device 1 calculates the point-to-point gradient θSJ from the height difference H (here, a negative value), the detection target distance Dt, the mounting height h, and the target height Ht using equation (2). Then, based on equation (1), the control device 1 sets the central irradiation angle θL to the value obtained by subtracting the vehicle position gradient θJ (that is, the gradient −θB) from the point-to-point gradient θSJ.
  • Thereby, the control device 1 can suitably include the preceding vehicle present near the detection target point Pt in the lidar detection range RL.
  • FIG. 14 is a flowchart showing the processing procedure for controlling the lidar detection range RL of the control device 1 in the third embodiment.
  • The control device 1 repeatedly executes the process of the flowchart of FIG. 14.
  • First, the control device 1 acquires the vehicle position gradient θJ (step S301).
  • For example, the control device 1 may acquire the vehicle position gradient θJ by referring to the gradient information of the road corresponding to the current position in the map DB 20, or may acquire, as the vehicle position gradient θJ, the front-rear inclination of the vehicle 5 detected from the output of the attitude sensor 4.
  • Next, the control device 1 calculates the point-to-point gradient θSJ from equation (2) (step S302).
  • Then, the control device 1 sets the central irradiation angle θL according to equation (1), based on the vehicle position gradient θJ acquired in step S301 and the point-to-point gradient θSJ calculated in step S302 (step S303). By doing this, the control device 1 can appropriately set the central irradiation angle θL so that an object in the vicinity of the detection target point Pt is included in the lidar detection range RL, without detecting whether or not a slope exists in the traveling direction, and regardless of whether the vehicle is before, on, or past the slope.
  • Note that the control device 1 may define, as the central irradiation angle θL, the value obtained by adding “θSJ − θJ” on the right side of equation (1) to the standard angle.
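  • Under the same assumed forms of equations (1) and (2) as above, one cycle of FIG. 14 (steps S301 to S303) might be sketched as follows; the mounting height and target height defaults are illustrative, and vehicle_pitch_deg stands in for the vehicle position gradient θJ obtained from the map DB 20 or the attitude sensor 4.

```python
import math

def third_embodiment_angle(height_diff_h: float, vehicle_pitch_deg: float,
                           dt: float = 100.0, mount_h_m: float = 2.0,
                           target_ht_m: float = 1.0) -> float:
    """S301: thetaJ is supplied by the caller (map gradient or attitude sensor).
    S302: point-to-point gradient thetaSJ from the assumed form of equation (2).
    S303: central irradiation angle thetaL from the assumed form of equation (1)."""
    theta_sj = math.degrees(math.atan2(height_diff_h + target_ht_m - mount_h_m, dt))
    return theta_sj - vehicle_pitch_deg
```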
  • As described above, the control device 1 according to the third embodiment controls the lidar 3, which is capable of detecting objects present around the vehicle 5, and acquires information (first information) on the point-to-point gradient θSJ based on the altitude difference between the current position of the vehicle 5 and the detection target point Pt, and information (second information) on the vehicle position gradient θJ related to the attitude of the vehicle 5.
  • Based on the acquired information on the point-to-point gradient θSJ and the vehicle position gradient θJ, the control device 1 controls the lidar 3 to change the direction of the lidar detection range RL of the lidar 3 with respect to the vehicle 5.
  • Thereby, the control device 1 can accurately detect an object present in the vicinity of the detection target point Pt based on the output of the lidar 3 even when traveling around a slope.
  • When a road section forming a mountain (crest) or a valley exists between the current position and the detection target point Pt, the control device 1 sets the central irradiation angle θL with reference to the peak of the mountain or the bottom of the valley.
  • FIG. 15A is a schematic diagram of the case where the central irradiation angle θL is determined by the method of the first embodiment on a road where a crest with peak position “Pp” exists between the current position and the detection target point Pt.
  • In the example of FIG. 15A, the control device 1 sets the central irradiation angle θL to the standard angle (here, 0°).
  • In this case, the control device 1 cannot detect the preceding vehicle 31 based on the output of the lidar 3.
  • FIG. 15B is a schematic diagram of the case where the central irradiation angle θL is determined by the method of the first embodiment on a road where a valley with valley bottom position “Pd” exists between the current position and the detection target point Pt.
  • In the example of FIG. 15B, the control device 1 sets the lidar detection range RL so as to capture an object at the detection target point Pt (here, in the horizontal direction) based on the height difference H between the current position and the detection target point Pt and the road gradient at the current position.
  • In this case, however, the laser light emitted from the lidar 3 is directed toward the detection target point Pt, which is at a higher altitude than the valley bottom position Pd, and therefore does not irradiate the preceding vehicle 32 present near the valley bottom position Pd. Therefore, since the preceding vehicle 32 is not included in the lidar detection range RL, the control device 1 cannot detect the preceding vehicle 32 based on the output of the lidar 3.
  • In this way, with the method of determining the central irradiation angle θL based on the first embodiment, an object near the peak position Pp or the valley bottom position Pd may not be included in the lidar detection range RL.
  • Therefore, when a peak position Pp or a valley bottom position Pd satisfying a predetermined condition exists between the current position and the detection target point Pt, the control device 1 determines, as exception processing, the central irradiation angle θL based on the peak position Pp or the valley bottom position Pd instead of the detection target point Pt.
  • Specifically, the control device 1 calculates the angular difference (also referred to as the “irradiation angle difference Δθxy”) between the central irradiation angle θL that would be set with the detection target point Pt as a reference and the central irradiation angle θL that would be set with the peak position Pp or the valley bottom position Pd as a reference, and performs the above exception processing when the irradiation angle difference Δθxy is equal to or greater than a predetermined angle.
  • FIG. 16A is a cross-sectional view of a road illustrating the irradiation angle difference Δθxy used for the determination to switch to the exception processing, under the same situation as FIG. 15A.
  • In this case, the control device 1 first refers to the map DB 20 and determines whether a peak position Pp exists on the route within the detection target distance Dt from the current position. For example, assuming that separate link data exist for the uphill and the downhill of a mountain, when a link whose end point is higher than its start point by a predetermined length is adjacent to a link whose start point is higher than its end point by a predetermined length, the control device 1 determines that the connecting portion (i.e., node) of these links is the peak position Pp.
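  • A minimal sketch of this link-based determination is shown below; the Link structure, its field names, and the predetermined length are illustrative assumptions and not the actual format of the map DB 20 (the check for the valley bottom position Pd described later is the mirror image, with the comparisons reversed).

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Link:
    start_alt: float  # altitude of the link start point [m]
    end_alt: float    # altitude of the link end point [m]
    end_pos: float    # distance of the link end node from the current position [m]


def find_peak_node(route_links: List[Link], min_rise: float = 1.0) -> Optional[float]:
    """Return the position of the first node judged to be a peak position Pp:
    a link whose end point is higher than its start point by at least min_rise,
    immediately followed by a link whose start point is higher than its end
    point by at least min_rise (an uphill link adjacent to a downhill link)."""
    for uphill, downhill in zip(route_links, route_links[1:]):
        if (uphill.end_alt - uphill.start_alt >= min_rise and
                downhill.start_alt - downhill.end_alt >= min_rise):
            return uphill.end_pos  # the connecting node is regarded as Pp
    return None
```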
  • After detecting the peak position Pp, the control device 1 calculates the central irradiation angle θL based on the detection target point Pt (see broken line Lx) and the central irradiation angle θL based on the peak position Pp (see broken line Ly).
  • In FIG. 16A, the light path when the laser light is emitted from the vehicle 5 at the central irradiation angle θL based on the detection target point Pt is indicated by the broken line Lx, and the light path when the laser light is emitted from the vehicle 5 at the central irradiation angle θL based on the peak position Pp is indicated by the broken line Ly.
  • Here, the broken line Lx functions as a first line, and the broken line Ly functions as a second line.
  • Specifically, the control device 1 calculates, as the height difference H described in the first embodiment, the difference between the height obtained by adding the mounting height h of the lidar 3 to the road surface altitude at the current position and the height obtained by adding the target height Ht to the road surface altitude at the detection target point Pt or the peak position Pp. Then, based on the calculated height difference H and the gradient at the current position, the control device 1 calculates the central irradiation angle θL based on the detection target point Pt and the central irradiation angle θL based on the peak position Pp.
  • The control device 1 then calculates the irradiation angle difference Δθxy between the central irradiation angle θL based on the detection target point Pt and the central irradiation angle θL based on the peak position Pp, and when the irradiation angle difference Δθxy is equal to or greater than a predetermined angle (also referred to as the "threshold angle θth"), determines the central irradiation angle θL based on the peak position Pp.
  • The above-mentioned threshold angle θth is set to, for example, half of the vertical viewing angle of the lidar 3 (that is, the angular range scanned in the vertical direction). Thereby, the control device 1 prevents an object present on the road up to the peak position Pp from being excluded from the lidar detection range RL.
  • Note that, in the calculation of the central irradiation angle θL used for determining the irradiation angle difference Δθxy, the control device 1 does not necessarily have to take the mounting height h and/or the target height Ht into account. For example, the control device 1 need not add the target height Ht to the height difference to be calculated in the calculation of the central irradiation angle θL based on the peak position Pp (see the broken line Ly). Similarly, the control device 1 need not add the target height Ht to the height difference to be calculated in the calculation of the central irradiation angle θL based on the detection target point Pt (see the broken line Lx).
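  • One possible way to compute the two candidate angles and the irradiation angle difference Δθxy is sketched below; the arctangent form and the variable names are assumptions for illustration, since the description above only states that the angles are obtained from the height difference and the gradient at the current position.

```python
import math


def candidate_angle(alt_current_m: float, alt_ref_m: float, horizontal_dist_m: float,
                    current_grade_deg: float, mount_h_m: float = 0.0,
                    target_h_m: float = 0.0) -> float:
    """Candidate central irradiation angle toward a reference point (the detection
    target point Pt or the peak position Pp), relative to the vehicle; the mounting
    height h and the target height Ht may be included or omitted, as noted above."""
    height_diff = (alt_ref_m + target_h_m) - (alt_current_m + mount_h_m)
    return math.degrees(math.atan2(height_diff, horizontal_dist_m)) - current_grade_deg


def switch_to_exception(theta_pt_deg: float, theta_pp_deg: float,
                        vertical_fov_deg: float) -> bool:
    """Use the peak (or valley) reference when the irradiation angle difference is
    at least the threshold angle, assumed here to be half the vertical viewing angle."""
    delta_xy = abs(theta_pt_deg - theta_pp_deg)
    return delta_xy >= vertical_fov_deg / 2.0
```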
  • FIG. 16B is a road cross-sectional view clearly showing the irradiation range of the lidar 3 when the irradiation angle difference Δθxy is larger than the threshold angle θth under the condition of FIG. 16A.
  • In this case, the control device 1 sets half of the vertical viewing angle "θv" of the lidar 3 as the threshold angle θth, and since the irradiation angle difference Δθxy is larger than half of the vertical viewing angle θv, the central irradiation angle θL is determined based on the peak position Pp.
  • Accordingly, the center line LM indicating the irradiation direction at the central irradiation angle θL coincides with the broken line Ly in FIG. 16A.
  • As a result, the preceding vehicle 31 present in the vicinity of the peak position Pp is included in the lidar detection range RL, and the control device 1 can suitably detect the preceding vehicle 31 based on the output of the lidar 3.
  • FIG. 17A is a cross-sectional view of a road illustrating the irradiation angle difference Δθxy used for the determination to switch to the exception processing, under the same situation as FIG. 15B.
  • In this case, the control device 1 first refers to the map DB 20 and determines whether a valley bottom position Pd exists on the route within the detection target distance Dt from the current position. For example, assuming that separate link data exist for the downhill and the uphill of a valley, when a link whose end point is lower than its start point by a predetermined length is adjacent to a link whose start point is lower than its end point by a predetermined length, the control device 1 determines that the connecting portion (i.e., node) of these links is the valley bottom position Pd.
  • The control device 1 then calculates the central irradiation angle θL based on the detection target point Pt (see broken line Lx) and the central irradiation angle θL based on the valley bottom position Pd (see broken line Ly), and thereby obtains the irradiation angle difference Δθxy.
  • In FIG. 17A, the light path when the laser light is emitted from the vehicle 5 at the central irradiation angle θL based on the detection target point Pt is indicated by the broken line Lx, and the light path when the laser light is emitted from the vehicle 5 at the central irradiation angle θL based on the valley bottom position Pd is indicated by the broken line Ly.
  • Here, the broken line Lx functions as a first line, and the broken line Ly functions as a second line.
  • In this case as well, as in the example of FIG. 16, the control device 1 calculates the height difference H taking into consideration the mounting height h of the lidar 3 and the target height Ht at the detection target point Pt or the valley bottom position Pd, and calculates each central irradiation angle θL based on the calculated height difference H, the gradient at the current position, and the like.
  • Then, when the irradiation angle difference Δθxy is equal to or greater than the threshold angle θth, the control device 1 determines that the road surface before the detection target point Pt may not be included in the lidar detection range RL, and determines the central irradiation angle θL based on the valley bottom position Pd.
  • FIG. 17B is a road cross-sectional view clearly showing the irradiation range of the lidar 3 when the irradiation angle difference Δθxy is larger than the threshold angle θth under the condition of FIG. 17A.
  • In this case, the control device 1 sets half of the vertical viewing angle "θv" of the lidar 3 as the threshold angle θth, and since the irradiation angle difference Δθxy is larger than half of the vertical viewing angle θv, the central irradiation angle θL is determined based on the valley bottom position Pd.
  • Accordingly, the center line LM indicating the irradiation direction at the central irradiation angle θL coincides with the broken line Ly in FIG. 17A.
  • As a result, the preceding vehicle 32 present in the vicinity of the valley bottom position Pd is included in the lidar detection range RL, and the control device 1 can detect the preceding vehicle 32 based on the output of the lidar 3.
  • FIG. 18 is a flowchart showing a processing procedure regarding control of the lidar detection range RL by the control device 1 in the fourth embodiment.
  • The control device 1 repeatedly executes the process of the flowchart of FIG. 18.
  • First, the control device 1 determines whether a road section forming a mountain or a valley exists before the detection target point Pt (step S401). For example, the control device 1 determines the presence or absence of a peak position Pp or a valley bottom position Pd by referring, in the map DB 20, to the altitude information of the link data corresponding to the roads along the route.
  • When the control device 1 determines that a road section forming a mountain or a valley exists before the detection target point Pt (step S401; Yes), it calculates the irradiation angle difference Δθxy between the central irradiation angle θL based on the detection target point Pt and the central irradiation angle θL based on the peak position Pp or the valley bottom position Pd (step S402). Then, when the irradiation angle difference Δθxy is equal to or greater than the threshold angle θth (step S403; Yes), the control device 1 adjusts the central irradiation angle θL based on the peak position Pp or the valley bottom position Pd (step S404). In this case, the control device 1 sets the central irradiation angle θL based on the height difference between the current position and the peak position Pp or the valley bottom position Pd, the gradient at the current position, and the like.
  • On the other hand, when the irradiation angle difference Δθxy is less than the threshold angle θth (step S403; No), or when no road section forming a peak position Pp or a valley bottom position Pd exists before the detection target point Pt (step S401; No), the control device 1 adjusts the central irradiation angle θL based on the detection target point Pt (step S405).
  • In this case, the control device 1 sets the central irradiation angle θL based on the height difference H between the current position and the detection target point Pt, the gradient at the current position, and the like.
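  • The flow of FIG. 18 can be restated as the short sketch below; the coordinate representation and the helper function are assumptions made for illustration, whereas the actual processing uses the map DB 20 and the calculations of the embodiments above.

```python
import math
from typing import Optional, Tuple

Point = Tuple[float, float]  # (horizontal distance from the vehicle [m], altitude [m])


def angle_toward(current_alt_m: float, point: Point, current_grade_deg: float) -> float:
    """Candidate central irradiation angle toward a reference point, relative to the vehicle."""
    dist, alt = point
    return math.degrees(math.atan2(alt - current_alt_m, dist)) - current_grade_deg


def update_central_irradiation_angle(current_alt_m: float, current_grade_deg: float,
                                     target_point: Point, extremum: Optional[Point],
                                     threshold_deg: float) -> float:
    """Sketch of steps S401-S405 of FIG. 18. `extremum` is the peak position Pp or
    valley bottom position Pd found before the detection target point Pt, or None
    when no such road section exists (step S401; No)."""
    theta_pt = angle_toward(current_alt_m, target_point, current_grade_deg)
    if extremum is not None:                                   # step S401; Yes
        theta_ex = angle_toward(current_alt_m, extremum, current_grade_deg)
        if abs(theta_pt - theta_ex) >= threshold_deg:          # steps S402-S403
            return theta_ex                                    # step S404: reference Pp / Pd
    return theta_pt                                            # step S405: reference Pt
```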
  • As described above, in the fourth embodiment, the control device 1 controls the lidar 3, which is capable of detecting an object present in the lidar detection range RL, and acquires information (first information) on a peak position Pp or a valley bottom position Pd existing between the current position of the vehicle 5 and the detection target point Pt, which is separated from the current position by the detection target distance Dt in the horizontal direction.
  • The control device 1 then controls the lidar 3 so as to change the direction of the lidar detection range RL of the lidar 3 with respect to the vehicle 5, based on the acquired information on the peak position Pp or the valley bottom position Pd.
  • Thereby, the control device 1 can accurately detect an object present in the vicinity of the detection target point Pt based on the output of the lidar 3.
  • In the fifth embodiment, the control device 1 sets the central irradiation angle θL based on the distance from the current position to the slope start point Ps (also referred to as the "slope reaching distance Ds"), which differs from the fourth embodiment.
  • FIGS. 19A to 19C are cross-sectional views of a road showing the relative positional relationship between the current position "P" and a slope.
  • In FIGS. 19A to 19C, a broken-line circle "CL" is a line connecting positions separated from the current position P by the detection target distance Dt, and the coordinate values (0, 0), (Dt, 0), and (Ds, 0) indicate two-dimensional coordinates in which the current position P is the origin, the horizontal direction is the X axis, and the vertical direction is the Y axis.
  • In the fifth embodiment, the control device 1 regards the intersection position between the broken-line circle CL, whose radius is the detection target distance Dt, and the route of the vehicle 5 as the detection target point Pt, and determines the central irradiation angle θL based on this detection target point Pt. Therefore, in the present embodiment, the detection target point Pt is a point separated from the current position by the detection target distance Dt as a straight-line distance.
  • When no slope exists within the detection target distance Dt, the control device 1 determines the central irradiation angle θL based on the coordinate value (Dt, 0) of the detection target point Pt.
  • In this case, the center line LM of the lidar detection range RL is the line connecting the coordinate value (0, 0) of the current position P and the coordinate value (Dt, 0), and the central irradiation angle θL is 0°.
  • On the other hand, when the route intersects a slope within the detection target distance Dt, the control device 1 calculates the above-described intersection position and determines the central irradiation angle θL so that the line connecting the intersection position and the current position P coincides with the center line LM of the lidar detection range RL. In this case, as described later, the control device 1 calculates the coordinate values of the detection target point Pt, which is the above-described intersection, based on the slope gradient θB and the slope reaching distance Ds.
  • In the example in which the current position P is immediately before the slope start point Ps, the center line LM of the irradiation range of the lidar 3 is substantially parallel to the slope, and the central irradiation angle θL at this time substantially matches the slope gradient θB.
  • In this way, when the slope start point Ps of a slope present in the traveling direction comes within the detection target distance Dt, the control device 1 calculates the coordinate values (xt, yt) of the detection target point Pt from equations (4) and (5), based on the detection target distance Dt, the slope reaching distance Ds, and the slope gradient θB. Thereby, the control device 1 can appropriately determine the central irradiation angle θL based on equation (6). In this case, the control device 1 detects the slope start point Ps based on the current position estimated from the output of the sensor group and the road data of the map DB 20, as in the second embodiment.
  • Note that the control device 1 may obtain xt and yt approximately, in order to avoid the calculation of equation (5) for finding xt becoming complicated, and thereby reduce the processing load.
  • Specifically, focusing on the fact that the gradient θB of an uphill slope in Japan is at most about 37 degrees, the control device 1 regards xt as the detection target distance Dt.
  • In this case, the following equation (7) is obtained based on equation (4).
  • The control device 1 can then suitably reduce the processing load for determining the central irradiation angle θL by determining the central irradiation angle θL based on equation (8).
  • For example, the combination of the gradient θB and the slope reaching distance Ds at which the error is maximum when the detection target distance Dt is 100 m is 37° and 50 m, and in this case the approximated central irradiation angle θL is 20.64° (the error is 2.14°).
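  • Since equations (4) to (8) are not reproduced in this text, the sketch below reconstructs the geometry from the description under explicit assumptions: the slope is modeled as a straight line of gradient θB starting at horizontal distance Ds, the detection target point Pt is its intersection with a circle of radius Dt centered on the vehicle, and the approximation simply regards xt as Dt.

```python
import math


def exact_target_point(dt_m: float, ds_m: float, slope_deg: float):
    """Intersection (x_t, y_t) of the circle x^2 + y^2 = Dt^2 with the slope
    y = (x - Ds) * tan(theta_B), x >= Ds (an assumed reading of equations (4)-(5))."""
    t = math.tan(math.radians(slope_deg))
    a = 1.0 + t * t
    b = -2.0 * ds_m * t * t
    c = ds_m * ds_m * t * t - dt_m * dt_m
    x_t = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return x_t, (x_t - ds_m) * t


def approx_central_angle(dt_m: float, ds_m: float, slope_deg: float) -> float:
    """Approximation in the spirit of equations (7)-(8): regard x_t as Dt."""
    y_t = (dt_m - ds_m) * math.tan(math.radians(slope_deg))
    return math.degrees(math.atan2(y_t, dt_m))


# Worst case cited above: Dt = 100 m, Ds = 50 m, slope gradient 37 deg.
x_t, y_t = exact_target_point(100.0, 50.0, 37.0)
exact_deg = math.degrees(math.atan2(y_t, x_t))
print(round(approx_central_angle(100.0, 50.0, 37.0), 2))  # about 20.6 deg, matching the value above
print(round(exact_deg, 2))                                # exact angle under these assumptions
```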
  • FIG. 20 is a flowchart showing a processing procedure regarding control of the lidar detection range RL by the control device 1 in the fifth embodiment.
  • The control device 1 repeatedly executes the process of the flowchart of FIG. 20.
  • First, the control device 1 determines whether the slope reaching distance Ds, which is the distance to the slope start point Ps, has become equal to or less than the detection target distance Dt (step S501). When the slope reaching distance Ds is equal to or less than the detection target distance Dt (step S501; Yes), the control device 1 executes the processing of steps S502 to S505. On the other hand, when the slope reaching distance Ds is longer than the detection target distance Dt (step S501; No), the control device 1 determines that it is not necessary to determine the central irradiation angle θL in consideration of the slope, and sets the central irradiation angle θL to the standard angle (step S506).
  • The coordinate value yt may be calculated based on, for example, the approximation described above. Then, the control device 1 sets the central irradiation angle θL based on equation (6) (step S504). As a result, the control device 1 can suitably determine the lidar detection range RL with reference to a point on the slope that is at the detection target distance Dt as a straight-line distance from the current position.
  • Next, the control device 1 determines whether the slope start point Ps has been reached (step S505). When it is determined that the slope start point Ps has been reached (step S505; Yes), the control device 1 sets the central irradiation angle θL to the standard angle (step S506). On the other hand, when the slope start point Ps has not been reached (step S505; No), the control device 1 continues to perform steps S502 to S504 and sets the central irradiation angle θL according to the slope reaching distance Ds.
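  • The flow of FIG. 20 can be condensed into the following sketch; the approximate form used for steps S502 to S504 and the treatment of an already-reached slope start point as Ds ≤ 0 are assumptions made for illustration.

```python
import math


def fifth_embodiment_angle(ds_m: float, dt_m: float, slope_deg: float,
                           standard_angle_deg: float = 0.0) -> float:
    """Sketch of FIG. 20: adjust the central irradiation angle only while the slope
    start point Ps lies ahead within the detection target distance Dt; otherwise
    (step S501; No, or step S505; Yes) fall back to the standard angle (step S506)."""
    if ds_m > dt_m or ds_m <= 0.0:
        return standard_angle_deg
    # Steps S502-S504: regard x_t as Dt and derive y_t from the slope gradient theta_B.
    y_t = (dt_m - ds_m) * math.tan(math.radians(slope_deg))
    return math.degrees(math.atan2(y_t, dt_m))
```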
  • As described above, in the fifth embodiment, the control device 1 controls the lidar 3, which is capable of detecting an object present in the periphery of the vehicle 5, and acquires, from the map DB 20, gradient information on the gradient θB of a slope existing in the direction in which the vehicle 5 may travel.
  • Then, when the slope reaching distance Ds from the current position to the slope start point Ps becomes equal to or less than the detection target distance Dt, the control device 1 controls the lidar 3 so as to change the direction of the lidar detection range RL of the lidar 3 with respect to the vehicle 5, based on the gradient information on the slope gradient θB and the distance information on the slope reaching distance Ds. Thereby, even when approaching a slope, the control device 1 can accurately detect an object on the slope based on the output of the lidar 3.
  • Note that the signal processing unit 19 of the lidar 3 may execute part or all of the processing performed by the control unit 16 of the control device 1.
  • For example, the signal processing unit 19 may execute the processing of the flowcharts of FIGS. 7, 10, 14, 18, and 20 instead of the control device 1.
  • In this case, the signal processing unit 19 acquires altitude information, gradient information, and the like regarding the roads on the route from map data stored in a storage unit of the lidar 3 (not shown) or in another device, executes the processing of each flowchart to determine the central irradiation angle θL, and changes the internal parameters of the light transmitting/receiving unit 18 so that the determined central irradiation angle θL is realized.
  • In this modification, the signal processing unit 19 functions as the control device and the computer in the present invention.
  • A plurality of lidars 3 may be provided on the vehicle 5.
  • FIG. 21 shows a configuration example of a measurement system according to this modification.
  • In the example of FIG. 21, the lidar 3, which sets the area ahead of the vehicle 5 as its lidar detection range RL, and the lidar 3a, which sets the area behind the vehicle 5 as its lidar detection range RL, are provided at the same height as the headlights.
  • In this case, the control device 1 controls the lidar detection range RL of the lidar 3 based on any of the above-described first to fifth embodiments. Further, the control device 1 identifies the detection target point "Pta" located behind the current position by the detection target distance "Dta" to be detected by the lidar 3a, and determines the central irradiation angle θL of the lidar 3a by the same algorithm as in the first to fifth embodiments, based on the difference in altitude between the current position and the detection target point Pta and the road gradient at the current position.
  • By doing this, the control device 1 can also suitably control the lidar detection range RL of the lidar 3a.
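  • As a small illustration of this modification, the same height-difference-based determination could simply be applied twice, once for the front lidar 3 and once for the rear lidar 3a; the helper function and the numeric values below are hypothetical.

```python
import math


def determine_central_angle(height_diff_m: float, distance_m: float) -> float:
    """Hypothetical stand-in for the angle determination of the embodiments above,
    based on the height difference to the detection target point and its distance."""
    return math.degrees(math.atan2(height_diff_m, distance_m))


# Front lidar 3: detection target point Pt at the detection target distance Dt ahead.
theta_front = determine_central_angle(height_diff_m=5.0, distance_m=100.0)
# Rear lidar 3a: detection target point Pta at the detection target distance Dta behind.
theta_rear = determine_central_angle(height_diff_m=-2.0, distance_m=60.0)
print(theta_front, theta_rear)
```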
  • The control device 1 may receive information necessary for the processing from a server device storing information equivalent to the map DB 20, instead of including the map DB 20 itself.
  • Further, instead of adjusting the lidar detection range RL of the lidar 3, the control device 1 may adjust the detection range of an external sensor other than the lidar 3 that emits electromagnetic waves or ultrasonic waves. Even in this case, the control device 1 can accurately control the detection range of the external sensor with respect to a target near a slope, and thereby improve the object detection accuracy.
  • The control device 1 may also calculate the height difference H in consideration of the mounting height h of the lidar 3 and the target height Ht at the detection target point Pt, as in the third and fourth embodiments. In this case, the control device 1 may calculate, as the height difference H of the first embodiment, the value obtained by adding the target height Ht to, and subtracting the mounting height h from, the height difference based on the altitude information of the current position and the detection target point Pt extracted from the map DB 20.
  • On the other hand, the control device 1 may determine the central irradiation angle θL without considering the mounting height h and the target height Ht, as in the first embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a control device 1 that controls a lidar 3 capable of detecting objects in the periphery of a vehicle 5. The control device 1: acquires the height difference H between the current position of the vehicle 5 and a detection target point Pt that is a detection target distance Dt away from the current position; and, based on the acquired height difference H, controls the lidar 3 so as to change the direction, with respect to the vehicle 5, of a lidar detection range RL of the lidar 3.
PCT/JP2018/044213 2017-12-01 2018-11-30 Dispositif de commande, dispositif de détection, procédé de commande, programme et support de stockage WO2019107550A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-232044 2017-12-01
JP2017232044 2017-12-01

Publications (1)

Publication Number Publication Date
WO2019107550A1 true WO2019107550A1 (fr) 2019-06-06

Family

ID=66664922

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/044213 WO2019107550A1 (fr) 2017-12-01 2018-11-30 Dispositif de commande, dispositif de détection, procédé de commande, programme et support de stockage

Country Status (1)

Country Link
WO (1) WO2019107550A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004151080A (ja) * 2002-10-28 2004-05-27 Hyundai Motor Co Ltd 車間距離測定方法及び装置
JP2010162975A (ja) * 2009-01-14 2010-07-29 Fujitsu Ten Ltd 車両制御システム

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019211210A1 (de) * 2019-07-29 2021-02-04 Zf Friedrichshafen Ag Vorrichtung zur Lagesteuerung einer Sensoreinheit eines Fahrzeugs sowie zugehöriges Verfahren
WO2022188539A1 (fr) * 2021-03-11 2022-09-15 上海擎朗智能科技有限公司 Procédé et appareil de commande de robot mobile, robot mobile, et support de stockage
US11619948B2 (en) 2021-03-11 2023-04-04 Keenon Robotics Co., Ltd. Control method of mobile robot, mobile robot, and storage medium

Similar Documents

Publication Publication Date Title
US11796657B2 (en) Control device, control method, program, and storage medium
JP2023101818A (ja) 制御装置、検知装置、制御方法、プログラム及び記憶媒体
US10965099B2 (en) Light control device, control method, program and storage medium
JP4244964B2 (ja) 車両用測距装置
US10739441B2 (en) System and method for adjusting a LiDAR system
US10093312B2 (en) Obstacle determining apparatus, moving body, and obstacle determining method
JP6447863B2 (ja) 移動体
US10877134B2 (en) LIDAR peak detection using splines for autonomous driving vehicles
US11372090B2 (en) Light detection and range (LIDAR) device with SPAD and APD sensors for autonomous driving vehicles
JP2019100855A (ja) 制御装置、検知装置、制御方法、プログラム及び記憶媒体
JP2023038220A (ja) 姿勢推定装置、制御方法、プログラム及び記憶媒体
WO2019107550A1 (fr) Dispositif de commande, dispositif de détection, procédé de commande, programme et support de stockage
US11520019B2 (en) Light signal detection device, range finding device, and detection method
JP2019100856A (ja) 制御装置、検知装置、制御方法、プログラム及び記憶媒体
JP2019100854A (ja) 制御装置、検知装置、制御方法、プログラム及び記憶媒体
CN112462751A (zh) 车辆控制装置、车辆控制方法及存储介质
JP7012693B2 (ja) 情報処理装置、車両システム、情報処理方法、およびプログラム
JP2019078688A (ja) 地物データ構造、記憶装置、制御装置、制御方法、プログラム及び記憶媒体
JP6390459B2 (ja) 光軸ずれ検出装置
JP7324925B2 (ja) 光制御装置、制御方法、プログラム及び記憶媒体
US20240036175A1 (en) Single photon detection based light detection and range (lidar) for autonomous driving vehicles
US20240067171A1 (en) Brake light activation determining device, storage medium storing computer program for determining brake light activation, and method for determining brake light activation
WO2019082699A1 (fr) Dispositif de commande, procédé de commande, programme et support d'informations
JP2023060126A (ja) 光制御装置
JP2022095980A (ja) センサ制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18884627

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18884627

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP