WO2018173855A1 - Sensor module, sensor system, and method of installing a sensor system in a vehicle

Sensor module, sensor system, and method of installing a sensor system in a vehicle

Info

Publication number
WO2018173855A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
vehicle
sensor system
processor
output value
Prior art date
Application number
PCT/JP2018/009718
Other languages
English (en)
Japanese (ja)
Inventor
美昭 伏見
祐介 笠羽
Original Assignee
株式会社小糸製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社小糸製作所
Priority to EP18770971.2A (EP3605136A4)
Priority to JP2019507575A (JPWO2018173855A1)
Priority to US16/496,081 (US20200039531A1)
Priority to CN201880019683.1A (CN110446941A)
Publication of WO2018173855A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/04 Monitoring the functioning of the control system
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/0017 Devices integrating an element dedicated to another function
    • B60Q1/0023 Devices integrating an element dedicated to another function the element being a sensor, e.g. distance sensor, camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/06 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
    • B60Q1/068 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle by mechanical means
    • B60Q1/0683 Adjustable by rotation of a screw
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 Combination of radar systems with lidar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G01S7/4004 Means for monitoring or calibrating of parts of a radar system
    • G01S7/4026 Antenna boresight
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4813 Housing arrangements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083 Setting, resetting, calibration
    • B60W2050/0088 Adaptive recalibration
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408 Radar; Laser, e.g. lidar
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/54 Audio sensitive means, e.g. ultrasound
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327 Sensor installation details
    • G01S2013/93277 Sensor installation details in the lights
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/027 Constructional details of housings, e.g. form, type, material or ruggedness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G01S7/4052 Means for monitoring or calibrating by simulation of echoes
    • G01S7/4082 Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
    • G01S7/4091 Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder during normal radar operation

Definitions

  • the present disclosure relates to a sensor module mounted on a vehicle, a sensor system mounted on the vehicle, and a method of mounting the sensor system on the vehicle.
  • in order to realize driving support of a vehicle, it is necessary to mount a sensor for acquiring information outside the vehicle on the vehicle body. In order to acquire external information more accurately, different types of sensors may be used. Examples of such sensors include a camera and a LiDAR (Light Detection and Ranging) sensor (see, for example, Patent Document 1).
  • this disclosure aims to reduce the burden of the work of adjusting the detection reference position of a sensor after the sensor is mounted on a vehicle.
  • one aspect for achieving the above object is a sensor module mounted on a vehicle, comprising: a sensor for detecting information outside the vehicle; a support member supporting the sensor; and an acceleration sensor supported by the support member.
  • one aspect for achieving the above object is a sensor system mounted on a vehicle, comprising: a sensor for detecting information outside the vehicle; a support member supporting the sensor; an acceleration sensor supported by the support member; a memory for storing a first output value of the acceleration sensor at a first time point; and a processor for acquiring, at a second time point, a difference between a second output value of the acceleration sensor and the first output value.
  • the deviation amount can be grasped through the signal output of the acceleration sensor.
  • based on that signal output, it becomes easy to automate both the posture adjustment of the sensor in consideration of the deviation amount and the correction of the detection result by the sensor. Therefore, it is possible to reduce the burden of the work of adjusting the detection reference position of the sensor after it is mounted on the vehicle.
  • An example of the first time point is before the sensor system is mounted on a vehicle.
  • An example of the second time point is after the sensor system is mounted on a vehicle.
  • one aspect for achieving the above object is a method of mounting the above sensor system on a vehicle, including: a first step of storing, in the memory, the first output value of the acceleration sensor at the first time point before the sensor system is mounted on the vehicle; and a second step of causing the processor to acquire the difference between the second output value of the acceleration sensor and the first output value at the second time point after the sensor system is mounted on the vehicle.
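To make these two steps concrete, here is a minimal Python sketch of the method; the accelerometer interface (read_acceleration) and the Memory container are illustrative assumptions, not part of the publication.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Memory:
    first_output: Optional[Vec3] = None  # first output value V1, stored at t1

def first_step(accel, memory: Memory) -> None:
    """Before mounting (first entity, e.g. the sensor-system manufacturer):
    store the acceleration sensor's output value V1 in the memory."""
    memory.first_output = accel.read_acceleration()

def second_step(accel, memory: Memory) -> Vec3:
    """After mounting (second entity, e.g. the vehicle assembler):
    read V2 and return the difference D = V2 - V1."""
    v2 = accel.read_acceleration()
    v1 = memory.first_output
    return tuple(b - a for a, b in zip(v1, v2))
```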
  • the first step may be performed by a first entity
  • the second step may be performed by a second entity different from the first entity
  • An example of the first entity is a manufacturer of the sensor system.
  • An example of the second entity is a manufacturer that assembles a vehicle using the sensor system as one component. In this case, the work burden of adjusting the detection reference position of the sensor by the second entity can be reduced.
  • the sensor system can be configured to further include an adjustment mechanism for adjusting at least one of the position and the orientation of the sensor, wherein the processor causes the adjustment mechanism to perform the adjustment based on the difference.
  • the adjustment work for eliminating the deviation can be automated. Therefore, even if the number of sensors increases, it is possible to reduce the burden of work for adjusting the detection reference position of the sensor after the sensor system is mounted on the vehicle.
  • the sensor system can be configured as follows.
  • a correction unit corrects the information detected by the sensor, and the processor causes the correction unit to perform the correction based on the difference.
  • a mechanism for adjusting at least one of the position and orientation of the sensor can be omitted. Therefore, not only can the burden of adjusting the detection reference position of the sensor after the sensor system is mounted on the vehicle be reduced, but also an increase in size and weight of the sensor system can be suppressed.
  • the above sensor system can be configured as follows.
  • a common housing houses the sensor, the support member, and the acceleration sensor, and the processor is supported by the housing.
  • the function of the processor may be realized by a control device mounted on the vehicle. However, according to the above configuration, the processing load on the control device can be reduced.
  • the sensor system can be configured as follows.
  • the memory and the processor are supported by the support member.
  • according to this configuration, the sensor, the acceleration sensor, the memory, and the processor are easily modularized.
  • the above sensor system can be configured as follows.
  • the sensor system further includes a light source, a light source adjustment mechanism for adjusting at least one of the position and the posture of the light source, and a common housing that houses at least a part of each of the sensor, the support member, the acceleration sensor, the light source, and the light source adjustment mechanism.
  • according to this configuration, the lamp device and the sensor system can be easily integrated, and the demand for such integration can be met.
  • Examples of the sensor include at least one of a LiDAR sensor, a millimeter wave radar, an ultrasonic sensor, and a camera.
  • FIG. 1 shows the positions of the sensor systems in a vehicle. FIG. 2 shows the configuration of the sensor system according to the first embodiment. FIG. 3 shows the configuration of the sensor system according to the second embodiment. FIG. 4 shows the configuration of the sensor system according to the third embodiment. Further figures show the configuration of the sensor system according to the fourth embodiment and of each sensor module in that sensor system, and the configuration of the sensor system according to the fifth embodiment and of each sensor module in that sensor system.
  • an arrow F indicates the forward direction of the illustrated structure.
  • Arrow B indicates the backward direction of the illustrated structure.
  • Arrow L indicates the left direction of the illustrated structure.
  • Arrow R indicates the right direction of the illustrated structure.
  • “Left” and “right” used in the following description indicate the left and right directions viewed from the driver's seat.
  • the “vertical direction” corresponds to a direction perpendicular to the paper surface.
  • as shown in FIG. 1, the left front sensor system 1LF according to the first embodiment is mounted on the left front corner of the vehicle 100.
  • FIG. 2 schematically shows the configuration of the left front sensor system 1LF.
  • the left front sensor system 1LF is accommodated in a lamp chamber 13 defined by a housing 11 and a translucent member 12.
  • the left front sensor system 1LF includes a first sensor module 14.
  • the first sensor module 14 includes a first LiDAR sensor 41, a first acceleration sensor 42, a first support member 43, a first screw mechanism 44, and a first actuator 45.
  • the first LiDAR sensor 41 has a configuration for emitting non-visible light and a configuration for detecting return light as a result of reflection of the non-visible light on an object existing at least in front of the vehicle 100.
  • the front of the vehicle 100 is an example of the outside of the vehicle.
  • the first LiDAR sensor 41 may include a scanning mechanism that sweeps the non-visible light by changing the emission direction (that is, the detection direction) as necessary.
  • infrared light having a wavelength of 905 nm is used as the non-visible light.
  • the first LiDAR sensor 41 can acquire the distance to the object associated with the return light based on, for example, the time from when the non-visible light is emitted in a certain direction until the return light is detected. Further, by accumulating such distance data in association with the detection position, information related to the shape of the object associated with the return light can be acquired. In addition to or instead of this, information relating to attributes such as the material of the object associated with the return light can be acquired based on the difference in wavelength between the outgoing light and the return light. In addition to or instead of this, for example, information on the color of an object (such as a white line on the road surface) can be acquired based on the difference in reflectance of the return light from the road surface.
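The distance acquisition described above is a time-of-flight computation: the range is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_s: float) -> float:
    """Distance (m) from the delay between emitting the non-visible light
    and detecting the return light; the light travels out and back."""
    return C * round_trip_s / 2.0

print(lidar_range(0.5e-6))  # a 0.5 microsecond round trip is about 75 m
```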
  • the first LiDAR sensor 41 is a sensor that detects at least information ahead of the vehicle 100.
  • the first LiDAR sensor 41 is configured to output a signal corresponding to the attribute (intensity, wavelength, etc.) of the detected return light.
  • the above information is acquired by appropriately processing the signal output from the first LiDAR sensor 41 by an information processing unit (not shown).
  • the information processing unit may be included in the left front sensor system 1LF or may be mounted on the vehicle 100.
  • the first acceleration sensor 42 is supported by the first support member 43 together with the first LiDAR sensor 41.
  • the first acceleration sensor 42 is configured to output a signal A1 corresponding to the attitude of the first support member 43, that is, the attitude of the first LiDAR sensor 41.
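The relation between the accelerometer output and the attitude of the support member can be made concrete with a standard static-tilt computation. This is a minimal sketch, not from the publication; it assumes a three-axis accelerometer at rest (so the measured vector is gravity) and an x-forward, y-left, z-up axis convention:

```python
import math

def attitude_from_gravity(ax: float, ay: float, az: float):
    """Pitch and roll (radians) of the support member derived from a
    static three-axis accelerometer reading."""
    pitch = math.atan2(-ax, math.hypot(ay, az))  # tilt about the lateral axis
    roll = math.atan2(ay, az)                    # tilt about the longitudinal axis
    return pitch, roll

# a level sensor reads (0, 0, 9.81): pitch = 0, roll = 0
print(attitude_from_gravity(0.0, 0.0, 9.81))
```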
  • the first screw mechanism 44 is a mechanism for adjusting the attitude of the first LiDAR sensor 41 with respect to the housing 11 by adjusting the attitude of the first support member 43.
  • the first screw mechanism 44 includes a first horizontal adjustment screw 441 and a first vertical adjustment screw 442.
  • the first horizontal adjustment screw 441 extends through the housing 11.
  • the first horizontal adjustment screw 441 is connected to the first support member 43 via a joint (not shown).
  • the head portion of the first horizontal adjustment screw 441 is disposed outside the housing 11.
  • the rotation of the first horizontal adjustment screw 441 is converted by the joint into a movement that changes the posture of the first support member 43 within a horizontal plane (the plane including the front-rear direction and the left-right direction in the figure).
  • the “horizontal plane” used here does not have to coincide with a strict horizontal plane. Since the structure of the joint itself is well known, detailed description is omitted.
  • the first vertical adjustment screw 442 extends through the housing 11.
  • the first vertical adjustment screw 442 is connected to the first support member 43 via a joint (not shown).
  • the head portion of the first vertical adjustment screw 442 is disposed outside the housing 11.
  • the rotation of the first vertical adjustment screw 442 is converted by the joint into a movement that changes the posture of the first support member 43 within a vertical plane (the plane including the front-rear direction and the up-down direction in the figure).
  • the “vertical plane” used here does not have to coincide with a strict vertical plane. Since the structure of the joint itself is well known, detailed description is omitted.
  • the first actuator 45 is a device for adjusting the detection reference position of the first LiDAR sensor 41.
  • the first actuator 45 is disposed in the lamp chamber 13 and is coupled to the first LiDAR sensor 41.
  • the left front sensor system 1LF includes a second sensor module 15.
  • the second sensor module 15 includes a second LiDAR sensor 51, a second acceleration sensor 52, a second support member 53, a second screw mechanism 54, and a second actuator 55.
  • the second LiDAR sensor 51 has a configuration for emitting non-visible light and a configuration for detecting return light as a result of reflection of the non-visible light at least on an object existing on the left side of the vehicle 100.
  • the left side of the vehicle 100 is an example of the outside of the vehicle.
  • the second LiDAR sensor 51 may include a scanning mechanism that sweeps the non-visible light by changing the emission direction (that is, the detection direction) as necessary. Since the configuration of the second LiDAR sensor 51 is substantially the same as that of the first LiDAR sensor 41, repeated description is omitted.
  • the second LiDAR sensor 51 is a sensor that detects at least information on the left side of the vehicle 100.
  • the second LiDAR sensor 51 is configured to output a signal corresponding to the attribute (intensity, wavelength, etc.) of the detected return light.
  • the above information is acquired by appropriately processing the signal output by the second LiDAR sensor 51 by an information processing unit (not shown).
  • the information processing unit may be included in the left front sensor system 1LF or may be mounted on the vehicle 100.
  • the second acceleration sensor 52 is supported by the second support member 53 together with the second LiDAR sensor 51.
  • the second acceleration sensor 52 is configured to output a signal A2 corresponding to the attitude of the second support member 53, that is, the attitude of the second LiDAR sensor 51.
  • the second screw mechanism 54 is a mechanism for adjusting the attitude of the second LiDAR sensor 51 relative to the housing 11 by adjusting the attitude of the second support member 53.
  • the second screw mechanism 54 includes a second horizontal adjustment screw 541 and a second vertical adjustment screw 542.
  • the second horizontal adjustment screw 541 extends through the housing 11.
  • the second horizontal adjustment screw 541 is connected to the second support member 53 via a joint (not shown).
  • the head portion of the second horizontal adjustment screw 541 is disposed outside the housing 11.
  • the rotation of the second horizontal adjustment screw 541 is converted by the joint into a movement that changes the posture of the second support member 53 within a horizontal plane (the plane including the front-rear direction and the left-right direction in the figure).
  • the “horizontal plane” used here does not have to coincide with a strict horizontal plane. Since the structure of the joint itself is well known, detailed description is omitted.
  • the second vertical adjustment screw 542 extends through the housing 11.
  • the second vertical adjustment screw 542 is connected to the second support member 53 via a joint (not shown).
  • the head portion of the second vertical adjustment screw 542 is disposed outside the housing 11.
  • the rotation of the second vertical adjustment screw 542 is converted by the joint into a movement that changes the posture of the second support member 53 within a vertical plane (the plane including the front-rear direction and the up-down direction in the figure).
  • the “vertical plane” used here does not have to coincide with a strict vertical plane. Since the structure of the joint itself is well known, detailed description is omitted.
  • the second actuator 55 is a device for adjusting the detection reference position of the second LiDAR sensor 51.
  • the second actuator 55 is disposed in the lamp chamber 13 and is coupled to the second LiDAR sensor 51.
  • the left front sensor system 1LF includes a processor 16 and a memory 17.
  • examples of the processor 16 include a CPU, an MPU, and a GPU.
  • the processor 16 can include a plurality of processor cores.
  • Examples of the memory 17 include ROM and RAM.
  • the ROM can store a program for executing the above processing.
  • the program can include an artificial intelligence program.
  • An example of an artificial intelligence program is a learned neural network by deep learning.
  • the processor 16 can specify at least a part of a program stored in the ROM, expand it on the RAM, and execute the above-described processing in cooperation with the RAM.
  • At least a part of the functions of the processor 16 may be realized by at least one hardware resource different from the processor 16 and the memory 17. Examples of such hardware resources may include integrated circuits such as ASIC and FPGA.
  • the memory 17 is supported by the housing 11.
  • the memory 17 may be supported on the outer surface of the housing 11 or may be disposed in the lamp chamber 13.
  • before the left front sensor system 1LF is mounted on the vehicle 100, the postures of the first sensor module 14 and the second sensor module 15 with respect to the housing 11 are adjusted. Specifically, the detection reference position of the first LiDAR sensor 41 is adjusted by changing the posture of the first support member 43 with respect to the housing 11 using the first screw mechanism 44. Similarly, the detection reference position of the second LiDAR sensor 51 is adjusted by changing the posture of the second support member 53 with respect to the housing 11 using the second screw mechanism 54.
  • One time point before the left front sensor system 1LF is mounted on the vehicle 100 is an example of a first time point.
  • the first acceleration sensor 42 outputs a signal A1 (t1) corresponding to the attitude of the first support member 43 corresponding to the adjustment result of the detection reference position of the first LiDAR sensor 41. That is, the signal A1 (t1) corresponds to the output value V11 of the first acceleration sensor 42 before the left front sensor system 1LF is mounted on the vehicle 100.
  • the signal A1 (t1) is input to the memory 17.
  • the memory 17 stores the output value V11 of the first acceleration sensor 42 corresponding to the signal A1 (t1).
  • the output value V11 is an example of a first output value.
  • the second acceleration sensor 52 outputs a signal A2 (t1) corresponding to the attitude of the second support member 53 corresponding to the adjustment result of the detection reference position of the second LiDAR sensor 51. That is, the signal A2 (t1) corresponds to the output value V21 of the second acceleration sensor 52 before the left front sensor system 1LF is mounted on the vehicle 100.
  • the signal A2 (t1) is input to the memory 17.
  • the memory 17 stores the output value V21 of the second acceleration sensor 52 corresponding to the signal A2 (t1).
  • the output value V21 is an example of a first output value.
  • the left front sensor system 1LF is mounted on the vehicle 100.
  • the detection reference position of each LiDAR sensor may deviate from a desired position due to the tolerance of the body parts or the positional deviation of the left front sensor system 1LF with respect to the vehicle body. Therefore, after the left front sensor system 1LF is mounted on the vehicle 100, the detection reference position of the first LiDAR sensor 41 and the detection reference position of the second LiDAR sensor 51 are readjusted. In other words, at least one of the position and posture of the left front sensor system 1LF with respect to the vehicle body of the vehicle 100 is adjusted.
  • One time point after the left front sensor system 1LF is mounted on the vehicle 100 is an example of a second time point.
  • the first acceleration sensor 42 outputs a signal A1 (t2) corresponding to the posture of the first support member 43, which reflects the mounting posture of the left front sensor system 1LF with respect to the vehicle body. That is, the signal A1 (t2) corresponds to the output value V12 of the first acceleration sensor 42 after the left front sensor system 1LF is mounted on the vehicle 100.
  • the output value V12 is an example of a second output value.
  • the processor 16 acquires the output value V12 of the first acceleration sensor 42.
  • the signal A1 (t2) output from the first acceleration sensor 42 may be input to the processor 16 or may be input to the memory 17. In the former case, the processor 16 directly acquires the output value V12. In the latter case, the processor 16 acquires the output value V12 via the memory 17.
  • the processor 16 acquires the difference D1 between the output value V11 and the output value V12.
  • the difference D1 reflects a shift in the detection reference position of the first LiDAR sensor 41 caused by mounting the left front sensor system 1LF on the vehicle 100.
  • the processor 16 calculates a correction amount of at least one of the position and orientation of the first LiDAR sensor 41 necessary for eliminating the deviation of the detection reference position of the first LiDAR sensor 41 based on the acquired difference D1.
  • the processor 16 outputs a signal P1.
  • the signal P1 is input to the first actuator 45.
  • the signal P1 causes the first actuator 45 to perform an operation necessary to adjust at least one of the position and orientation of the first LiDAR sensor 41 by the calculated correction amount. Thereby, the readjustment of the detection reference position of the first LiDAR sensor 41 changed by mounting the left front sensor system 1LF on the vehicle 100 is completed.
  • the first actuator 45 is an example of an adjustment mechanism.
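Putting the steps above together, the readjustment for the first sensor module might look like the following sketch. All names (read_acceleration, apply, compute_correction, and the proportional mapping itself) are illustrative assumptions; the publication does not specify how the difference D1 is mapped to an actuator command.

```python
def compute_correction(d, gain: float = 1.0):
    """Illustrative proportional mapping from the output-value difference
    to an actuator command; the real mapping depends on the mechanism."""
    return tuple(-gain * x for x in d)

def readjust(accel, memory, actuator) -> None:
    v11 = memory.first_output                     # stored before mounting (t1)
    v12 = accel.read_acceleration()               # read after mounting (t2)
    d1 = tuple(b - a for a, b in zip(v11, v12))   # difference D1 = V12 - V11
    actuator.apply(compute_correction(d1))        # signal P1 drives the actuator
```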
  • the second acceleration sensor 52 outputs a signal A2 (t2) corresponding to the posture of the second support member 53, which reflects the mounting posture of the left front sensor system 1LF with respect to the vehicle body. That is, the signal A2 (t2) corresponds to the output value V22 of the second acceleration sensor 52 after the left front sensor system 1LF is mounted on the vehicle 100.
  • the output value V22 is an example of a second output value.
  • the processor 16 acquires the output value V22 of the second acceleration sensor 52.
  • the signal A2 (t2) output from the second acceleration sensor 52 may be input to the processor 16 or may be input to the memory 17. In the former case, the processor 16 directly acquires the output value V22. In the latter case, the processor 16 acquires the output value V22 via the memory 17.
  • the processor 16 acquires the difference D2 between the output value V21 and the output value V22.
  • the difference D2 reflects a shift in the detection reference position of the second LiDAR sensor 51 caused by mounting the left front sensor system 1LF on the vehicle 100.
  • the processor 16 calculates a correction amount of at least one of the position and orientation of the second LiDAR sensor 51 necessary for eliminating the deviation of the detection reference position of the second LiDAR sensor 51 based on the acquired difference D2.
  • the processor 16 outputs a signal P2.
  • the signal P2 is input to the second actuator 55.
  • the signal P2 causes the second actuator 55 to perform an operation necessary to adjust at least one of the position and orientation of the second LiDAR sensor 51 by the calculated correction amount. Thereby, the readjustment of the detection reference position of the second LiDAR sensor 51 changed by mounting the left front sensor system 1LF on the vehicle 100 is completed.
  • the second actuator 55 is an example of an adjustment mechanism.
  • the adjustment of the detection reference position of each LiDAR sensor before the left front sensor system 1LF is mounted on the vehicle 100 can be performed by the manufacturer of the left front sensor system 1LF.
  • the adjustment of the detection reference position of each LiDAR sensor after the left front sensor system 1LF is mounted on the vehicle 100 can be performed, for example, by a manufacturer that assembles the vehicle 100 using the left front sensor system 1LF as one component. In this case, the work of adjusting the detection reference position of each sensor by the latter can be reduced.
  • the manufacturer of the left front sensor system 1LF is an example of a first entity.
  • the manufacturer that assembles the vehicle 100 is an example of a second entity.
  • the function of the processor 16 may be realized by a control device mounted on the vehicle 100 or may be realized by a processor supported by the housing 11. In the latter case, the processor 16 may be supported on the outer surface of the housing 11 or may be disposed in the lamp chamber 13. In this case, the processing load of the control device mounted on the vehicle 100 can be reduced.
  • the left front sensor system 1LF includes a lamp unit 18.
  • the lamp unit 18 is accommodated in the housing 11.
  • the lamp unit 18 includes a light source and an optical system.
  • the optical system includes at least one of a lens and a reflector.
  • Examples of light sources include lamp light sources and semiconductor light emitting elements.
  • Examples of lamp light sources include incandescent lamps, halogen lamps, discharge lamps, neon lamps, and the like.
  • Examples of the semiconductor light emitting element include a light emitting diode, a laser diode, and an organic EL element. The light emitted from the light source passes through the optical system and is emitted from the lamp unit 18. The light emitted from the lamp unit 18 passes through the translucent member 12 and illuminates a predetermined area outside the vehicle 100.
  • the left front sensor system 1LF includes a third screw mechanism 19.
  • the third screw mechanism 19 is a mechanism for adjusting the posture of the lamp unit 18.
  • the third screw mechanism 19 includes a third horizontal adjustment screw 191 and a third vertical adjustment screw 192.
  • the third screw mechanism 19 is an example of a light source adjustment mechanism.
  • the third horizontal adjustment screw 191 extends through the housing 11.
  • the third horizontal adjustment screw 191 is connected to the lamp unit 18 via a joint (not shown).
  • the head portion of the third horizontal adjustment screw 191 is disposed outside the housing 11.
  • the rotation of the third horizontal adjustment screw 191 is converted by the joint into a movement that changes the posture of the lamp unit 18 within a horizontal plane (the plane including the front-rear direction and the left-right direction in the figure).
  • the “horizontal plane” used here does not have to coincide with a strict horizontal plane. Since the structure of the joint itself is well known, detailed description is omitted.
  • the third vertical adjustment screw 192 extends through the housing 11.
  • the third vertical adjustment screw 192 is connected to the lamp unit 18 via a joint (not shown).
  • the head portion of the third vertical adjustment screw 192 is disposed outside the housing 11.
  • the rotation of the third vertical adjustment screw 192 is converted by the joint into a movement that changes the posture of the lamp unit 18 within a vertical plane (the plane including the front-rear direction and the up-down direction in the figure).
  • the “vertical plane” used here does not have to coincide with a strict vertical plane. Since the structure of the joint itself is well known, detailed description is omitted.
  • according to this configuration, the lamp device and the sensor system can be easily integrated, and the demand for such integration can be met.
  • the left front sensor system 1LF is given as an example of the sensor system.
  • the configuration described with reference to the left front sensor system 1LF is also applicable to the right front sensor system 1RF arranged at the right front corner of the vehicle 100 shown in FIG. 1, the left rear sensor system 1LB arranged at the left rear corner, and the right rear sensor system 1RB arranged at the right rear corner.
  • the right front sensor system 1RF may have a configuration that is symmetrical to the left front sensor system 1LF.
  • the left rear sensor system 1LB may have a configuration that is front-rear symmetrical with the left front sensor system 1LF.
  • the right rear sensor system 1RB may have a bilaterally symmetric configuration with the left rear sensor system 1LB. This description is similarly applied to the following embodiments.
  • FIG. 3 schematically shows the configuration of the left front sensor system 2LF according to the second embodiment. Constituent elements that are the same as or equivalent to those of the left front sensor system 1LF according to the first embodiment are given the same reference numerals, and repeated descriptions are omitted.
  • the left front sensor system 2LF includes a first sensor module 24.
  • the first sensor module 24 includes a first LiDAR sensor 41, a first acceleration sensor 42, a first support member 43, and a first screw mechanism 44.
  • the left front sensor system 2LF includes a second sensor module 25.
  • the second sensor module 25 includes a second LiDAR sensor 51, a second acceleration sensor 52, a second support member 53, and a second screw mechanism 54.
  • the left front sensor system 2LF includes a processor 26.
  • examples of the processor 26 include a CPU, an MPU, and a GPU. At least a part of the functions of the processor 26 may be realized by at least one hardware resource different from the processor 26 and the memory 17. Examples of such hardware resources may include integrated circuits such as ASIC and FPGA.
  • before the left front sensor system 2LF is mounted on the vehicle 100, the postures of the first sensor module 24 and the second sensor module 25 with respect to the housing 11 are adjusted. Specifically, the detection reference position of the first LiDAR sensor 41 is adjusted by changing the posture of the first support member 43 with respect to the housing 11 using the first screw mechanism 44. Similarly, the detection reference position of the second LiDAR sensor 51 is adjusted by changing the posture of the second support member 53 with respect to the housing 11 using the second screw mechanism 54.
  • One time point before the left front sensor system 2LF is mounted on the vehicle 100 is an example of a first time point.
  • the first acceleration sensor 42 outputs a signal A1 (t1) corresponding to the attitude of the first support member 43 corresponding to the adjustment result of the detection reference position of the first LiDAR sensor 41. That is, the signal A1 (t1) corresponds to the output value V11 of the first acceleration sensor 42 before the left front sensor system 2LF is mounted on the vehicle 100.
  • the signal A1 (t1) is input to the memory 17.
  • the memory 17 stores the output value V11 of the first acceleration sensor 42 corresponding to the signal A1 (t1).
  • the output value V11 is an example of a first output value.
  • the second acceleration sensor 52 outputs a signal A2 (t1) corresponding to the attitude of the second support member 53 corresponding to the adjustment result of the detection reference position of the second LiDAR sensor 51. That is, the signal A2 (t1) corresponds to the output value V21 of the second acceleration sensor 52 before the left front sensor system 2LF is mounted on the vehicle 100.
  • the signal A2 (t1) is input to the memory 17.
  • the memory 17 stores the output value V21 of the second acceleration sensor 52 corresponding to the signal A2 (t1).
  • the output value V21 is an example of a first output value.
  • the left front sensor system 2LF is mounted on the vehicle 100.
  • the detection reference position of each LiDAR sensor may deviate from a desired position due to the tolerance of the vehicle body parts or the positional deviation of the left front sensor system 2LF with respect to the vehicle body. Therefore, after the left front sensor system 2LF is mounted on the vehicle 100, the detection reference position of the first LiDAR sensor 41 and the detection reference position of the second LiDAR sensor 51 are readjusted. In other words, at least one of the position and posture of the left front sensor system 2LF with respect to the vehicle body of the vehicle 100 is adjusted.
  • One point in time after the left front sensor system 2LF is mounted on the vehicle 100 is an example of a second point in time.
  • the first acceleration sensor 42 outputs a signal A1 (t2) corresponding to the posture of the first support member 43, which reflects the mounting posture of the left front sensor system 2LF with respect to the vehicle body. That is, the signal A1 (t2) corresponds to the output value V12 of the first acceleration sensor 42 after the left front sensor system 2LF is mounted on the vehicle 100.
  • the output value V12 is an example of a second output value.
  • the processor 26 acquires the output value V12 of the first acceleration sensor 42.
  • the signal A1 (t2) output from the first acceleration sensor 42 may be input to the processor 26 or may be input to the memory 17. In the former case, the processor 26 directly acquires the output value V12. In the latter case, the processor 26 acquires the output value V12 via the memory 17.
  • the processor 26 acquires the difference D1 between the output value V11 and the output value V12.
  • the difference D1 reflects a shift in the detection reference position of the first LiDAR sensor 41 caused by mounting the left front sensor system 2LF on the vehicle 100.
  • in the present embodiment, a mechanism for adjusting the posture of the first LiDAR sensor 41 is not provided. Therefore, when a shift in the detection reference position of the first LiDAR sensor 41 is detected, the posture of the first LiDAR sensor 41 is not changed; instead, the information acquired by the first LiDAR sensor 41 is corrected on the signal side so as to eliminate the shift.
  • the first LiDAR sensor 41 is configured to output a signal L1 corresponding to the attribute (intensity, wavelength, etc.) of the detected return light.
  • the signal L1 is input to the processor 26. Based on the acquired difference D1, the processor 26 corrects the signal L1 so as to be a signal that would have been obtained when there was no deviation in the detection reference position of the first LiDAR sensor 41.
  • the processor 26 indirectly re-adjusts the detection reference position of the first LiDAR sensor 41 that has been changed by mounting the left front sensor system 2LF on the vehicle 100.
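For this embodiment, a signal-side correction can be illustrated as rotating the detected point cloud back by the attitude change. This is a minimal sketch under the assumption that the shift has already been reduced to a single pitch angle; the general case would use a full rotation matrix built from the difference D1:

```python
import math

def correct_points(points, pitch_shift: float):
    """Rotate each (x, y, z) detection about the lateral (y) axis by
    -pitch_shift, undoing the attitude change of the sensor."""
    c, s = math.cos(-pitch_shift), math.sin(-pitch_shift)
    return [(c * x + s * z, y, -s * x + c * z) for (x, y, z) in points]

# a point 10 m ahead, with the sensor pitched by 1 degree
print(correct_points([(10.0, 0.0, 0.0)], math.radians(1.0)))
```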
  • the second acceleration sensor 52 outputs a signal A2 (t2) corresponding to the posture of the second support member 53 corresponding to the mounting posture of the left front sensor system 2LF with respect to the vehicle body. That is, the signal A2 (t2) corresponds to the output value V22 of the second acceleration sensor 52 after the left front sensor system 2LF is mounted on the vehicle 100.
  • the output value V22 is an example of a second output value.
  • the processor 26 acquires the output value V22 of the second acceleration sensor 52.
  • the signal A2 (t2) output from the second acceleration sensor 52 may be input to the processor 26 or may be input to the memory 17. In the former case, the processor 26 directly acquires the output value V22. In the latter case, the processor 26 acquires the output value V22 via the memory 17.
  • the processor 26 acquires the difference D2 between the output value V21 and the output value V22.
  • the difference D2 reflects a shift in the detection reference position of the second LiDAR sensor 51 caused by mounting the left front sensor system 2LF on the vehicle 100.
  • in the present embodiment, a mechanism for adjusting the posture of the second LiDAR sensor 51 is not provided. Therefore, when a deviation of the detection reference position of the second LiDAR sensor 51 is detected, the posture of the second LiDAR sensor 51 is not changed; instead, the information acquired by the second LiDAR sensor 51 is corrected on the signal side so as to eliminate the deviation.
  • the second LiDAR sensor 51 is configured to output a signal L2 corresponding to the attribute (intensity, wavelength, etc.) of the detected return light.
  • the signal L2 is input to the processor 26. Based on the acquired difference D2, the processor 26 corrects the signal L2 so as to be a signal that would have been obtained when there was no deviation in the detection reference position of the second LiDAR sensor 51.
  • the adjustment of the detection reference position of each LiDAR sensor before the left front sensor system 2LF is mounted on the vehicle 100 can be performed by the manufacturer of the left front sensor system 2LF.
  • the adjustment of the detection reference position of each LiDAR sensor after the left front sensor system 2LF is mounted on the vehicle 100 can be performed, for example, by a manufacturer that assembles the vehicle 100 using the left front sensor system 2LF as one component. In this case, the work of adjusting the detection reference position of each sensor by the latter can be reduced.
  • the manufacturer of the left front sensor system 2LF is an example of a first entity.
  • the manufacturer that assembles the vehicle 100 is an example of a second entity.
  • a mechanism for adjusting at least one of the position and orientation of each LiDAR sensor can be omitted. Therefore, an increase in size and weight of the left front sensor system 2LF can be suppressed.
  • the function of the processor 26 may be realized by a control device mounted on the vehicle 100 or may be realized by a processor supported by the housing 11. In the latter case, the processor 26 may be supported on the outer surface of the housing 11 or may be disposed in the lamp chamber 13. In this case, the processing load of the control device mounted on the vehicle 100 can be reduced.
  • FIG. 4 schematically shows the configuration of the left front sensor system 3LF according to the third embodiment. Constituent elements that are the same as or equivalent to those of the left front sensor system 1LF according to the first embodiment are given the same reference numerals, and repeated descriptions are omitted.
  • the left front sensor system 3LF includes a first sensor module 34.
  • the first sensor module 34 includes a first camera 46, a millimeter wave radar 47, and an actuator 48 in addition to the first LiDAR sensor 41, the first acceleration sensor 42, the first support member 43, and the first screw mechanism 44.
  • the first acceleration sensor 42 is supported by the first support member 43 together with the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47.
  • the first acceleration sensor 42 is disposed between the first LiDAR sensor 41 and the first camera 46.
  • the first camera 46 is a device that photographs at least the front of the vehicle 100. That is, the first camera 46 is a sensor that detects at least information ahead of the vehicle 100.
  • the front of the vehicle 100 is an example of the outside of the vehicle.
  • the first camera 46 may be a visible light camera or an infrared light camera.
  • the first camera 46 is configured to output a video signal C1 corresponding to the captured video.
  • Information at least in front of the vehicle 100 detected by the first camera 46 is acquired by appropriately processing the video signal C1 by an information processing unit (not shown).
  • the information processing unit may be included in the left front sensor system 3LF, or may be mounted on the vehicle 100.
  • the millimeter wave radar 47 has a configuration for transmitting a millimeter wave and a configuration for receiving a reflected wave resulting from the reflection of the millimeter wave by an object existing at least in front of the vehicle 100.
  • the front of the vehicle 100 is an example of the outside of the vehicle.
  • the millimeter wave radar 47 may include a scanning mechanism that changes the transmission direction (that is, the detection direction) as necessary and sweeps the millimeter wave.
  • a millimeter wave having a frequency of 76 GHz is used. Examples of other frequencies include 24 GHz, 26 GHz, and 79 GHz.
  • the millimeter wave radar 47 can acquire the distance to the object associated with the reflected wave based on the time from when the millimeter wave is transmitted in a certain direction until the reflected wave is received, for example. Further, by accumulating such distance data in association with the detection position, it is possible to acquire information related to the motion of the object associated with the reflected wave.
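The motion information mentioned above can be illustrated by a finite-difference estimate over accumulated range samples. A minimal sketch with assumed sampling parameters:

```python
def relative_speed(ranges, dt: float) -> float:
    """Average range rate (m/s) over samples taken every dt seconds;
    a negative value means the object is closing in."""
    if len(ranges) < 2:
        raise ValueError("need at least two range samples")
    return (ranges[-1] - ranges[0]) / ((len(ranges) - 1) * dt)

# ranges shrinking from 50 m to 47 m over 0.3 s -> -10 m/s
print(relative_speed([50.0, 49.0, 48.0, 47.0], 0.1))
```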
  • the millimeter wave radar 47 is a sensor that detects at least information ahead of the vehicle 100.
  • the millimeter wave radar 47 outputs a signal corresponding to the attribute (intensity, etc.) of the received reflected wave.
  • the above information is acquired by appropriately processing the signal output from the millimeter wave radar 47 by an information processing unit (not shown).
  • the information processing unit may be included in the left front sensor system 3LF, or may be mounted on the vehicle 100.
  • the actuator 48 is a device for adjusting the detection reference position of the millimeter wave radar 47.
  • the actuator 48 is disposed in the lamp chamber 13 and is coupled to the millimeter wave radar 47.
  • the left front sensor system 3LF includes a second sensor module 35.
  • the second sensor module 35 includes a second camera 56 in addition to the second LiDAR sensor 51, the second acceleration sensor 52, the second support member 53, and the second screw mechanism 54.
  • the second acceleration sensor 52 is supported by the second support member 53 together with the second LiDAR sensor 51 and the second camera 56.
  • the second acceleration sensor 52 is disposed between the second LiDAR sensor 51 and the second camera 56.
  • the second camera 56 is a device that photographs at least the left side of the vehicle 100. That is, the second camera 56 is a sensor that detects at least information on the left side of the vehicle 100.
  • the left side of the vehicle 100 is an example of the outside of the vehicle.
  • the second camera 56 may be a visible light camera or an infrared light camera.
  • the second camera 56 is configured to output a video signal C2 corresponding to the captured video.
  • information at least on the left side of the vehicle 100 detected by the second camera 56 is acquired by appropriately processing the video signal C2 with an information processing unit (not shown).
  • the information processing unit may be included in the left front sensor system 3LF, or may be mounted on the vehicle 100.
  • the front left sensor system 3LF includes a processor 36.
  • Examples of the processor 36 include a CPU, an MPU, and a GPU. At least a part of the functions of the processor 36 may be realized by at least one hardware resource other than the processor 36 and the memory 17. Examples of such hardware resources include integrated circuits such as an ASIC or an FPGA.
  • the postures of the first sensor module 34 and the second sensor module 35 with respect to the housing 11 are adjusted. Specifically, the detection reference positions of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 are adjusted by changing the posture of the first support member 43 with respect to the housing 11 using the first screw mechanism 44. Similarly, the detection reference positions of the second LiDAR sensor 51 and the second camera 56 are adjusted by changing the posture of the second support member 53 with respect to the housing 11 using the second screw mechanism 54.
  • One point in time before the left front sensor system 3LF is mounted on the vehicle 100 is an example of a first point in time.
  • the first acceleration sensor 42 outputs a signal A1 (t1) corresponding to the attitude of the first support member 43 corresponding to the adjustment result of the detection reference position of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47.
  • the signal A1 (t1) corresponds to the output value V11 of the first acceleration sensor 42 before the left front sensor system 3LF is mounted on the vehicle 100.
  • the signal A1 (t1) is input to the memory 17.
  • the memory 17 stores the output value V11 of the first acceleration sensor 42 corresponding to the signal A1 (t1).
  • the output value V11 is an example of a first output value.
  • the second acceleration sensor 52 outputs a signal A2 (t1) corresponding to the attitude of the second support member 53 corresponding to the adjustment result of the detection reference position of the second LiDAR sensor 51 and the second camera 56. That is, the signal A2 (t1) corresponds to the output value V21 of the second acceleration sensor 52 before the left front sensor system 3LF is mounted on the vehicle 100.
  • the signal A2 (t1) is input to the memory 17.
  • the memory 17 stores the output value V21 of the second acceleration sensor 52 corresponding to the signal A2 (t1).
  • the output value V21 is an example of a first output value.
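The flow above — recording each acceleration sensor's output at the first time point as a calibration baseline — can be pictured with a short sketch. This is an editor's illustration; the storage format and the names (store_baseline, memory17.json) are assumptions, not from the patent.

```python
# Illustrative sketch: persist the first output values V11 and V21 of the
# two acceleration sensors before the sensor system is mounted on the
# vehicle, so they can be compared with the values obtained after mounting.
import json

def store_baseline(memory_path: str, v11, v21) -> None:
    """Write the pre-mounting output values to the memory."""
    with open(memory_path, "w") as f:
        json.dump({"V11": list(v11), "V21": list(v21)}, f)

# Example: gravity vectors (m/s^2) reported by each acceleration sensor
# at the first time point t1, after the screw-mechanism adjustment.
store_baseline("memory17.json", (0.0, 0.0, 9.81), (0.05, 0.0, 9.80))
```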
  • the left front sensor system 3LF is mounted on the vehicle 100.
  • the detection reference position of each sensor may deviate from a desired position due to the tolerance of the vehicle body parts or the positional deviation of the left front sensor system 3LF with respect to the vehicle body. Therefore, after the left front sensor system 3LF is mounted on the vehicle 100, the detection reference position of each sensor is readjusted. In other words, at least one of the position and posture of the left front sensor system 3LF with respect to the vehicle body of the vehicle 100 is adjusted.
  • One time point after the left front sensor system 3LF is mounted on the vehicle 100 is an example of a second time point.
  • the first acceleration sensor 42 outputs a signal A1 (t2) corresponding to the posture of the first support member 43 corresponding to the mounting posture of the left front sensor system 3LF with respect to the vehicle body. That is, the signal A1 (t2) corresponds to the output value V12 of the first acceleration sensor 42 after the front left sensor system 3LF is mounted on the vehicle 100.
  • the output value V12 is an example of a second output value.
  • the processor 36 acquires the output value V12 of the first acceleration sensor 42.
  • the signal A1 (t2) output from the first acceleration sensor 42 may be input to the processor 36 or may be input to the memory 17. In the former case, the processor 36 directly acquires the output value V12. In the latter case, the processor 36 acquires the output value V12 via the memory 17.
  • the processor 36 acquires the difference D1 between the output value V11 and the output value V12.
  • the difference D1 reflects a shift in the detection reference position of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 caused by mounting the left front sensor system 3LF on the vehicle 100.
  • In this embodiment, a mechanism for adjusting the postures of the first LiDAR sensor 41 and the first camera 46 is not provided. Therefore, when a deviation of the detection reference positions of the first LiDAR sensor 41 and the first camera 46 is detected, their postures are not changed to eliminate the deviation; instead, the information acquired by the first LiDAR sensor 41 and the first camera 46 is corrected on the signal side.
  • Based on the acquired difference D1, the processor 36 corrects the signal L1 output from the first LiDAR sensor 41 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the first LiDAR sensor 41.
  • Likewise, based on the acquired difference D1, the processor 36 corrects the video signal C1 output from the first camera 46 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the first camera 46.
  • substantially the same information is obtained as when at least one of the position and orientation of the first LiDAR sensor 41 and the first camera 46 is changed so as to eliminate the deviation of the detection reference position. That is, it can be said that the processor 36 indirectly re-adjusts the detection reference positions of the first LiDAR sensor 41 and the first camera 46 that have been changed by mounting the front left sensor system 3LF on the vehicle 100.
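The signal-side correction can be sketched as follows. This is an editor's illustration under stated assumptions — each acceleration sensor reports the gravity vector in its own frame, and the mounting-induced shift is a pure pitch rotation; the helper names are hypothetical, not from the patent.

```python
# Editor's sketch of the signal-side correction, assuming each acceleration
# sensor reports the gravity vector in its own frame and that the
# mounting-induced shift is a pure pitch rotation about the y (left) axis.
import numpy as np

def pitch_from_gravity(g: np.ndarray) -> float:
    """Pitch angle (rad) of the support member inferred from gravity,
    taking x as the forward axis and z as the upward axis."""
    return float(np.arctan2(g[0], g[2]))

def correct_point_cloud(points: np.ndarray,
                        v11: np.ndarray, v12: np.ndarray) -> np.ndarray:
    """Rotate LiDAR points by the posture difference D1 so the corrected
    signal matches what would have been obtained without the shift."""
    d1 = pitch_from_gravity(v12) - pitch_from_gravity(v11)  # difference D1
    c, s = np.cos(-d1), np.sin(-d1)                         # undo the shift
    r_y = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])                          # rotation about y
    return points @ r_y.T

v11 = np.array([0.0, 0.0, 9.81])    # output value V11 (before mounting)
v12 = np.array([0.17, 0.0, 9.81])   # output value V12 (after mounting, ~1 deg)
pts = np.array([[10.0, 0.0, 0.0]])  # one LiDAR return 10 m ahead
print(correct_point_cloud(pts, v11, v12))
```

In practice all three axes would be used, but the one-axis case shows how the stored baseline and the post-mounting reading together drive the correction.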
  • the second acceleration sensor 52 outputs a signal A2 (t2) corresponding to the posture of the second support member 53 corresponding to the mounting posture of the left front sensor system 3LF with respect to the vehicle body. That is, the signal A2 (t2) corresponds to the output value V22 of the second acceleration sensor 52 after the front left sensor system 3LF is mounted on the vehicle 100.
  • the output value V22 is an example of a second output value.
  • the processor 36 acquires the output value V22 of the second acceleration sensor 52.
  • the signal A2 (t2) output from the second acceleration sensor 52 may be input to the processor 36 or may be input to the memory 17. In the former case, the processor 36 directly acquires the output value V22. In the latter case, the processor 36 acquires the output value V22 via the memory 17.
  • the processor 36 acquires the difference D2 between the output value V21 and the output value V22.
  • the difference D2 reflects a shift in the detection reference positions of the second LiDAR sensor 51 and the second camera 56 caused by mounting the left front sensor system 3LF on the vehicle 100.
  • Similarly, a mechanism for adjusting the postures of the second LiDAR sensor 51 and the second camera 56 is not provided. Therefore, when a deviation of the detection reference positions of the second LiDAR sensor 51 and the second camera 56 is detected, their postures are not changed to eliminate the deviation; instead, the information acquired by the second LiDAR sensor 51 and the second camera 56 is corrected on the signal side.
  • Based on the acquired difference D2, the processor 36 corrects the signal L2 output from the second LiDAR sensor 51 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the second LiDAR sensor 51.
  • Likewise, based on the acquired difference D2, the processor 36 corrects the video signal C2 output from the second camera 56 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the second camera 56.
  • substantially the same information is obtained as when at least one of the position and orientation of the second LiDAR sensor 51 and the second camera 56 is changed so as to eliminate the deviation of the detection reference position. That is, it can be said that the processor 36 indirectly re-adjusts the detection reference positions of the second LiDAR sensor 51 and the second camera 56 that have been changed by mounting the front left sensor system 3LF on the vehicle 100.
  • the difference D1 between the output value V11 and the output value V12 of the first acceleration sensor 42 also reflects a shift in the detection reference position of the millimeter wave radar 47 caused by mounting the left front sensor system 3LF on the vehicle 100.
  • the processor 36 calculates a correction amount of at least one of the position and orientation of the millimeter wave radar 47 necessary for eliminating the deviation of the detection reference position of the millimeter wave radar 47 based on the acquired difference D1.
  • the processor 36 outputs a signal P.
  • the signal P is input to the actuator 48.
  • the signal P causes the actuator 48 to perform an operation necessary to adjust at least one of the position and orientation of the millimeter wave radar 47 by the calculated correction amount. Thereby, the readjustment of the detection reference position of the millimeter wave radar 47 changed by mounting the left front sensor system 3LF on the vehicle 100 is completed.
  • the actuator 48 is an example of an adjustment mechanism.
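For the millimeter wave radar, the correction is mechanical rather than signal-side. A minimal sketch of this actuator path follows; the Actuator interface and the sign convention are assumptions for illustration, not from the patent.

```python
# Editor's sketch: compute the correction amount from the difference D1
# and command the actuator (signal P) to cancel the mounting-induced shift.
import numpy as np

def correction_angle(v11: np.ndarray, v12: np.ndarray) -> float:
    """Pitch correction (rad) that cancels the shift reflected in D1."""
    pitch = lambda g: float(np.arctan2(g[0], g[2]))
    return -(pitch(v12) - pitch(v11))

class Actuator:
    """Stand-in for actuator 48: tilts the radar by a commanded angle."""
    def apply(self, angle_rad: float) -> None:
        print(f"tilting radar by {np.degrees(angle_rad):+.2f} deg")

# Signal P carries the computed correction amount to the actuator.
Actuator().apply(correction_angle(np.array([0.0, 0.0, 9.81]),
                                  np.array([0.17, 0.0, 9.81])))
```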
  • The adjustment work for eliminating the deviation of the detection reference position of each sensor caused by mounting can thus be automated. Therefore, the burden of the work of adjusting the detection reference position of each sensor after the left front sensor system 3LF is mounted on the vehicle 100 can be reduced.
  • the adjustment of the detection reference position of each sensor before the left front sensor system 3LF is mounted on the vehicle 100 can be performed by the manufacturer of the left front sensor system 3LF.
  • the adjustment of the detection reference position of each sensor after the left front sensor system 3LF is mounted on the vehicle 100 can be performed, for example, by a manufacturer that assembles the vehicle 100 using the left front sensor system 3LF as one component. In this case, the work of adjusting the detection reference position of each sensor by the latter can be reduced.
  • the manufacturer of the left front sensor system 3LF is an example of a first entity.
  • the manufacturer that assembles the vehicle 100 is an example of a second entity.
  • a mechanism for adjusting at least one of the position and orientation of each LiDAR sensor and each camera can be omitted. Therefore, an increase in size and weight of the left front sensor system 3LF can be suppressed.
  • the function of the processor 36 may be realized by a control device mounted on the vehicle 100 or may be realized by a processor supported by the housing 11. In the latter case, the processor 36 may be supported on the outer surface of the housing 11 or may be disposed in the lamp chamber 13. In this case, the processing load of the control device mounted on the vehicle 100 can be reduced.
  • FIG. 5 schematically shows the configuration of the left front sensor system 4LF according to the fourth embodiment. Constituent elements that are the same as or equivalent to those of the left front sensor system 3LF according to the third embodiment are given the same reference numerals, and repeated descriptions are omitted.
  • the left front sensor system 4LF includes a first sensor module 64.
  • the first sensor module 64 includes a first LiDAR sensor 41, a first acceleration sensor 42, a first support member 43, a first screw mechanism 44, a first camera 46, a millimeter wave radar 47, an actuator 48, and a first information processing device 49.
  • FIG. 6A shows a functional configuration of the first sensor module 64.
  • the first information processing device 49 includes a first processor 491 and a first memory 492.
  • the first processor 491 may include a CPU, MPU, GPU, and the like.
  • the first processor 491 can include a plurality of processor cores.
  • Examples of the first memory 492 may include a ROM and a RAM.
  • the ROM can store a program for executing the above processing.
  • the program can include an artificial intelligence program.
  • An example of an artificial intelligence program is a neural network trained by deep learning.
  • the first processor 491 can read at least a part of a program stored in the ROM, load it into the RAM, and execute the above-described processing in cooperation with the RAM.
  • At least a part of the functions of the first processor 491 may be realized by at least one hardware resource different from the first processor 491 and the first memory 492. Examples of such hardware resources may include integrated circuits such as ASIC and FPGA.
  • the first information processing device 49 has a single casing or substrate.
  • the first information processing device 49 is supported by the first support member 43 together with the first LiDAR sensor 41, the first acceleration sensor 42, the first camera 46, and the millimeter wave radar 47.
  • the first acceleration sensor 42 is provided in the housing of the first information processing device 49 or on the substrate.
  • the first information processing device 49 is supported by the first support member 43 so that the first acceleration sensor 42 is disposed between the first LiDAR sensor 41 and the first camera 46.
  • the first acceleration sensor 42 may be provided outside the housing or the substrate of the first information processing device 49.
  • the front left sensor system 4LF includes a second sensor module 65.
  • the second sensor module 65 includes a second information processing device 59 in addition to the second LiDAR sensor 51, the second acceleration sensor 52, the second support member 53, the second screw mechanism 54, and the second camera 56.
  • FIG. 6B shows the functional configuration of the second sensor module 65.
  • the second information processing device 59 includes a second processor 591 and a second memory 592.
  • Examples of the second processor 591 include a CPU, MPU, GPU, and the like.
  • the second processor 591 can include a plurality of processor cores.
  • Examples of the second memory 592 include a ROM and a RAM.
  • the ROM can store a program for executing the above processing.
  • the program can include an artificial intelligence program.
  • An example of an artificial intelligence program is a neural network trained by deep learning.
  • the second processor 591 can read at least a part of a program stored in the ROM, load it into the RAM, and execute the above-described processing in cooperation with the RAM.
  • At least a part of the functions of the second processor 591 may be realized by at least one hardware resource different from the second processor 591 and the second memory 592. Examples of such hardware resources may include integrated circuits such as ASIC and FPGA.
  • the second information processing device 59 has a single casing or substrate.
  • the second information processing device 59 is supported by the second support member 53 together with the second LiDAR sensor 51, the second acceleration sensor 52, and the second camera 56.
  • the second acceleration sensor 52 is provided in the housing of the second information processing device 59 or on the substrate.
  • the second information processing device 59 is supported by the second support member 53 so that the second acceleration sensor 52 is disposed between the second LiDAR sensor 51 and the second camera 56.
  • the second acceleration sensor 52 may be provided outside the housing or the substrate of the second information processing device 59.
  • the postures of the first sensor module 64 and the second sensor module 65 with respect to the housing 11 are adjusted. Specifically, the detection reference positions of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 are adjusted by changing the posture of the first support member 43 with respect to the housing 11 using the first screw mechanism 44. Similarly, the detection reference positions of the second LiDAR sensor 51 and the second camera 56 are adjusted by changing the posture of the second support member 53 with respect to the housing 11 using the second screw mechanism 54.
  • One time point before the left front sensor system 4LF is mounted on the vehicle 100 is an example of a first time point.
  • the first acceleration sensor 42 outputs a signal A1 (t1) corresponding to the attitude of the first support member 43 corresponding to the adjustment result of the detection reference position of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47.
  • the signal A1 (t1) corresponds to the output value V11 of the first acceleration sensor 42 before the left front sensor system 4LF is mounted on the vehicle 100.
  • the signal A1 (t1) is input to the first memory 492.
  • the first memory 492 stores the output value V11 of the first acceleration sensor 42 corresponding to the signal A1 (t1).
  • the output value V11 is an example of a first output value.
  • the second acceleration sensor 52 outputs a signal A2 (t1) corresponding to the attitude of the second support member 53 corresponding to the adjustment result of the detection reference position of the second LiDAR sensor 51 and the second camera 56. That is, the signal A2 (t1) corresponds to the output value V21 of the second acceleration sensor 52 before the left front sensor system 4LF is mounted on the vehicle 100.
  • the signal A2 (t1) is input to the second memory 592.
  • the second memory 592 stores the output value V21 of the second acceleration sensor 52 corresponding to the signal A2 (t1).
  • the output value V21 is an example of a first output value.
  • the left front sensor system 4LF is mounted on the vehicle 100.
  • the detection reference position of each sensor may deviate from a desired position due to the tolerance of the body parts or the positional deviation of the left front sensor system 4LF with respect to the vehicle body. Therefore, after the left front sensor system 4LF is mounted on the vehicle 100, the detection reference position of each sensor is readjusted. In other words, at least one of the position and posture of the left front sensor system 4LF with respect to the vehicle body of the vehicle 100 is adjusted.
  • One time point after the left front sensor system 4LF is mounted on the vehicle 100 is an example of a second time point.
  • the first acceleration sensor 42 outputs a signal A1 (t2) corresponding to the posture of the first support member 43 corresponding to the mounting posture of the left front sensor system 4LF with respect to the vehicle body. That is, the signal A1 (t2) corresponds to the output value V12 of the first acceleration sensor 42 after the front left sensor system 4LF is mounted on the vehicle 100.
  • the output value V12 is an example of a second output value.
  • the first processor 491 acquires the output value V12 of the first acceleration sensor 42.
  • the signal A1 (t2) output from the first acceleration sensor 42 may be input to the first processor 491 or may be input to the first memory 492. In the former case, the first processor 491 directly acquires the output value V12. In the latter case, the first processor 491 acquires the output value V12 via the first memory 492.
  • the first processor 491 acquires the difference D1 between the output value V11 and the output value V12.
  • the difference D1 reflects a shift in the detection reference position of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 caused by mounting the left front sensor system 4LF on the vehicle 100.
  • In this embodiment as well, a mechanism for adjusting the postures of the first LiDAR sensor 41 and the first camera 46 is not provided. Therefore, when a deviation of the detection reference positions of the first LiDAR sensor 41 and the first camera 46 is detected, their postures are not changed to eliminate the deviation; instead, the information acquired by the first LiDAR sensor 41 and the first camera 46 is corrected on the signal side.
  • Based on the acquired difference D1, the first processor 491 corrects the signal L1 output from the first LiDAR sensor 41 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the first LiDAR sensor 41.
  • Likewise, based on the acquired difference D1, the first processor 491 corrects the video signal C1 output from the first camera 46 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the first camera 46.
  • the first processor 491 indirectly re-adjusts the detection reference positions of the first LiDAR sensor 41 and the first camera 46 that have been changed by mounting the left front sensor system 4LF on the vehicle 100.
  • the second acceleration sensor 52 outputs a signal A2 (t2) corresponding to the posture of the second support member 53 corresponding to the mounting posture of the left front sensor system 4LF with respect to the vehicle body. That is, the signal A2 (t2) corresponds to the output value V22 of the second acceleration sensor 52 after the left front sensor system 4LF is mounted on the vehicle 100.
  • the output value V22 is an example of a second output value.
  • the second processor 591 acquires the output value V22 of the second acceleration sensor 52.
  • the signal A2 (t2) output from the second acceleration sensor 52 may be input to the second processor 591 or may be input to the second memory 592. In the former case, the second processor 591 directly acquires the output value V22. In the latter case, the second processor 591 acquires the output value V22 via the second memory 592.
  • the second processor 591 acquires the difference D2 between the output value V21 and the output value V22.
  • the difference D2 reflects a shift in the detection reference positions of the second LiDAR sensor 51 and the second camera 56 caused by mounting the left front sensor system 4LF on the vehicle 100.
  • Similarly, a mechanism for adjusting the postures of the second LiDAR sensor 51 and the second camera 56 is not provided. Therefore, when a deviation of the detection reference positions of the second LiDAR sensor 51 and the second camera 56 is detected, their postures are not changed to eliminate the deviation; instead, the information acquired by the second LiDAR sensor 51 and the second camera 56 is corrected on the signal side.
  • Based on the acquired difference D2, the second processor 591 corrects the signal L2 output from the second LiDAR sensor 51 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the second LiDAR sensor 51.
  • Likewise, based on the acquired difference D2, the second processor 591 corrects the video signal C2 output from the second camera 56 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the second camera 56.
  • the second processor 591 indirectly re-adjusts the detection reference positions of the second LiDAR sensor 51 and the second camera 56 that have been changed by mounting the left front sensor system 4LF on the vehicle 100.
  • the difference D1 between the output value V11 and the output value V12 of the first acceleration sensor 42 also reflects a shift in the detection reference position of the millimeter wave radar 47 caused by mounting the left front sensor system 4LF on the vehicle 100.
  • the first processor 491 calculates a correction amount of at least one of the position and orientation of the millimeter wave radar 47 necessary for eliminating the deviation of the detection reference position of the millimeter wave radar 47 based on the acquired difference D1.
  • the first processor 491 outputs a signal P.
  • the signal P is input to the actuator 48.
  • the signal P causes the actuator 48 to perform an operation necessary to adjust at least one of the position and orientation of the millimeter wave radar 47 by the calculated correction amount. Thereby, the readjustment of the detection reference position of the millimeter wave radar 47 changed by mounting the left front sensor system 4LF on the vehicle 100 is completed.
  • the actuator 48 is an example of an adjustment mechanism.
  • The adjustment work for eliminating the deviation of the detection reference position of each sensor caused by mounting can thus be automated. Therefore, the burden of the work of adjusting the detection reference position of each sensor after the left front sensor system 4LF is mounted on the vehicle 100 can be reduced.
  • the adjustment of the detection reference position of each sensor before the left front sensor system 4LF is mounted on the vehicle 100 can be performed by the manufacturer of the left front sensor system 4LF.
  • the adjustment of the detection reference position of each sensor after the left front sensor system 4LF is mounted on the vehicle 100 can be performed by, for example, a manufacturer that assembles the vehicle 100 using the left front sensor system 4LF as one component. In this case, the work of adjusting the detection reference position of each sensor by the latter can be reduced.
  • the manufacturer of the left front sensor system 4LF is an example of a first entity.
  • the manufacturer that assembles the vehicle 100 is an example of a second entity.
  • a mechanism for adjusting at least one of the position and orientation of each LiDAR sensor and each camera can be omitted. Therefore, an increase in size and weight of the left front sensor system 4LF can be suppressed.
  • the first processor 491 and the first memory 492 are supported by the first support member 43, and the second processor 591 and the second memory 592 are supported by the second support member 53. Therefore, regarding the processing performed by the first processor 491 and the first memory 492 and the processing performed by the second processor 591 and the second memory 592, the load on the control device mounted on the vehicle 100 can be reduced.
  • FIG. 7 schematically shows the configuration of the left front sensor system 5LF according to the fifth embodiment. Constituent elements that are the same as or equivalent to those of the left front sensor system 3LF according to the third embodiment are given the same reference numerals, and repeated descriptions are omitted.
  • the left front sensor system 5LF includes a first sensor module 74.
  • the first sensor module 74 includes a first LiDAR sensor 41, a first acceleration sensor 42, a first screw mechanism 44, a first camera 46, and a millimeter wave radar 47.
  • the first sensor module 74 includes a first support member 740.
  • the first support member 740 is a single casing or substrate.
  • the first LiDAR sensor 41, the first acceleration sensor 42, the first camera 46, and the millimeter wave radar 47 are provided in the casing or on the substrate.
  • the first acceleration sensor 42 is disposed between the first LiDAR sensor 41 and the first camera 46.
  • the first screw mechanism 44 is directly or indirectly coupled to the casing or the substrate.
  • FIG. 8A shows a functional configuration of the first sensor module 74.
  • the first sensor module 74 further includes a first processor 741, a first memory 742, a first communication unit 743, and a first power feeding unit 744.
  • the first processor 741, the first memory 742, the first communication unit 743, and the first power feeding unit 744 are provided in the casing or on the substrate serving as the first support member 740.
  • the first processor 741 can include a plurality of processor cores.
  • Examples of the first memory 742 include a ROM and a RAM.
  • the ROM can store a program for executing the above processing.
  • the program can include an artificial intelligence program.
  • An example of an artificial intelligence program is a neural network trained by deep learning.
  • the first processor 741 can read at least a part of a program stored in the ROM, load it into the RAM, and execute the above-described processing in cooperation with the RAM.
  • At least a part of the functions of the first processor 741 may be realized by at least one hardware resource different from the first processor 741 and the first memory 742. Examples of such hardware resources may include integrated circuits such as ASIC and FPGA.
  • the first processor 741 is communicably connected to a control device (not shown) mounted on the vehicle 100 via the first communication unit 743.
  • the first processor 741 is configured to receive a control signal from the control device via the first communication unit 743 and to control the operations of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 based on the control signal.
  • the first power feeding unit 744 is configured to receive power from a power source (not shown) mounted on the vehicle 100 and to supply that power to the first LiDAR sensor 41, the first acceleration sensor 42, the first camera 46, the millimeter wave radar 47, the first processor 741, and the first memory 742.
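The control flow just described — the vehicle's control device commanding the module's sensors through the communication unit — can be pictured with a short sketch. All class and method names here are assumptions for illustration; the patent does not specify a message format.

```python
# Schematic sketch of the module-side control-signal handling. The message
# format ({"target": ..., "action": ...}) is an assumed example.
from dataclasses import dataclass, field

class Lidar:
    def start(self) -> None:
        print("LiDAR scanning started")

@dataclass
class SensorModule:
    # e.g. {"lidar": Lidar(), "camera": ..., "radar": ...}
    sensors: dict = field(default_factory=dict)

    def on_control_signal(self, command: dict) -> None:
        """Dispatch a command received via the communication unit
        to the addressed sensor."""
        sensor = self.sensors[command["target"]]
        getattr(sensor, command["action"])()

module = SensorModule(sensors={"lidar": Lidar()})
module.on_control_signal({"target": "lidar", "action": "start"})
```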
  • the left front sensor system 5LF includes a second sensor module 75.
  • the second sensor module 75 includes a second LiDAR sensor 51, a second acceleration sensor 52, a second screw mechanism 54, and a second camera 56.
  • the second sensor module 75 includes a second support member 750.
  • the second support member 750 is a single casing or substrate.
  • the second LiDAR sensor 51, the second acceleration sensor 52, and the second camera 56 are provided in the casing or on the substrate.
  • the second acceleration sensor 52 is disposed between the second LiDAR sensor 51 and the second camera 56.
  • the second screw mechanism 54 is directly or indirectly coupled to the casing or the substrate.
  • FIG. 8B shows a functional configuration of the second sensor module 75.
  • the second sensor module 75 further includes a second processor 751, a second memory 752, a second communication unit 753, and a second power feeding unit 754.
  • the second processor 751, the second memory 752, the second communication unit 753, and the second power feeding unit 754 are provided in the casing or on the substrate serving as the second support member 750.
  • Examples of the second processor 751 include a CPU, an MPU, and a GPU.
  • the second processor 751 can include a plurality of processor cores.
  • Examples of the second memory 752 include a ROM and a RAM.
  • the ROM can store a program for executing the above processing.
  • the program can include an artificial intelligence program.
  • An example of an artificial intelligence program is a neural network trained by deep learning.
  • the second processor 751 can read at least a part of a program stored in the ROM, load it into the RAM, and execute the above-described processing in cooperation with the RAM.
  • At least a part of the functions of the second processor 751 may be realized by at least one hardware resource different from the second processor 751 and the second memory 752. Examples of such hardware resources may include integrated circuits such as ASIC and FPGA.
  • the second processor 751 is communicably connected to a control device (not shown) mounted on the vehicle 100 via the second communication unit 753.
  • the second processor 751 is configured to receive a control signal from the control device via the second communication unit 753 and to control the operations of the second LiDAR sensor 51 and the second camera 56 based on the control signal.
  • the second power feeding unit 754 is configured to receive power from a power source (not shown) mounted on the vehicle 100 and to supply that power to the second LiDAR sensor 51, the second acceleration sensor 52, the second camera 56, the second processor 751, and the second memory 752.
  • the postures of the first sensor module 74 and the second sensor module 75 with respect to the housing 11 are adjusted. Specifically, the detection reference positions of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 are adjusted by changing the posture of the first support member 740 with respect to the housing 11 using the first screw mechanism 44. Similarly, the detection reference positions of the second LiDAR sensor 51 and the second camera 56 are adjusted by changing the posture of the second support member 750 with respect to the housing 11 using the second screw mechanism 54.
  • One time point before the left front sensor system 5LF is mounted on the vehicle 100 is an example of a first time point.
  • the first acceleration sensor 42 outputs a signal A1 (t1) corresponding to the attitude of the first support member 740 corresponding to the adjustment result of the detection reference position of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47.
  • the signal A1 (t1) corresponds to the output value V11 of the first acceleration sensor 42 before the left front sensor system 5LF is mounted on the vehicle 100.
  • the signal A1 (t1) is input to the first memory 742.
  • the first memory 742 stores the output value V11 of the first acceleration sensor 42 corresponding to the signal A1 (t1).
  • the output value V11 is an example of a first output value.
  • the second acceleration sensor 52 outputs a signal A2 (t1) corresponding to the attitude of the second support member 750 corresponding to the adjustment result of the detection reference position of the second LiDAR sensor 51 and the second camera 56. That is, the signal A2 (t1) corresponds to the output value V21 of the second acceleration sensor 52 before the left front sensor system 5LF is mounted on the vehicle 100.
  • the signal A2 (t1) is input to the second memory 752.
  • the second memory 752 stores the output value V21 of the second acceleration sensor 52 corresponding to the signal A2 (t1).
  • the output value V21 is an example of a first output value.
  • the left front sensor system 5LF is mounted on the vehicle 100.
  • the detection reference position of each sensor may deviate from a desired position due to the tolerance of the vehicle body parts or the positional deviation of the left front sensor system 5LF with respect to the vehicle body. Therefore, after the left front sensor system 5LF is mounted on the vehicle 100, the detection reference position of each sensor is readjusted. In other words, at least one of the position and posture of the left front sensor system 5LF with respect to the vehicle body of the vehicle 100 is adjusted.
  • One point in time after the left front sensor system 5LF is mounted on the vehicle 100 is an example of a second point in time.
  • the first acceleration sensor 42 outputs a signal A1 (t2) corresponding to the posture of the first support member 740 corresponding to the mounting posture of the left front sensor system 5LF with respect to the vehicle body. That is, the signal A1 (t2) corresponds to the output value V12 of the first acceleration sensor 42 after the left front sensor system 5LF is mounted on the vehicle 100.
  • the output value V12 is an example of a second output value.
  • the first processor 741 acquires the output value V12 of the first acceleration sensor 42.
  • the signal A1 (t2) output from the first acceleration sensor 42 may be input to the first processor 741 or may be input to the first memory 742. In the former case, the first processor 741 directly acquires the output value V12. In the latter case, the first processor 741 acquires the output value V12 via the first memory 742.
  • the first processor 741 acquires the difference D1 between the output value V11 and the output value V12.
  • the difference D1 reflects a shift in the detection reference positions of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 caused by mounting the left front sensor system 5LF on the vehicle 100.
  • In this embodiment, a mechanism for adjusting the postures of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 is not provided. Therefore, when a deviation of their detection reference positions is detected, the postures of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 are not changed to eliminate the deviation; instead, the information acquired by them is corrected on the signal side.
  • Based on the acquired difference D1, the first processor 741 corrects the signal L1 output from the first LiDAR sensor 41 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the first LiDAR sensor 41.
  • Likewise, based on the acquired difference D1, the first processor 741 corrects the video signal C1 output from the first camera 46 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the first camera 46.
  • Likewise, based on the acquired difference D1, the first processor 741 corrects the signal M output from the millimeter wave radar 47 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the millimeter wave radar 47.
  • It can therefore be said that the first processor 741 indirectly re-adjusts the detection reference positions of the first LiDAR sensor 41, the first camera 46, and the millimeter wave radar 47 that have been changed by mounting the left front sensor system 5LF on the vehicle 100.
  • the second acceleration sensor 52 outputs a signal A2 (t2) corresponding to the posture of the second support member 750 corresponding to the mounting posture of the left front sensor system 5LF with respect to the vehicle body. That is, the signal A2 (t2) corresponds to the output value V22 of the second acceleration sensor 52 after the left front sensor system 5LF is mounted on the vehicle 100.
  • the output value V22 is an example of a second output value.
  • the second processor 751 acquires the output value V22 of the second acceleration sensor 52.
  • the signal A2 (t2) output from the second acceleration sensor 52 may be input to the second processor 751 or may be input to the second memory 752. In the former case, the second processor 751 directly acquires the output value V22. In the latter case, the second processor 751 acquires the output value V22 via the second memory 752.
  • the second processor 751 acquires the difference D2 between the output value V21 and the output value V22.
  • the difference D2 reflects a shift in the detection reference positions of the second LiDAR sensor 51 and the second camera 56 caused by mounting the left front sensor system 5LF on the vehicle 100.
  • Similarly, a mechanism for adjusting the postures of the second LiDAR sensor 51 and the second camera 56 is not provided. Therefore, when a deviation of the detection reference positions of the second LiDAR sensor 51 and the second camera 56 is detected, their postures are not changed to eliminate the deviation; instead, the information acquired by the second LiDAR sensor 51 and the second camera 56 is corrected on the signal side.
  • Based on the acquired difference D2, the second processor 751 corrects the signal L2 output from the second LiDAR sensor 51 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the second LiDAR sensor 51.
  • Likewise, based on the acquired difference D2, the second processor 751 corrects the video signal C2 output from the second camera 56 so as to obtain the signal that would have been obtained if there were no deviation in the detection reference position of the second camera 56.
  • the second processor 751 indirectly re-adjusts the detection reference positions of the second LiDAR sensor 51 and the second camera 56 that have been changed by mounting the left front sensor system 5LF on the vehicle 100.
  • the adjustment of the detection reference position of each sensor before the front left sensor system 5LF is mounted on the vehicle 100 can be performed by the manufacturer of the front left sensor system 5LF.
  • the adjustment of the detection reference position of each sensor after the left front sensor system 5LF is mounted on the vehicle 100 can be performed, for example, by a manufacturer that assembles the vehicle 100 using the left front sensor system 5LF as one component. In this case, the work of adjusting the detection reference position of each sensor by the latter can be reduced.
  • the manufacturer of the left front sensor system 5LF is an example of a first entity.
  • the manufacturer that assembles the vehicle 100 is an example of a second entity.
  • a mechanism for adjusting at least one of the position and orientation of each sensor can be omitted. Therefore, an increase in size and weight of the left front sensor system 5LF can be suppressed.
  • the first processor 741 and the first memory 742 are supported by the first support member 740, and the second processor 751 and the second memory 752 are supported by the second support member 750. Therefore, regarding the processing performed by the first processor 741 and the first memory 742 and the processing performed by the second processor 751 and the second memory 752, the load on the control device mounted on the vehicle 100 can be reduced.
  • sensors provided in the sensor system include LiDAR sensors, cameras, and millimeter wave radars.
  • an ultrasonic sensor can also be employed as the sensor.
  • the ultrasonic sensor has a configuration for transmitting an ultrasonic wave (several tens of kHz to several GHz) and a configuration for receiving a reflected wave obtained by reflecting the ultrasonic wave on an object existing outside the vehicle 100.
  • the ultrasonic sensor may include a scanning mechanism that sweeps ultrasonic waves by changing a transmission direction (that is, a detection direction) as necessary.
  • the ultrasonic sensor can acquire the distance to the object associated with the reflected wave based on, for example, the time from when the ultrasonic wave is transmitted in a certain direction until the reflected wave is received. Further, by accumulating such distance data in association with the detection position, it is possible to acquire information related to the motion of the object associated with the reflected wave.
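The same time-of-flight principle shown earlier for the millimeter wave radar applies to the ultrasonic sensor, with sound speed in place of the speed of light; the usable delays are correspondingly much longer. An editor's sketch follows; the assumed air temperature is illustrative.

```python
# Editor's sketch of the ultrasonic time-of-flight distance estimate.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degC (assumed operating condition)

def ultrasonic_distance(delay_s: float) -> float:
    """One-way distance from the round-trip delay of the echo."""
    return SPEED_OF_SOUND * delay_s / 2.0

print(ultrasonic_distance(0.01))  # a 10 ms echo delay -> ~1.7 m
```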
  • the ultrasonic sensor is a sensor that detects information outside the vehicle 100.
  • the ultrasonic sensor outputs a signal corresponding to the attribute (intensity, etc.) of the received reflected wave.
  • the above information is acquired by appropriately processing the signal output from the ultrasonic sensor by the information processing unit.
  • the information processing unit may be included in the sensor system or may be mounted on the vehicle 100.
  • the posture of the lamp unit 18 is adjusted by the third screw mechanism 19.
  • the third screw mechanism 19 can be replaced by an appropriate actuator mechanism at least partially housed in the housing 11.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A first LiDAR sensor (41) detects information on the outside of a vehicle. A first support member (43) supports the first LiDAR sensor (41) and a first acceleration sensor (42). A memory (17) stores a first output value from the first acceleration sensor (42) at a first point in time. A processor (16) acquires the difference between the first output value and a second output value of the first acceleration sensor (42) at a second point in time.
PCT/JP2018/009718 2017-03-21 2018-03-13 Module de capteur, système de capteur et procédé d'installation de système de capteur dans un véhicule WO2018173855A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP18770971.2A EP3605136A4 (fr) 2017-03-21 2018-03-13 Module de capteur, système de capteur et procédé d'installation de système de capteur dans un véhicule
JP2019507575A JPWO2018173855A1 (ja) 2017-03-21 2018-03-13 センサモジュール、センサシステム、およびセンサシステムの車両への搭載方法
US16/496,081 US20200039531A1 (en) 2017-03-21 2018-03-13 Sensor module, sensor system, and method of installing sensor system in vehicle
CN201880019683.1A CN110446941A (zh) 2017-03-21 2018-03-13 传感器模块、传感器系统、及向车辆搭载传感器系统的搭载方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-054102 2017-03-21
JP2017054102 2017-03-21

Publications (1)

Publication Number Publication Date
WO2018173855A1 true WO2018173855A1 (fr) 2018-09-27

Family

ID=63586462

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/009718 WO2018173855A1 (fr) 2017-03-21 2018-03-13 Module de capteur, système de capteur et procédé d'installation de système de capteur dans un véhicule

Country Status (5)

Country Link
US (1) US20200039531A1 (fr)
EP (1) EP3605136A4 (fr)
JP (1) JPWO2018173855A1 (fr)
CN (1) CN110446941A (fr)
WO (1) WO2018173855A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020100656A1 (fr) * 2018-11-14 2020-05-22 株式会社小糸製作所 Système de caméra infrarouge, module de caméra infrarouge, et véhicule
WO2020116239A1 (fr) * 2018-12-04 2020-06-11 株式会社小糸製作所 Système de caméra infrarouge et véhicule
JP2021105602A (ja) * 2019-12-27 2021-07-26 パイオニア株式会社 ライダ装置
US11634064B1 (en) 2021-11-02 2023-04-25 Stanley Electric Co., Ltd. Vehicular lamp fitting and radar structure
RU2804063C2 (ru) * 2019-04-30 2023-09-26 Идак Холдингз, Инк. Способы, устройства и системы для усовершенствованной передачи данных по восходящей линии связи по сконфигурированным предоставлениям

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3605031B1 (fr) * 2018-08-02 2021-04-07 VEGA Grieshaber KG Capteur radar destiné à la mesure de niveau de remplissage ou de niveau limite
US11851088B2 (en) * 2020-03-11 2023-12-26 Baidu Usa Llc Method for determining capability boundary and associated risk of a safety redundancy autonomous system in real-time
US11702104B2 (en) * 2020-04-22 2023-07-18 Baidu Usa Llc Systems and methods to determine risk distribution based on sensor coverages of a sensor system for an autonomous driving vehicle
SE544537C2 (en) * 2020-11-10 2022-07-05 Scania Cv Ab Multi-modality corner vehicle sensor module
CN114639262B (zh) * 2020-12-15 2024-02-06 北京万集科技股份有限公司 感知设备的状态检测方法、装置、计算机设备和存储介质
JP7444045B2 (ja) * 2020-12-17 2024-03-06 トヨタ自動車株式会社 音源探査システムおよび音源探査方法
DE102021104596A1 (de) 2021-02-25 2022-08-25 Motherson Innovations Company Limited Außenverkleidungsbauteil eines Kraftfahrzeugs, Kraftfahrzeug mit einem solchen Außenverkleidungsbauteil sowie Computerprogrammprodukt zum Betreiben einer Positionsbestimmungseinrichtung eines derartigen Außenverkleidungsbauteils

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004085258A (ja) * 2002-08-23 2004-03-18 Hitachi Ltd レーダ装置
JP2010519545A (ja) * 2007-02-21 2010-06-03 オートリブ エー・エス・ピー・インク センサのミスアラインメント検知および測定システム
JP2010185769A (ja) 2009-02-12 2010-08-26 Toyota Motor Corp 物体検出装置
JP2010243219A (ja) * 2009-04-01 2010-10-28 Fujitsu Ten Ltd レーダ装置およびレーダ調整方法
JP2013019799A (ja) * 2011-07-12 2013-01-31 Denso Corp 車両用制御装置
US20140333473A1 (en) * 2013-05-13 2014-11-13 Robert Bosch Gmbh Method and device for ascertaining and compensating for a misalignment angle of a radar sensor of a vehicle
JP2017054102A (ja) 2015-09-07 2017-03-16 日亜化学工業株式会社 光学部品及び発光装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001260777A (ja) * 2000-03-21 2001-09-26 Denso Corp 車両用前照灯装置
US7197388B2 (en) * 2003-11-06 2007-03-27 Ford Global Technologies, Llc Roll stability control system for an automotive vehicle using an external environmental sensing system
JP2006242622A (ja) * 2005-03-01 2006-09-14 Matsushita Electric Ind Co Ltd 車載用レーダ装置および車両搭載方法
US20120173185A1 (en) * 2010-12-30 2012-07-05 Caterpillar Inc. Systems and methods for evaluating range sensor calibration data
DE102013222291A1 (de) * 2013-11-04 2015-05-07 Conti Temic Microelectronic Gmbh Verfahren und Vorrichtung zur Schätzung der Einbauwinkel eines in einem Fahrzeug montierten bildgebenden Sensors
CN104648228B (zh) * 2013-11-25 2017-08-11 株式会社小糸制作所 车用灯具的控制装置
EP2902802B1 (fr) * 2014-01-31 2016-10-26 S.M.S. Smart Microwave Sensors GmbH Dispositif de détection
DE102015005570A1 (de) * 2015-04-29 2016-11-03 Audi Ag Verfahren zur Justage und/oder Kalibrierung eines Umgebungssensors, Umgebungssensor und Kraftfahrzeug

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004085258A (ja) * 2002-08-23 2004-03-18 Hitachi Ltd レーダ装置
JP2010519545A (ja) * 2007-02-21 2010-06-03 オートリブ エー・エス・ピー・インク センサのミスアラインメント検知および測定システム
JP2010185769A (ja) 2009-02-12 2010-08-26 Toyota Motor Corp 物体検出装置
JP2010243219A (ja) * 2009-04-01 2010-10-28 Fujitsu Ten Ltd レーダ装置およびレーダ調整方法
JP2013019799A (ja) * 2011-07-12 2013-01-31 Denso Corp 車両用制御装置
US20140333473A1 (en) * 2013-05-13 2014-11-13 Robert Bosch Gmbh Method and device for ascertaining and compensating for a misalignment angle of a radar sensor of a vehicle
JP2017054102A (ja) 2015-09-07 2017-03-16 日亜化学工業株式会社 光学部品及び発光装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3605136A4

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020100656A1 (fr) * 2018-11-14 2020-05-22 株式会社小糸製作所 Système de caméra infrarouge, module de caméra infrarouge, et véhicule
CN111186377A (zh) * 2018-11-14 2020-05-22 株式会社小糸制作所 红外线相机系统、红外线相机模块以及车辆
JPWO2020100656A1 (ja) * 2018-11-14 2021-10-07 株式会社小糸製作所 赤外線カメラシステム、赤外線カメラモジュール及び車両
WO2020116239A1 (fr) * 2018-12-04 2020-06-11 株式会社小糸製作所 Système de caméra infrarouge et véhicule
JPWO2020116239A1 (ja) * 2018-12-04 2021-11-04 株式会社小糸製作所 赤外線カメラシステム及び車両
JP7382344B2 (ja) 2018-12-04 2023-11-16 株式会社小糸製作所 赤外線カメラシステム及び車両
RU2804063C2 (ru) * 2019-04-30 2023-09-26 Идак Холдингз, Инк. Способы, устройства и системы для усовершенствованной передачи данных по восходящей линии связи по сконфигурированным предоставлениям
JP2021105602A (ja) * 2019-12-27 2021-07-26 パイオニア株式会社 ライダ装置
US11634064B1 (en) 2021-11-02 2023-04-25 Stanley Electric Co., Ltd. Vehicular lamp fitting and radar structure

Also Published As

Publication number Publication date
JPWO2018173855A1 (ja) 2020-01-23
EP3605136A4 (fr) 2020-12-16
CN110446941A (zh) 2019-11-12
US20200039531A1 (en) 2020-02-06
EP3605136A1 (fr) 2020-02-05

Similar Documents

Publication Publication Date Title
WO2018173855A1 (fr) Module de capteur, système de capteur et procédé d'installation de système de capteur dans un véhicule
JP7061071B2 (ja) センサシステム、センサモジュール、およびランプ装置
EP3514444A1 (fr) Système de capteur
US20200276930A1 (en) Lighting system and sensor system
US11623558B2 (en) Sensor system
US10120185B2 (en) Image projection apparatus and compensation method
CN110573906A (zh) 距离测量系统
US11248767B2 (en) Sensor system, sensor module, and lamp device
US20190346538A1 (en) Lamp device
US20200236338A1 (en) Sensor system
JP2014063063A (ja) 表示装置
CN111725142A (zh) 用于3d感测应用的集成电子模块以及包括集成电子模块的3d扫描设备
WO2019172117A1 (fr) Système de capteur et dispositif de génération de données d'image
JP7189682B2 (ja) センサシステムおよび検査方法
KR102158025B1 (ko) 카메라 보정모듈, 카메라 시스템 및 카메라 시스템의 제어 방법
KR102664801B1 (ko) 자체 보정 이미저를 갖는 차량 비전 시스템
JP2021096225A (ja) 投光装置、物体検出装置及び移動体
JP2019200649A (ja) センサデータ生成装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18770971

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019507575

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018770971

Country of ref document: EP

Effective date: 20191021