CN209471245U - Sensor system and vehicle - Google Patents

Sensor system and vehicle

Info

Publication number
CN209471245U
CN209471245U (application number CN201821694319.9U)
Authority
CN
China
Prior art keywords
vehicle
lidar
unit
control unit
lidar unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201821694319.9U
Other languages
Chinese (zh)
Inventor
山本修己
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koito Manufacturing Co Ltd
Original Assignee
Koito Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koito Manufacturing Co Ltd
Application granted
Publication of CN209471245U
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)
  • Traffic Control Systems (AREA)

Abstract

The utility model provides a sensor system capable of suppressing the computational load of an electronic control unit while improving the accuracy of surrounding environment information. A lighting system (4a) provided on a vehicle (1) capable of traveling in an automatic driving mode includes: a LiDAR unit (44a) configured to acquire point cloud data representing the surrounding environment of the vehicle (1); and a LiDAR control unit (430a) configured to identify, based on the point cloud data acquired from the LiDAR unit (44a), information on an object present around the vehicle (1). The LiDAR control unit (430a) controls the LiDAR unit (44a) so that the scanning resolution of the LiDAR unit is increased only in a first angle region of the detection region of the LiDAR unit (44a) in which the object is present.

Description

Sensor system and vehicle
Technical field
The utility model relates to a sensor system, and more particularly to a sensor system provided on a vehicle capable of traveling in an automatic driving mode. The utility model also relates to a vehicle, capable of traveling in an automatic driving mode, that includes the sensor system.
Background technique
Research on automatic driving technology is currently being pursued intensively in many countries, and legal frameworks that would allow vehicles (hereinafter, "vehicle" means an automobile) to travel on public roads in an automatic driving mode are under discussion. In the automatic driving mode, a vehicle system automatically controls the traveling of the vehicle. Specifically, the vehicle system automatically performs at least one of steering control (control of the traveling direction of the vehicle), brake control, and acceleration control (control of braking, acceleration, and deceleration of the vehicle) based on information representing the surrounding environment of the vehicle (surrounding environment information) obtained from sensors such as a camera or a radar (for example, a laser radar or a millimeter-wave radar). In the manual driving mode described below, by contrast, the driver controls the traveling of the vehicle, as in most conventional vehicles. Specifically, in the manual driving mode, the traveling of the vehicle is controlled through the driver's operations (steering operation, brake operation, and accelerator operation), and the vehicle system does not automatically perform steering, brake, or accelerator operation. Note that the driving mode of the vehicle is not a concept that exists only for some vehicles, but a concept that exists for all vehicles, including conventional vehicles that have no automatic driving function; it is classified, for example, according to the vehicle control method.
It is therefore expected that, in the future, vehicles traveling in the automatic driving mode (hereinafter referred to as "automatic driving vehicles" where appropriate) and vehicles traveling in the manual driving mode (hereinafter referred to as "manually driven vehicles" where appropriate) will coexist on public roads.
As an example of automatic driving technology, Patent Document 1 discloses an automatic follow-up traveling system in which a following vehicle automatically follows a preceding vehicle. In this system, the preceding vehicle and the following vehicle each include a lighting system; character information for preventing another vehicle from cutting in between the preceding vehicle and the following vehicle is displayed on the lighting system of the preceding vehicle, and character information indicating that the vehicle is in automatic follow-up traveling is displayed on the lighting system of the following vehicle.
Patent Document 1: Japanese Laid-Open Patent Publication No. H9-277887
In the development of automatic driving technology, however, dramatically improving the recognition accuracy of the vehicle's surrounding environment has become an issue. When the surrounding environment of a vehicle is identified using a LiDAR unit, an electronic control unit (ECU) acquires the surrounding environment information of the vehicle (for example, information on an object present around the vehicle) based on point cloud data acquired from the LiDAR unit. To improve the accuracy of the surrounding environment information, increasing the scanning resolution of the LiDAR unit is conceivable; on the other hand, an increase in the scanning resolution of the LiDAR unit causes the computational load of the electronic control unit that processes the point cloud data to increase sharply. There is thus a trade-off between the accuracy of the surrounding environment information and the computational load of the electronic control unit.
Utility model content
An object of the utility model is to provide a sensor system capable of suppressing the computational load of an electronic control unit while improving the accuracy of surrounding environment information. Another object of the utility model is to provide a vehicle, capable of traveling in an automatic driving mode, that includes the sensor system.
Means for solving the technical problem
A sensor system according to one aspect of the present disclosure is provided on a vehicle capable of traveling in an automatic driving mode, and includes: a LiDAR unit configured to acquire point cloud data representing the surrounding environment of the vehicle; and a LiDAR control unit configured to identify, based on the point cloud data acquired from the LiDAR unit, information on an object present around the vehicle.
The LiDAR control unit controls the LiDAR unit so that the scanning resolution of the LiDAR unit is increased only in a first angle region of the detection region of the LiDAR unit in which the object is present.
According to the above configuration, the scanning resolution of the LiDAR unit is increased in the first angle region of the detection region of the LiDAR unit in which an object (for example, a pedestrian) is present. By increasing the scanning resolution of the LiDAR unit in the first angle region while not increasing the scanning resolution in the rest of the detection region, the accuracy of the information on the object can be improved while the computational load of the LiDAR control unit (an electronic control unit) is suppressed. It is therefore possible to provide a sensor system capable of suppressing the computational load of the electronic control unit while improving the accuracy of the surrounding environment information.
In addition, when the attribute of the object cannot be identified based on the point cloud data acquired from the LiDAR unit, the LiDAR control unit may control the LiDAR unit so that the scanning resolution of the LiDAR unit in the first angle region is increased.
According to the above configuration, even when the attribute of the object cannot be identified based on the point cloud data acquired from the LiDAR unit, the attribute of the object can be reliably identified by increasing the scanning resolution of the LiDAR unit in the first angle region in which the object is present.
In addition, the LiDAR control unit may control the LiDAR unit so that the scanning resolution of the LiDAR unit in the first angle region is gradually increased until the attribute of the object can be identified.
According to the above configuration, since the scanning resolution of the LiDAR unit in the first angle region is gradually increased until the attribute of the object can be identified, the attribute of the object can be reliably identified.
In addition, the LiDAR control unit may be configured to update the position of the object based on point cloud data newly acquired from the LiDAR unit, and to update the first angle region based on the updated position of the object.
According to the above configuration, the position of the object is updated based on the point cloud data newly acquired from the LiDAR unit, and the first angle region is then updated based on the updated position of the object. In this way, even when the object is moving, the scanning resolution of the LiDAR unit can be increased in the first angle region in which the moving object is present.
A vehicle capable of traveling in an automatic driving mode and including the sensor system may also be provided.
According to the above, it is possible to provide a vehicle capable of suppressing the computational load of the electronic control unit while improving the accuracy of the surrounding environment information.
Effects of the utility model
According to the utility model, it is possible to provide a sensor system capable of suppressing the computational load of an electronic control unit while improving the accuracy of surrounding environment information.
Brief description of the drawings
Fig. 1 is a top view of a vehicle including a vehicle system according to an embodiment of the utility model (hereinafter referred to as the present embodiment).
Fig. 2 is a block diagram of the vehicle system according to the present embodiment.
Fig. 3 is a diagram showing functional blocks of the control unit of the left front lighting system.
Fig. 4 is a diagram for explaining the detection region of the camera, the detection region of the LiDAR unit, and the detection region of the millimeter-wave radar of the left front lighting system.
Fig. 5 is a flowchart for explaining a control method of the LiDAR unit according to the present embodiment.
Fig. 6 is a diagram showing a situation in which a pedestrian is present in the detection region of the LiDAR unit.
Fig. 7 is a diagram showing the angle region in which the pedestrian is present.
Description of symbols
1: vehicle
2: vehicle system
3: vehicle control unit
4a: left front lighting system (lighting system)
4b: right front lighting system (lighting system)
4c: left rear lighting system (lighting system)
4d: right rear lighting system (lighting system)
5: sensors
10: wireless communication unit
11: storage device
12: steering actuator
13: steering device
14: brake actuator
15: brake device
16: acceleration actuator
17: acceleration device
22a, 22b, 22c, 22d: translucent cover
24a, 24b, 24c, 24d: housing
40a, 40b, 40c, 40d: control unit
42a, 42b, 42c, 42d: lighting unit
43a, 43b, 43c, 43d: camera
44a, 44b, 44c, 44d: LiDAR unit
45a, 45b, 45c, 45d: millimeter-wave radar
410a: lighting control unit
420a: camera control unit
430a: LiDAR control unit
440a: millimeter-wave radar control unit
450a: surrounding environment information fusion unit
Specific embodiment
Hereinafter, an embodiment of the utility model (hereinafter simply referred to as "the present embodiment") will be described with reference to the drawings. For convenience of explanation, the description of components that have already been described and that bear the same reference numerals will be omitted. Also for convenience of illustration, the dimensions of the components shown in the drawings may differ from their actual dimensions.
In the description of the present embodiment, for convenience, the terms "left-right direction", "front-rear direction", and "up-down direction" are used as appropriate. These directions are relative directions set with respect to the vehicle 1 shown in Fig. 1. Here, the "front-rear direction" includes the "front" and "rear" directions, the "left-right direction" includes the "left" and "right" directions, and the "up-down direction" includes the "upper" and "lower" directions. The "horizontal direction", also used where appropriate, means a direction perpendicular to the "up-down direction" and includes the "left-right direction" and the "front-rear direction".
First, the vehicle 1 according to the present embodiment will be described with reference to Fig. 1. Fig. 1 is a schematic top view of the vehicle 1 including a vehicle system 2. As shown in Fig. 1, the vehicle 1 is a vehicle (automobile) capable of traveling in an automatic driving mode and includes the vehicle system 2. The vehicle system 2 includes at least a vehicle control unit 3, a left front lighting system 4a (hereinafter simply referred to as "lighting system 4a"), a right front lighting system 4b (hereinafter simply referred to as "lighting system 4b"), a left rear lighting system 4c (hereinafter simply referred to as "lighting system 4c"), and a right rear lighting system 4d (hereinafter simply referred to as "lighting system 4d").
The lighting system 4a is provided on the left front side of the vehicle 1 and includes a housing 24a disposed on the left front side of the vehicle 1 and a translucent cover 22a attached to the housing 24a. The lighting system 4b is provided on the right front side of the vehicle 1 and includes a housing 24b disposed on the right front side of the vehicle 1 and a translucent cover 22b attached to the housing 24b. The lighting system 4c is provided on the left rear side of the vehicle 1 and includes a housing 24c disposed on the left rear side of the vehicle 1 and a translucent cover 22c attached to the housing 24c. The lighting system 4d is provided on the right rear side of the vehicle 1 and includes a housing 24d disposed on the right rear side of the vehicle 1 and a translucent cover 22d attached to the housing 24d.
Next, the vehicle system 2 shown in Fig. 1 will be described in detail with reference to Fig. 2. Fig. 2 is a block diagram of the vehicle system 2 according to the present embodiment. As shown in Fig. 2, the vehicle system 2 includes the vehicle control unit 3, the lighting systems 4a to 4d, sensors 5, an HMI (Human Machine Interface) 8, a GPS (Global Positioning System) 9, a wireless communication unit 10, and a storage device 11. The vehicle system 2 further includes a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an acceleration actuator 16, and an acceleration device 17. The vehicle system 2 also includes a battery (not shown) configured to supply electric power.
The vehicle control unit 3 is configured to control the traveling of the vehicle 1. The vehicle control unit 3 is composed of, for example, at least one electronic control unit (ECU). The electronic control unit may include at least one microcontroller containing one or more processors and one or more memories, and other electronic circuits containing active elements such as transistors and passive elements. The processor is, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and/or a TPU (Tensor Processing Unit). The CPU may be composed of multiple CPU cores, and the GPU may be composed of multiple GPU cores. The memory includes a ROM (Read Only Memory) and a RAM (Random Access Memory). The ROM may store a vehicle control program. For example, the vehicle control program may include an artificial intelligence (AI) program for automatic driving. The AI program is a program constructed by supervised or unsupervised machine learning using a neural network such as deep learning. The RAM may temporarily store the vehicle control program, vehicle control data, and/or surrounding environment information representing the surrounding environment of the vehicle. The processor may be configured to load a program designated by the vehicle control program stored in the ROM onto the RAM and to execute various processes in cooperation with the RAM.
The electronic control unit (ECU) may also be composed of at least one integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array). Furthermore, the electronic control unit may be composed of a combination of at least one microcontroller and at least one integrated circuit (FPGA or the like).
The lighting system 4a (an example of the sensor system) further includes a control unit 40a, a lighting unit 42a, a camera 43a, a LiDAR (Light Detection and Ranging) unit 44a (an example of a laser radar), and a millimeter-wave radar 45a. As shown in Fig. 1, the control unit 40a, the lighting unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter-wave radar 45a are disposed in a space Sa formed by the housing 24a and the translucent cover 22a. Note that the control unit 40a may be disposed at a predetermined location of the vehicle 1 other than the space Sa; for example, the control unit 40a may be formed integrally with the vehicle control unit 3.
The control unit 40a is composed of, for example, at least one electronic control unit (ECU). The electronic control unit may include at least one microcontroller containing one or more processors and one or more memories, and other electronic circuits (such as transistors). The processor is, for example, a CPU, an MPU, a GPU, and/or a TPU. The CPU may be composed of multiple CPU cores, and the GPU may be composed of multiple GPU cores. The memory includes a ROM and a RAM. The ROM may store a surrounding environment identification program for identifying the surrounding environment of the vehicle 1. For example, the surrounding environment identification program is a program constructed by supervised or unsupervised machine learning using a neural network such as deep learning. The RAM may temporarily store the surrounding environment identification program, image data acquired by the camera 43a, three-dimensional mapping data (point cloud data) acquired by the LiDAR unit 44a, and/or detection data acquired by the millimeter-wave radar 45a. The processor may be configured to load a program designated by the surrounding environment identification program stored in the ROM onto the RAM and to execute various processes in cooperation with the RAM. The electronic control unit (ECU) may also be composed of at least one integrated circuit such as an ASIC or an FPGA, or of a combination of at least one microcontroller and at least one integrated circuit (FPGA or the like).
The lighting unit 42a is configured to form a light distribution pattern by emitting light toward the outside (front) of the vehicle 1. The lighting unit 42a includes a light source that emits light and an optical system. The light source may be composed of, for example, a plurality of light-emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1). The light-emitting element is, for example, an LED (Light Emitting Diode), an LD (Laser Diode), or an organic EL element. The optical system may include at least one of a reflector configured to reflect the light emitted from the light source toward the front of the lighting unit 42a and a lens configured to refract the light emitted directly from the light source or the light reflected by the reflector. When the driving mode of the vehicle 1 is the manual driving mode or the driving assistance mode, the lighting unit 42a is configured to form a driver light distribution pattern (for example, a low-beam light distribution pattern or a high-beam light distribution pattern) in front of the vehicle; in this way, the lighting unit 42a functions as a left headlamp unit. On the other hand, when the driving mode of the vehicle 1 is the advanced driving assistance mode or the fully automatic driving mode, the lighting unit 42a may be configured to form a camera light distribution pattern in front of the vehicle 1.
The control unit 40a may be configured to supply electric signals (for example, PWM (Pulse Width Modulation) signals) individually to each of the plurality of light-emitting elements provided in the lighting unit 42a. In this way, the light-emitting elements to which the electric signal is supplied can be selected individually, and the duty ratio of the electric signal can be adjusted for each light-emitting element. That is, the control unit 40a can select, from the plurality of light-emitting elements arranged in a matrix, the light-emitting elements to be turned on or off, and can determine the luminance of the light-emitting elements that are turned on. The control unit 40a can therefore change the shape and brightness of the light distribution pattern emitted forward from the lighting unit 42a.
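Purely as an illustration (not part of this disclosure), such per-element duty-ratio control might look as follows in code; the MatrixLamp interface and the matrix size are assumptions.

```python
# Illustrative only: per-element PWM control of an N x M light-emitting
# element matrix. The MatrixLamp interface is hypothetical.

N_ROWS, M_COLS = 4, 8  # assumed matrix size (N > 1, M > 1)

class MatrixLamp:
    def __init__(self, rows: int, cols: int) -> None:
        # PWM duty ratio per element, 0.0 (off) .. 1.0 (fully on).
        self.duty = [[0.0] * cols for _ in range(rows)]

    def set_duty(self, row: int, col: int, duty: float) -> None:
        # Selecting an element and adjusting its duty ratio sets its luminance.
        self.duty[row][col] = max(0.0, min(1.0, duty))

# Changing which elements are lit, and how bright each is, changes the shape
# and brightness of the emitted light distribution pattern.
lamp = MatrixLamp(N_ROWS, M_COLS)
for r in range(N_ROWS):
    for c in range(M_COLS):
        lamp.set_duty(r, c, 0.5 if 2 <= c <= 5 else 0.0)  # arbitrary pattern
```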
The camera 43a is configured to detect the surrounding environment of the vehicle 1. Specifically, the camera 43a acquires image data representing the surrounding environment of the vehicle 1 and then transmits the image data to the control unit 40a. The control unit 40a identifies surrounding environment information based on the transmitted image data. Here, the surrounding environment information may include information on an object present outside the vehicle 1; for example, it may include information on the attribute of the object and information on the distance and position of the object relative to the vehicle 1. The camera 43a is composed of an imaging element such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor. The camera 43a may be configured as a monocular camera or as a stereo camera. When the camera 43a is a stereo camera, the control unit 40a can identify the distance between the vehicle 1 and an object (for example, a pedestrian) present outside the vehicle 1 by using the parallax between two or more image data acquired by the stereo camera. In the present embodiment, one camera 43a is provided in the lighting system 4a, but two or more cameras 43a may be provided.
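For reference, a minimal sketch of the parallax-based distance identification, assuming a calibrated stereo pair with known focal length and baseline; the numeric values are illustrative only.

```python
def stereo_distance(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance from stereo parallax: Z = f * B / d (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("invalid disparity (object at infinity?)")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 1200 px focal length, 0.12 m baseline, and an 18 px
# disparity for the same pedestrian in the left and right images.
z = stereo_distance(1200.0, 0.12, 18.0)  # -> 8.0 m
```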
The LiDAR unit 44a is configured to detect the surrounding environment of the vehicle 1. Specifically, the LiDAR unit 44a acquires point cloud data (3D mapping data) representing the surrounding environment of the vehicle 1 and then transmits the point cloud data to the control unit 40a. The control unit 40a identifies surrounding environment information based on the transmitted point cloud data. Here, the surrounding environment information may include information on an object present outside the vehicle 1; for example, it may include information on the attribute of the object, information on the distance and position of the object relative to the vehicle 1, and information on the moving direction of the object.
More specifically, the LiDAR unit 44a acquires information on the time of flight (TOF) ΔT1 of the laser light (light pulse) at each emission angle (horizontal angle θ, vertical angle φ), and can then acquire, based on the information on the time of flight ΔT1, information on the distance D between the LiDAR unit 44a (the vehicle 1) and an object present outside the vehicle 1 at each emission angle (horizontal angle θ, vertical angle φ). Here, the time of flight ΔT1 can be calculated, for example, as follows:
Time of flight ΔT1 = time t1 at which the laser light (light pulse) returns to the LiDAR unit − time t0 at which the LiDAR unit emits the laser light (light pulse)
In this way, the LiDAR unit 44a can acquire the point cloud data (3D mapping data) representing the surrounding environment of the vehicle 1.
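Since the pulse travels the round trip, the distance at each emission angle follows as D = c × ΔT1 / 2. A minimal sketch under an assumed axis convention (the disclosure does not fix one):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_to_distance(t0: float, t1: float) -> float:
    """D = c * (t1 - t0) / 2; the pulse covers the distance twice."""
    return C * (t1 - t0) / 2.0

def echo_to_point(theta: float, phi: float, t0: float, t1: float):
    """One echo at emission angles (theta: horizontal, phi: vertical) [rad]
    converted to an (x, y, z) point; the axis convention is an assumption."""
    d = tof_to_distance(t0, t1)
    x = d * math.cos(phi) * math.cos(theta)
    y = d * math.cos(phi) * math.sin(theta)
    z = d * math.sin(phi)
    return (x, y, z)

# Sweeping theta and phi over the detection region at the angular pitches
# and collecting the points yields one frame of point cloud data.
```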
The LiDAR unit 44a includes, for example, a laser light source configured to emit laser light, an optical system, such as an optical deflector and a lens, configured to scan the laser light in the horizontal direction and the vertical direction, and a light receiving unit configured to receive the laser light reflected by an object. The central wavelength of the laser light emitted from the laser light source is not particularly limited; for example, the laser light may be invisible light having a central wavelength of around 900 nm. The optical deflector may be, for example, a MEMS (Micro Electro Mechanical Systems) mirror or a polygon mirror. The light receiving unit is, for example, a photodiode. Note that the LiDAR unit 44a may also acquire the point cloud data without scanning the laser light with the optical deflector; for example, the LiDAR unit 44a may acquire the point cloud data by a phased-array method or a flash method. In the present embodiment, one LiDAR unit 44a is provided in the lighting system 4a, but two or more LiDAR units 44a may be provided. For example, when two LiDAR units 44a are provided in the lighting system 4a, one LiDAR unit 44a may be configured to detect the surrounding environment in the front region of the vehicle 1 while the other LiDAR unit 44a is configured to detect the surrounding environment in the side region of the vehicle 1.
The LiDAR unit 44a may scan the laser light at a predetermined angular pitch Δθ in the horizontal direction (a predetermined scanning resolution in the horizontal direction) and at a predetermined angular pitch Δφ in the up-down direction (a predetermined scanning resolution in the up-down direction). As described later, the LiDAR unit 44a can reduce the angular pitch (increase the scanning resolution) in a predetermined angle region in which an object is present. Note that, although the description of the present embodiment assumes that the "horizontal direction" and "up-down direction" of the LiDAR unit 44a coincide with the "horizontal direction" and "up-down direction" of the vehicle 1, they do not necessarily have to coincide.
The millimeter-wave radar 45a is configured to detect the surrounding environment of the vehicle 1. Specifically, the millimeter-wave radar 45a acquires detection data representing the surrounding environment of the vehicle 1 and then transmits the detection data to the control unit 40a. The control unit 40a identifies surrounding environment information based on the transmitted detection data. Here, the surrounding environment information may include information on an object present outside the vehicle 1; for example, it may include information on the attribute of the object, information on the position of the object relative to the vehicle 1, and information on the speed of the object relative to the vehicle 1.
For example, the millimeter-wave radar 45a can acquire the distance D between the millimeter-wave radar 45a (the vehicle 1) and an object present outside the vehicle 1 by a pulse modulation method, an FM-CW (Frequency Modulated Continuous Wave) method, or a dual-frequency CW method. When the pulse modulation method is used, the millimeter-wave radar 45a acquires information on the time of flight ΔT2 of the millimeter wave at each emission angle, and can then acquire, based on the information on the time of flight ΔT2, information on the distance D between the millimeter-wave radar 45a (the vehicle 1) and an object present outside the vehicle 1 at each emission angle. Here, the time of flight ΔT2 can be calculated, for example, as follows:
Time of flight ΔT2 = time t3 at which the millimeter wave returns to the millimeter-wave radar − time t2 at which the millimeter-wave radar emits the millimeter wave
In addition, the millimeter-wave radar 45a can acquire information on the relative velocity V of an object present outside the vehicle 1 with respect to the millimeter-wave radar 45a (the vehicle 1) based on the frequency f0 of the millimeter wave emitted from the millimeter-wave radar 45a and the frequency f1 of the millimeter wave returning to the millimeter-wave radar 45a.
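Assuming the usual radar Doppler approximation f1 − f0 ≈ 2 · V · f0 / c for V much smaller than c (the formula itself is not given here), the two measurements can be sketched as:

```python
C = 299_792_458.0  # speed of light [m/s]

def radar_distance(t2: float, t3: float) -> float:
    """Pulse-modulation method: D = c * (t3 - t2) / 2."""
    return C * (t3 - t2) / 2.0

def radar_relative_velocity(f0: float, f1: float) -> float:
    """Doppler shift: f1 - f0 = 2 * V * f0 / c, so V = c * (f1 - f0) / (2 * f0).
    Positive V means the object is approaching."""
    return C * (f1 - f0) / (2.0 * f0)

# Illustrative numbers: a 76.5 GHz radar observing a +2.55 kHz shift.
v = radar_relative_velocity(76.5e9, 76.5e9 + 2.55e3)  # ~5 m/s, approaching
```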
In the present embodiment, one millimeter-wave radar 45a is provided in the lighting system 4a, but two or more millimeter-wave radars 45a may be provided. For example, the lighting system 4a may include a short-range millimeter-wave radar 45a, a medium-range millimeter-wave radar 45a, and a long-range millimeter-wave radar 45a.
The lighting system 4b further includes a control unit 40b, a lighting unit 42b, a camera 43b, a LiDAR unit 44b, and a millimeter-wave radar 45b. As shown in Fig. 1, the control unit 40b, the lighting unit 42b, the camera 43b, the LiDAR unit 44b, and the millimeter-wave radar 45b are disposed in a space Sb formed by the housing 24b and the translucent cover 22b. Note that the control unit 40b may be disposed at a predetermined location of the vehicle 1 other than the space Sb; for example, the control unit 40b may be formed integrally with the vehicle control unit 3. The control unit 40b may have the same function and configuration as the control unit 40a. The lighting unit 42b may have the same function and configuration as the lighting unit 42a; in this case, the lighting unit 42a functions as a left headlamp unit while the lighting unit 42b functions as a right headlamp unit. The camera 43b may have the same function and configuration as the camera 43a. The LiDAR unit 44b may have the same function and configuration as the LiDAR unit 44a. The millimeter-wave radar 45b may have the same function and configuration as the millimeter-wave radar 45a.
The lighting system 4c further includes a control unit 40c, a lighting unit 42c, a camera 43c, a LiDAR unit 44c, and a millimeter-wave radar 45c. As shown in Fig. 1, the control unit 40c, the lighting unit 42c, the camera 43c, the LiDAR unit 44c, and the millimeter-wave radar 45c are disposed in a space Sc (a lamp chamber) formed by the housing 24c and the translucent cover 22c. Note that the control unit 40c may be disposed at a predetermined location of the vehicle 1 other than the space Sc; for example, the control unit 40c may be formed integrally with the vehicle control unit 3. The control unit 40c may have the same function and configuration as the control unit 40a.
The lighting unit 42c is configured to form a light distribution pattern by emitting light toward the outside (rear) of the vehicle 1. The lighting unit 42c includes a light source that emits light and an optical system. The light source may be composed of, for example, a plurality of light-emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1). The light-emitting element is, for example, an LED, an LD, or an organic EL element. The optical system may include at least one of a reflector configured to reflect the light emitted from the light source toward the front of the lighting unit 42c and a lens configured to refract the light emitted directly from the light source or the light reflected by the reflector. When the driving mode of the vehicle 1 is the manual driving mode or the driving assistance mode, the lighting unit 42c may be turned off. On the other hand, when the driving mode of the vehicle 1 is the advanced driving assistance mode or the fully automatic driving mode, the lighting unit 42c may be configured to form a camera light distribution pattern behind the vehicle 1.
The camera 43c may have the same function and configuration as the camera 43a. The LiDAR unit 44c may have the same function and configuration as the LiDAR unit 44a. The millimeter-wave radar 45c may have the same function and configuration as the millimeter-wave radar 45a.
The lighting system 4d further includes a control unit 40d, a lighting unit 42d, a camera 43d, a LiDAR unit 44d, and a millimeter-wave radar 45d. As shown in Fig. 1, the control unit 40d, the lighting unit 42d, the camera 43d, the LiDAR unit 44d, and the millimeter-wave radar 45d are disposed in a space Sd (a lamp chamber) formed by the housing 24d and the translucent cover 22d. Note that the control unit 40d may be disposed at a predetermined location of the vehicle 1 other than the space Sd; for example, the control unit 40d may be formed integrally with the vehicle control unit 3. The control unit 40d may have the same function and configuration as the control unit 40c. The lighting unit 42d may have the same function and configuration as the lighting unit 42c. The camera 43d may have the same function and configuration as the camera 43c. The LiDAR unit 44d may have the same function and configuration as the LiDAR unit 44c. The millimeter-wave radar 45d may have the same function and configuration as the millimeter-wave radar 45c.
The sensors 5 may include an acceleration sensor, a speed sensor, a gyro sensor, and the like. The sensors 5 are configured to detect the traveling state of the vehicle 1 and to output traveling state information representing the traveling state of the vehicle 1 to the vehicle control unit 3. The sensors 5 may further include a seating sensor that detects whether the driver is seated in the driver's seat, a face orientation sensor that detects the direction of the driver's face, an external weather sensor that detects the external weather condition, a human detection sensor that detects whether there is a person in the vehicle, and the like. Furthermore, the sensors 5 may include an illuminance sensor configured to detect the brightness (luminous intensity or the like) of the surrounding environment of the vehicle 1. The illuminance sensor may determine the brightness of the surrounding environment, for example, from the magnitude of the photocurrent output from a photodiode.
The HMI (Human Machine Interface) 8 is composed of an input unit that receives input operations from the driver and an output unit that outputs traveling state information and the like to the driver. The input unit includes a steering wheel, an accelerator pedal, a brake pedal, a driving mode changeover switch for switching the driving mode of the vehicle 1, and the like. The output unit includes a display configured to display the traveling state information, the surrounding environment information, the lighting state of the lighting system 4, and the like.
The GPS (Global Positioning System) 9 is configured to acquire current position information of the vehicle 1 and to output the acquired current position information to the vehicle control unit 3. The wireless communication unit 10 is configured to receive information on other vehicles around the vehicle 1 (for example, other-vehicle traveling information) from the other vehicles and to transmit information on the vehicle 1 (for example, own-vehicle traveling information) to the other vehicles (vehicle-to-vehicle communication). The wireless communication unit 10 is also configured to receive infrastructure information from infrastructure equipment such as traffic lights and sign lamps and to transmit the own-vehicle traveling information of the vehicle 1 to the infrastructure equipment (road-to-vehicle communication). The wireless communication unit 10 is further configured to receive information on a pedestrian from a portable electronic device (a smartphone, a tablet, a wearable device, or the like) carried by the pedestrian and to transmit the own-vehicle traveling information of the vehicle 1 to the portable electronic device (pedestrian-to-vehicle communication). The vehicle 1 may communicate with other vehicles, infrastructure equipment, or portable electronic devices directly in an ad-hoc mode, or via an access point. The wireless communication standard is, for example, Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), LPWA, or Li-Fi. The vehicle 1 may also communicate with other vehicles, infrastructure equipment, or portable electronic devices using the fifth-generation mobile communication system (5G).
The storage device 11 is an external storage device such as a hard disk drive (HDD) or an SSD (Solid State Drive). The storage device 11 may store 2D or 3D map information and/or the vehicle control program. For example, the 3D map information may be composed of point cloud data. The storage device 11 is configured to output the map information or the vehicle control program to the vehicle control unit 3 in response to a request from the vehicle control unit 3. The map information and the vehicle control program may be updated via the wireless communication unit 10 and a communication network such as the Internet.
When the vehicle 1 travels in the automatic driving mode, the vehicle control unit 3 automatically generates at least one of a steering control signal, an acceleration control signal, and a brake control signal based on the traveling state information, the surrounding environment information, the current position information, and/or the map information. The steering actuator 12 is configured to receive the steering control signal from the vehicle control unit 3 and to control the steering device 13 based on the received steering control signal. The brake actuator 14 is configured to receive the brake control signal from the vehicle control unit 3 and to control the brake device 15 based on the received brake control signal. The acceleration actuator 16 is configured to receive the acceleration control signal from the vehicle control unit 3 and to control the acceleration device 17 based on the received acceleration control signal. In this way, in the automatic driving mode, the traveling of the vehicle 1 is automatically controlled by the vehicle system 2.
On the other hand, when the vehicle 1 travels in the manual driving mode, the vehicle control unit 3 generates the steering control signal, the acceleration control signal, and the brake control signal according to the driver's manual operations on the accelerator pedal, the brake pedal, and the steering wheel. In this way, in the manual driving mode, the steering control signal, the acceleration control signal, and the brake control signal are generated by the driver's manual operations, and the traveling of the vehicle 1 is therefore controlled by the driver.
Next, the driving modes of the vehicle 1 will be described. The driving modes consist of the automatic driving mode and the manual driving mode. The automatic driving mode consists of a fully automatic driving mode, an advanced driving assistance mode, and a driving assistance mode. In the fully automatic driving mode, the vehicle system 2 automatically performs all of the traveling control of steering control, brake control, and acceleration control, and the driver is not in a state of being able to drive the vehicle 1. In the advanced driving assistance mode, the vehicle system 2 automatically performs all of the traveling control of steering control, brake control, and acceleration control, and the driver, although in a state of being able to drive the vehicle 1, does not drive the vehicle 1. In the driving assistance mode, the vehicle system 2 automatically performs part of the traveling control of steering control, brake control, and acceleration control, and the driver drives the vehicle 1 with the driving assistance of the vehicle system 2. On the other hand, in the manual driving mode, the vehicle system 2 does not automatically perform traveling control, and the driver drives the vehicle 1 without the driving assistance of the vehicle system 2.
The driving mode of the vehicle 1 may be switched by operating the driving mode changeover switch. In this case, the vehicle control unit 3 switches the driving mode of the vehicle 1 among the four driving modes (fully automatic driving mode, advanced driving assistance mode, driving assistance mode, and manual driving mode) according to the driver's operation on the driving mode changeover switch. The driving mode of the vehicle 1 may also be switched automatically based on information on travel-permitted sections where automatic driving vehicles are allowed to travel and travel-prohibited sections where automatic driving vehicles are prohibited from traveling, or based on information on the external weather condition. In this case, the vehicle control unit 3 switches the driving mode of the vehicle 1 based on such information. Furthermore, the driving mode of the vehicle 1 may be switched automatically using the seating sensor, the face orientation sensor, or the like. In this case, the vehicle control unit 3 may switch the driving mode of the vehicle 1 based on output signals from the seating sensor or the face orientation sensor.
Next, the functions of the control unit 40a will be described with reference to Fig. 3. Fig. 3 is a diagram showing functional blocks of the control unit 40a of the lighting system 4a. As shown in Fig. 3, the control unit 40a is configured to control the operations of the lighting unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter-wave radar 45a. In particular, the control unit 40a includes a lighting control unit 410a, a camera control unit 420a, a LiDAR control unit 430a, a millimeter-wave radar control unit 440a, and a surrounding environment information fusion unit 450a.
The lighting control unit 410a is configured to control the lighting unit 42a so that the lighting unit 42a emits a predetermined light distribution pattern toward the front region of the vehicle 1. For example, the lighting control unit 410a may change the light distribution pattern emitted from the lighting unit 42a according to the driving mode of the vehicle 1.
The camera control unit 420a is configured to control the operation of the camera 43a and to generate surrounding environment information of the vehicle 1 in the detection region S1 (see Fig. 4) of the camera 43a (hereinafter referred to as surrounding environment information I1) based on the image data output from the camera 43a. The LiDAR control unit 430a is configured to control the operation of the LiDAR unit 44a and to generate surrounding environment information of the vehicle 1 in the detection region S2 (see Fig. 4) of the LiDAR unit 44a (hereinafter referred to as surrounding environment information I2) based on the point cloud data output from the LiDAR unit 44a. The millimeter-wave radar control unit 440a is configured to control the operation of the millimeter-wave radar 45a and to generate surrounding environment information of the vehicle 1 in the detection region S3 (see Fig. 4) of the millimeter-wave radar 45a (hereinafter referred to as surrounding environment information I3) based on the detection data output from the millimeter-wave radar 45a.
The surrounding environment information fusion unit 450a is configured to generate fused surrounding environment information If by fusing the surrounding environment information I1, I2, and I3. Here, as shown in Fig. 4, the surrounding environment information If may include information on objects present outside the vehicle 1 in a detection region Sf obtained by combining the detection region S1 of the camera 43a, the detection region S2 of the LiDAR unit 44a, and the detection region S3 of the millimeter-wave radar 45a. For example, the surrounding environment information If may include information on the attribute of an object, the position of the object relative to the vehicle 1, the distance between the vehicle 1 and the object, the speed of the object relative to the vehicle 1, and/or the moving direction of the object. The surrounding environment information fusion unit 450a is configured to transmit the surrounding environment information If to the vehicle control unit 3.
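Such fusion might be sketched as follows; the record layout and the merging rule (nearest-neighbor gating, keeping whichever sensor identified an attribute) are illustrative assumptions rather than a method specified here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    x: float                  # position relative to vehicle 1 [m]
    y: float
    distance: float           # [m]
    attribute: Optional[str]  # e.g. "pedestrian", "vehicle", None if unknown
    speed: Optional[float] = None  # relative speed [m/s], radar only

def fuse(i1: list, i2: list, i3: list, gate_m: float = 1.0) -> list:
    """Fuse camera (I1), LiDAR (I2), and radar (I3) detections into If by
    grouping detections closer than gate_m and keeping the best-known fields."""
    fused: list = []
    for det in i1 + i2 + i3:
        for f in fused:
            if abs(f.x - det.x) < gate_m and abs(f.y - det.y) < gate_m:
                f.attribute = f.attribute or det.attribute
                f.speed = f.speed if f.speed is not None else det.speed
                break
        else:
            fused.append(Detection(det.x, det.y, det.distance,
                                   det.attribute, det.speed))
    return fused
```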
Next, a control method of the LiDAR unit 44a according to the present embodiment (that is, processing for increasing the scanning resolution of the LiDAR unit 44a) will be described with reference to Figs. 5 to 7. Fig. 5 is a flowchart for explaining the control method of the LiDAR unit 44a according to the present embodiment. Fig. 6 is a diagram showing a situation in which a pedestrian (an example of an object) is present in the detection region S2 of the LiDAR unit 44a. Fig. 7 is a diagram showing the angle region Sa in which the pedestrian P is present. Note that, in Figs. 6 and 7, the detection regions of the sensors other than the LiDAR unit 44a are omitted for convenience of explanation. Although only the control method of the LiDAR unit 44a is described in the present embodiment, the control method of the LiDAR unit 44a is also applicable to the LiDAR units 44b to 44d; that is, the control units 40b to 40d may control the LiDAR units 44b to 44d by the same method as the control method of the LiDAR unit 44a.
As shown in Fig. 5, in step S1, the LiDAR control unit 430a determines, based on the point cloud data acquired from the LiDAR unit 44a, whether an object (for example, a pedestrian or another vehicle) is present in the surrounding region of the vehicle 1 (specifically, in the detection region S2 of the LiDAR unit 44a). The LiDAR unit 44a scans the laser light at the predetermined angular pitch Δθ in the horizontal direction of the vehicle 1 and at the predetermined angular pitch Δφ in the up-down direction of the vehicle 1, and generates the point cloud data by scanning the laser light at these predetermined angular pitches Δθ and Δφ. The smaller the predetermined angular pitch, the higher the spatial resolution of the point cloud data.
When the determination result of step S1 is YES, the LiDAR control unit 430a executes the processing of step S2. On the other hand, when the determination result of step S1 is NO, the LiDAR control unit 430a waits until the determination result of step S1 becomes YES. Note that, instead of the LiDAR control unit 430a, the vehicle control unit 3 may determine whether an object is present in the surrounding region of the vehicle 1 based on the surrounding environment information If transmitted from the control unit 40a.
Next, in step S2, the LiDAR control unit 430a determines, based on the point cloud data, whether the attribute of the object present in the surrounding region of the vehicle 1 can be identified. For example, when the object is a pedestrian (or a bicycle), the attribute of the object is a pedestrian (or a bicycle). When the object is another vehicle, the attribute of the object is a vehicle. In the present embodiment, as shown in Fig. 6, a pedestrian is present in the detection region of the LiDAR unit 44a, so the attribute of the object is a pedestrian. When the determination result of step S2 is YES, the series of processes shown in Fig. 5 ends. On the other hand, when the LiDAR control unit 430a cannot identify the attribute of the object (NO in step S2), the processing of step S3 is executed. Note that, instead of the LiDAR control unit 430a, the vehicle control unit 3 may determine whether the attribute of the object can be identified based on the surrounding environment information If.
Next, the LiDAR control unit 430a identifies the position of the pedestrian P (the object) based on the point cloud data (step S3). Here, the position of the pedestrian P may be a relative position (coordinates) of the pedestrian P with respect to the vehicle 1, or a position (coordinates) of the pedestrian P in geographic space. Note that, instead of or in addition to the position information of the pedestrian P, the LiDAR control unit 430a may identify information on the distance between the vehicle 1 and the pedestrian P and information on the angle of the pedestrian P with respect to the vehicle 1. Furthermore, instead of the LiDAR control unit 430a, the vehicle control unit 3 may identify the position of the pedestrian P based on the surrounding environment information If.
Next, in step S4, the LiDAR control unit 430a increases the scanning resolution of the LiDAR unit 44a only in the angle region Sa (see Fig. 7) in which the pedestrian P (the object) is present. Specifically, the LiDAR control unit 430a first determines the angle region Sa (an example of the first angle region) based on the position information of the pedestrian P. The angle region Sa is an angle region that covers the whole of the pedestrian P. For example, when the angular range of the region occupied by the pedestrian P in the horizontal direction of the vehicle 1 is Δθ1, the angular range of the angle region Sa in the horizontal direction of the vehicle 1 is Δθ1 + Δα (Δα > 0). The angle Δα may be, for example, in the range 0 < Δα < Δθ1; in this case, the angular range of the angle region Sa is larger than Δθ1 and smaller than 2Δθ1.
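The determination of the angle region Sa might be sketched as follows, assuming the position and horizontal extent of the pedestrian P are known in vehicle coordinates; the margin follows the Δθ1 + Δα example above.

```python
import math

def angle_region_sa(px: float, py: float, half_width_m: float,
                    margin_ratio: float = 0.5):
    """Return (theta_min, theta_max) [rad] of the horizontal angle region Sa.

    (px, py): object position relative to the LiDAR origin [m];
    half_width_m: half of the object's horizontal extent [m];
    margin_ratio: assumed ratio delta_alpha / delta_theta1 (0..1), which
    keeps the span of Sa between delta_theta1 and 2 * delta_theta1.
    """
    center = math.atan2(py, px)                      # bearing of the object
    dist = math.hypot(px, py)
    d_theta1 = 2.0 * math.atan2(half_width_m, dist)  # angular width of object
    half_span = (1.0 + margin_ratio) * d_theta1 / 2.0  # (d_theta1 + d_alpha)/2
    return (center - half_span, center + half_span)
```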
Next, the LiDAR control unit 430a controls the LiDAR unit 44a so that the scanning resolution of the LiDAR unit 44a in the angle region Sa is increased. For example, when the angular pitch Δθ in the horizontal direction in the detection region S2 is 0.5°, the LiDAR control unit 430a may control the LiDAR unit 44a so that the angular pitch in the horizontal direction in the angle region Sa becomes 0.1°. In this way, the LiDAR control unit 430a can increase the scanning resolution of the LiDAR unit 44a in the horizontal direction in the angle region Sa. Likewise, the LiDAR control unit 430a may increase the scanning resolution of the LiDAR unit 44a in the up-down direction in the angle region Sa: for example, when the angular pitch Δφ in the up-down direction in the detection region S2 is 3°, the LiDAR control unit 430a may control the LiDAR unit 44a so that the angular pitch in the up-down direction in the angle region Sa becomes 1°.
Thereafter, with the scanning resolution of the LiDAR unit 44a increased only in the angle region Sa, the LiDAR unit 44a newly acquires point cloud data (the next frame of the point cloud data) representing the surrounding environment of the vehicle 1. In the point cloud data thus newly acquired, the spatial resolution in the angle region Sa is higher than the spatial resolution in the part of the detection region S2 outside the angle region Sa. Information on the object (the pedestrian P) present in the angle region Sa, in particular its attribute information, can therefore be acquired with higher accuracy.
Next, the LiDAR control unit 430a determines, based on the point cloud data newly acquired from the LiDAR unit 44a, whether the attribute of the object can be identified (step S5). When the attribute of the object can be identified based on the point cloud data (that is, when the determination result is YES), the series of processes shown in Fig. 5 ends. On the other hand, when the attribute of the object cannot be identified based on the point cloud data (that is, when the determination result is NO), the processing of steps S3 and S4 is executed again.
Specifically, in step S3, the LiDAR control unit 430a updates the position of the pedestrian P (the object) based on the point cloud data newly acquired from the LiDAR unit 44a. Thereafter, the LiDAR control unit 430a updates the angle region Sa based on the updated position information of the pedestrian P, and then further increases the scanning resolution of the LiDAR unit 44a only in the angle region Sa.
For example, when the angular pitch in the horizontal direction in the angle region Sa at that point in time is 0.1°, the LiDAR control unit 430a may control the LiDAR unit 44a so that the angular pitch in the horizontal direction in the angle region Sa becomes 0.05°. In this way, the LiDAR control unit 430a can gradually increase the scanning resolution of the LiDAR unit 44a in the horizontal direction, and may likewise gradually increase the scanning resolution in the up-down direction. Thereafter, with the scanning resolution of the LiDAR unit 44a in the angle region Sa further increased, the LiDAR unit 44a newly acquires point cloud data representing the surrounding environment of the vehicle 1, and the LiDAR control unit 430a determines, based on the newly acquired point cloud data, whether the attribute of the object can be identified. When the determination result of step S5 is NO, the processing of steps S3 and S4 is executed again.
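Putting the flow of Fig. 5 together, a sketch of the step S1 to S5 loop; the lidar, classify, locate, and region_from interfaces are hypothetical stand-ins, and only the control flow follows the flowchart:

```python
# Illustrative sketch of the Fig. 5 loop (steps S1 to S5). Every interface
# below is a hypothetical stand-in.

# Angular-pitch schedule [deg] for the angle region Sa; the first entries
# follow the example values above (horizontal 0.5 -> 0.1 -> 0.05, vertical
# 3 -> 1), the later ones are an assumed continuation.
SCHEDULE = [(0.1, 1.0), (0.05, 0.5), (0.025, 0.25)]

def control_lidar(lidar, classify, locate, region_from):
    while True:
        frame = lidar.acquire()                    # point cloud data
        if not frame.has_object():                 # S1: object in region S2?
            continue                               # NO -> wait for next frame
        attr = classify(frame)                     # S2: attribute identifiable?
        if attr is not None:
            return attr                            # YES -> done
        for pitch_h, pitch_v in SCHEDULE:          # gradually raise resolution
            pos = locate(frame)                    # S3: (re)locate the object
            sa = region_from(pos)                  #     and update region Sa
            lidar.set_resolution(sa, pitch_h, pitch_v)  # S4: raise only in Sa
            frame = lidar.acquire()                # next, denser frame
            attr = classify(frame)                 # S5: attribute identifiable?
            if attr is not None:
                break
        lidar.reset_resolution()
        return attr
```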
As described above, according to the present embodiment, the scanning resolution of the LiDAR unit 44a is increased in the angle region Sa of the detection region S2 of the LiDAR unit 44a in which the pedestrian P is present. By increasing the scanning resolution of the LiDAR unit 44a in the angle region Sa while not increasing the scanning resolution in the part of the detection region S2 outside the angle region Sa, the accuracy of the information on the pedestrian P can be improved while the computational load of the LiDAR control unit 430a (an ECU) is suppressed. It is therefore possible to provide the lighting system 4a capable of suppressing the computational load of the LiDAR control unit 430a while improving the accuracy of the surrounding environment information.
In addition, when the attribute of the object cannot be identified based on the point cloud data acquired from the LiDAR unit 44a (NO in step S2), the LiDAR control unit 430a controls the LiDAR unit 44a so that the scanning resolution of the LiDAR unit 44a in the angle region Sa is increased. In particular, the LiDAR control unit 430a controls the LiDAR unit 44a so that the scanning resolution of the LiDAR unit 44a in the angle region Sa is gradually increased until the attribute of the object can be identified (until the determination result of step S5 becomes YES). In this way, since the scanning resolution in the angle region Sa in which the pedestrian P is present is gradually increased, the attribute of the object can be reliably identified.
In addition, the position of the pedestrian P is updated based on the point cloud data newly acquired from the LiDAR unit 44a, and the angle region Sa is then updated based on the updated position of the pedestrian P. In this way, even when the pedestrian P is moving, the scanning resolution of the LiDAR unit 44a can be increased in the angle region Sa in which the moving pedestrian P is present.
Note that, in the present embodiment, the pedestrian P has been used as an example of the object for convenience of explanation, but the object may be another vehicle (including a two-wheeled or three-wheeled vehicle), traffic infrastructure equipment, an obstacle, or the like. When a plurality of objects are present in the detection region S2 of the LiDAR unit 44a, a plurality of angle regions Sa, each covering at least one of the plurality of objects, may be set in the detection region S2. In this case, the LiDAR control unit 430a may increase the scanning resolution of the LiDAR unit 44a in each of the plurality of angle regions Sa.
Although the embodiment of the utility model has been described above, the technical scope of the utility model should not be interpreted as being limited by the description of the present embodiment. The present embodiment is merely an example, and it will be understood by those skilled in the art that various modifications of the embodiment are possible within the scope of the utility model described in the claims. The technical scope of the utility model should be determined based on the scope of the utility model described in the claims and the scope of their equivalents.
In the present embodiment, the driving modes of the vehicle have been described as including the fully automatic driving mode, the advanced driving assistance mode, the driving assistance mode, and the manual driving mode, but the driving modes of the vehicle should not be limited to these four modes. The classification of the driving modes of the vehicle may be changed as appropriate in accordance with the laws and regulations on automatic driving in each country. Likewise, the definitions of the "fully automatic driving mode", "advanced driving assistance mode", and "driving assistance mode" given in the description of the present embodiment are merely examples, and these definitions may be changed as appropriate in accordance with the laws and regulations on automatic driving in each country.

Claims (5)

1. A sensor system provided on a vehicle capable of traveling in an automatic driving mode, characterized by comprising:
a LiDAR unit configured to acquire point cloud data representing the surrounding environment of the vehicle; and
a LiDAR control unit configured to identify, based on the point cloud data acquired from the LiDAR unit, information on an object present around the vehicle,
wherein the LiDAR control unit controls the LiDAR unit so that the scanning resolution of the LiDAR unit is increased only in a first angle region of a detection region of the LiDAR unit in which the object is present.
2. The sensor system according to claim 1, characterized in that,
when an attribute of the object cannot be identified based on the point cloud data acquired from the LiDAR unit, the LiDAR control unit controls the LiDAR unit so that the scanning resolution of the LiDAR unit in the first angle region is increased.
3. The sensor system according to claim 2, characterized in that
the LiDAR control unit controls the LiDAR unit so that the scanning resolution of the LiDAR unit in the first angle region is gradually increased until the attribute of the object can be identified.
4. The sensor system according to any one of claims 1 to 3, characterized in that
the LiDAR control unit is configured to update a position of the object based on point cloud data newly acquired from the LiDAR unit, and to update the first angle region based on the updated position of the object.
5. A vehicle capable of traveling in an automatic driving mode, characterized by comprising:
the sensor system according to any one of claims 1 to 4.
CN201821694319.9U 2017-10-26 2018-10-18 Sensor-based system and vehicle Active CN209471245U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-207499 2017-10-26
JP2017207499 2017-10-26

Publications (1)

Publication Number Publication Date
CN209471245U true CN209471245U (en) 2019-10-08

Family

ID=68086246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201821694319.9U Active CN209471245U (en) 2017-10-26 2018-10-18 Sensor-based system and vehicle

Country Status (1)

Country Link
CN (1) CN209471245U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109709530A (en) * 2017-10-26 2019-05-03 Koito Manufacturing Co., Ltd. Sensor-based system and vehicle


Similar Documents

Publication Publication Date Title
CN109709530A (en) Sensor-based system and vehicle
US10602331B2 (en) Inter-vehicle communication system, vehicle system, vehicle illumination system and vehicle
CN110944874B (en) Lighting system for vehicle and vehicle
JP7132945B2 (en) Vehicle communication system, vehicle module, front composite module, and vehicle lighting
CN110154881B (en) Lighting system for vehicle and vehicle
CN110072733A (en) Vehicle lighting system, Vehicular system, vehicle and data communication system
CN109969078A (en) Vehicle lighting system, vehicle, Vehicular system and vehicle load-and-vehicle communication system
US10933802B2 (en) Vehicle illumination system and vehicle
CN110803100B (en) Display system for vehicle and vehicle
CN109969077A (en) Vehicle lighting system and vehicle
US10636302B2 (en) Vehicle illumination device, vehicle and illumination control system
CN209471245U (en) Sensor-based system and vehicle
CN209471244U (en) Sensor-based system and vehicle
CN110382296A (en) Lighting device
CN110154880B (en) Lighting system for vehicle and vehicle
CN110271480A (en) Vehicular system

Legal Events

Date Code Title Description
GR01 Patent grant