CN211468304U - Infrared camera system and vehicle - Google Patents

Infrared camera system and vehicle

Info

Publication number
CN211468304U
Authority
CN
China
Prior art keywords
vehicle
infrared
camera
infrared camera
irradiation unit
Prior art date
Legal status
Active
Application number
CN201921964211.1U
Other languages
Chinese (zh)
Inventor
村松隆雄
渡边重之
佐藤诚晃
久保田晃宜
Current Assignee
Koito Manufacturing Co Ltd
Original Assignee
Koito Manufacturing Co Ltd
Priority date
Application filed by Koito Manufacturing Co Ltd
Application granted
Publication of CN211468304U

Abstract

The utility model provides an infrared camera system and a vehicle that can detect the surrounding environment in a side area of the vehicle without giving a large sense of discomfort to other vehicles, pedestrians, and the like. A left infrared camera module (6L) provided in a vehicle (1) includes: a left infrared irradiation unit configured to emit infrared rays toward the left side area of the vehicle (1); and a left infrared camera configured to acquire infrared image data representing the surrounding environment of the left area of the vehicle (1). The irradiation distance (DL1) of the infrared rays emitted from the left infrared irradiation unit is longer than the detection distance (DL2) of the left infrared camera.

Description

Infrared camera system and vehicle
Technical Field
The present disclosure relates to an infrared camera system and a vehicle including the same.
Background
Currently, research on automatic driving technology for automobiles is being actively conducted in many countries, and legal frameworks for enabling vehicles (hereinafter, "vehicle" refers to an automobile) to travel on public roads in an automatic driving mode are being studied. In the automatic driving mode, the vehicle system automatically controls the traveling of the vehicle. Specifically, the vehicle system automatically performs at least one of steering control (control of the traveling direction of the vehicle), braking control, and acceleration control (control of braking, acceleration, and deceleration of the vehicle) based on information indicating the surrounding environment of the vehicle (surrounding environment information) obtained from sensors such as a camera or a radar (e.g., a laser radar or a millimeter wave radar). On the other hand, in the manual driving mode, the traveling of the vehicle is controlled by the driver, as in most conventional vehicles. Specifically, in the manual driving mode, the traveling of the vehicle is controlled in accordance with the driver's operations (steering operation, braking operation, acceleration operation), and the vehicle system does not automatically perform steering control, braking control, or acceleration control. Note that the driving mode is not a concept that exists only in some vehicles, but a concept that exists in all vehicles, including conventional vehicles having no automatic driving function, and is classified according to, for example, the vehicle control method.
As described above, it is expected that a vehicle traveling in the automatic driving mode (hereinafter, appropriately referred to as an "automatic driving vehicle") and a vehicle traveling in the manual driving mode (hereinafter, appropriately referred to as a "manual driving vehicle") will coexist on a road in the future.
As an example of the automatic driving technique, patent document 1 discloses an automatic follow-up running system in which a following vehicle runs automatically following a preceding vehicle. In this automatic follow-up running system, the preceding vehicle and the following vehicle are provided with an illumination system, and character information for preventing another vehicle from being inserted between the preceding vehicle and the following vehicle is displayed in the illumination system of the preceding vehicle, and character information indicating that the vehicle is to be automatically followed up run is displayed in the illumination system of the following vehicle.
Documents of the prior art
Patent document
Patent document 1: Japanese Laid-Open Patent Publication No. 9-277887
SUMMARY OF THE UTILITY MODEL
Problem to be solved by the utility model
However, in the development of the automatic driving technique, it is necessary to significantly increase the recognition range of the vehicle with respect to the surrounding environment. In this regard, studies are currently being made to mount a plurality of different kinds of sensors (e.g., cameras, LiDAR units, millimeter wave radars, etc.) on a vehicle. For example, studies are currently underway to mount a LiDAR unit and a visible light camera on the front and rear of a vehicle, respectively. In order to increase the recognition range of the vehicle with respect to the surrounding environment in the area lateral to the vehicle, it is conceivable to mount the LiDAR unit and the visible light camera also on the left side surface and the right side surface of the vehicle.
However, as the number of sensors mounted on a vehicle increases, the recognition range of the vehicle with respect to the surrounding environment increases, but the vehicle price also increases significantly. In particular, since LiDAR units are expensive, the price of the vehicle increases dramatically as the number of LiDAR units mounted on the vehicle increases.
In order to solve this problem, it is conceivable to mount only the visible light camera on the left side surface and the right side surface of the vehicle. On the other hand, when the vehicle is traveling at night, in order to acquire image data indicating the surrounding environment in the side area of the vehicle by the visible light camera, it is necessary to emit visible light toward the side area of the vehicle. However, when visible light is emitted toward the side area of the vehicle, there is a concern that a large sense of discomfort may be given to other vehicles, pedestrians, and the like. Therefore, when the vehicle is traveling at night, it is practically difficult to emit visible light toward the side area of the vehicle, and therefore it is difficult to detect the surrounding environment of the side area of the vehicle using the visible light cameras mounted on the left and right sides of the vehicle. From the above-described viewpoint, there is room for further improvement in a sensing system of a vehicle.
An object of the present disclosure is to provide an infrared camera system and a vehicle that can detect the surrounding environment in a side area of the vehicle without giving a large sense of discomfort to other vehicles, pedestrians, and the like.
Means for solving the problems
An infrared camera system according to an aspect of the present disclosure is provided in a vehicle, and includes: an infrared irradiation unit configured to emit infrared rays toward the side area of the vehicle; and an infrared camera configured to acquire infrared image data indicating an ambient environment in a side area of the vehicle. The irradiation distance of the infrared rays emitted from the infrared ray irradiation unit is longer than the detection distance of the infrared ray camera.
According to the above configuration, even when the running environment of the vehicle is dark, the ambient environment in the side area of the vehicle can be detected using the infrared camera mounted on the vehicle. Further, since the light for the infrared camera emits infrared light to the outside instead of visible light, it is possible to prevent a situation in which a large sense of discomfort is given to other vehicles, pedestrians, and the like. In this way, it is possible to provide an infrared camera system that can detect the surrounding environment in the side area of the vehicle without giving a large sense of discomfort to other vehicles, pedestrians, and the like.
Further, since the irradiation distance of the infrared rays emitted from the infrared ray irradiation unit is longer than the detection distance of the infrared ray camera, the infrared ray camera can reliably capture the object existing within the detection range of the infrared ray camera. In other words, the vehicle can reliably determine the surrounding environment within the detection range of the infrared camera based on the infrared image data.
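The distance relation above can be sketched as follows. This is a minimal illustrative model, not part of the patent: the class and method names are hypothetical, while the 20 m and 15 m values are those given in the embodiment.

```python
# Illustrative sketch of the patent's distance constraint: the irradiation
# unit must light everything the camera can detect, so the irradiation
# distance must be longer than the camera's detection distance.
from dataclasses import dataclass

@dataclass
class InfraredModule:
    irradiation_distance_m: float  # reach of the infrared irradiation unit
    detection_distance_m: float    # reach of the infrared camera

    def is_valid(self) -> bool:
        """The condition stated above: irradiation distance > detection distance."""
        return self.irradiation_distance_m > self.detection_distance_m

    def can_capture(self, object_distance_m: float) -> bool:
        """An object is reliably captured only if it lies within both the
        camera's detection range and the illuminated range."""
        return (object_distance_m <= self.detection_distance_m
                and object_distance_m <= self.irradiation_distance_m)

# Embodiment values: 20 m irradiation distance, 15 m detection distance.
module = InfraredModule(irradiation_distance_m=20.0, detection_distance_m=15.0)
print(module.is_valid())          # True
print(module.can_capture(12.0))   # True: within both ranges
print(module.can_capture(18.0))   # False: beyond the detection range
```

Because the constraint holds, every object the camera can resolve is guaranteed to be illuminated, which is the "reliable capture" property the text claims.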
The infrared camera may be mounted on a rear pillar of the vehicle.
According to the above configuration, since the infrared camera is mounted on the rear pillar, it is difficult to attach dirt to the infrared camera. Therefore, it is not necessary to additionally provide a cleaner device for cleaning the infrared camera, and therefore the manufacturing cost of the infrared camera system can be suppressed. In addition, since the infrared camera is less likely to be defective due to adhesion of dirt, the reliability of the infrared camera can be improved.
Further, the infrared irradiation unit may include: a left infrared irradiation unit mounted on a left rear pillar of the vehicle; and a right infrared radiation unit mounted on a right rear pillar of the vehicle. The infrared camera may also include: a left infrared camera mounted on the left rear pillar; and a right infrared camera mounted on the right rear pillar. The irradiation distance of the infrared ray emitted from the left infrared ray irradiation unit may be longer than the detection distance of the left infrared ray camera. The irradiation distance of the infrared ray emitted from the right infrared ray irradiation unit may be longer than the detection distance of the right infrared ray camera.
According to the above configuration, since the irradiation distance of the infrared rays emitted from the left infrared ray irradiation unit is longer than the detection distance of the left infrared ray camera, the left infrared ray camera can reliably capture the object existing within the detection range of the left infrared ray camera. Further, since the irradiation distance of the infrared rays emitted from the right infrared ray irradiation unit is longer than the detection distance of the right infrared ray camera, the right infrared ray camera can reliably capture the object existing within the detection range of the right infrared ray camera. In this way, it is possible to provide an infrared camera system that can reliably detect the surrounding environment in the left side area and the right side area of the vehicle without giving a large sense of discomfort to other vehicles, pedestrians, and the like. Further, a vehicle provided with the infrared camera system may be provided.
According to the above, it is possible to provide a vehicle that can detect the surrounding environment in the side area of the vehicle without giving a large sense of discomfort to other vehicles, pedestrians, and the like.
Effect of the utility model
According to the present disclosure, it is possible to provide an infrared camera system and a vehicle that can detect the surrounding environment in the side area of the vehicle without giving a large sense of discomfort to other vehicles, pedestrians, and the like.
Drawings
Fig. 1 is a plan view of a vehicle mounted with a vehicle system according to an embodiment of the present invention (hereinafter, simply referred to as the present embodiment).
Fig. 2 is a block diagram of the vehicle system of the present embodiment.
Fig. 3(a) is a left side view of the vehicle together with an enlarged front view of the left infrared camera module, and Fig. 3(b) is a cross-sectional view of the left infrared camera module.
Fig. 4 is a view showing the optical axis of the left infrared irradiation unit, the optical axis of the left infrared camera, the optical axis of the right infrared irradiation unit, and the optical axis of the right infrared camera, respectively.
Fig. 5(a) is a view showing the irradiation range and irradiation distance of the infrared rays emitted from the left infrared irradiation unit and the detection range and detection distance of the left infrared camera, respectively. Fig. 5(b) is a view showing the irradiation range and irradiation distance of the infrared rays emitted from the right infrared irradiation unit and the detection range and detection distance of the right infrared camera, respectively.
Description of the reference numerals
1: vehicle with a steering wheel
2: vehicle system
3: vehicle control unit
4: front sensor module
5: rear sensor module
6L: left infrared camera module
6R: right infrared camera module
10: wireless communication unit
11: storage device
12: steering actuator
13: steering device
14: brake actuator
15: brake device
16: acceleration actuator
17: accelerating device
21L: control unit
21R: control unit
22L: left infrared irradiation unit
22R: right side infrared irradiation unit
23L: left side camera
23R: right camera
26L: left infrared camera
26R: right infrared camera
27L: left visible light camera
27R: right visible light camera
30L: left side headlight
30R: right side headlight
32L: infrared light source
33L: sub-mount
34L: heat radiator
35L: reflector
40L: left side rear light
40R: right side back light
41: LiDAR unit
42: camera with a camera module
50L: left C column
50R: right C column
51: LiDAR unit
52: camera with a camera module
56L: through hole
61L: shell body
62L: cover
63L: through hole
65L: car logo
66L: coating layer
67L: spear-shaped object
68L: sealing sheet
162L: flange part
163L: outer side surface
164L: outer surface
Detailed Description
Hereinafter, an embodiment of the present invention (hereinafter, referred to as the present embodiment) will be described with reference to the drawings. For convenience of explanation, the dimensions of the components shown in the drawings may be different from the actual dimensions of the components.
In the description of the present embodiment, for convenience of description, the terms "left-right direction", "up-down direction", and "front-back direction" may be appropriately used. These directions are relative directions set with respect to the vehicle 1 shown in fig. 1. Here, the "left-right direction" is a direction including the "left direction" and the "right direction". The "up-down direction" is a direction including the "up direction" and the "down direction". The "front-rear direction" is a direction including the "front direction" and the "rear direction". Although not shown in fig. 1, the up-down direction is a direction orthogonal to the left-right direction and the front-back direction.
First, the vehicle system 2 according to the present embodiment will be described below with reference to fig. 1 and 2. Fig. 1 is a plan view of a vehicle 1 provided with a vehicle system 2. Fig. 2 is a block diagram of the vehicle system 2. The vehicle 1 is a vehicle (automobile) capable of running in an automatic driving mode.
As shown in fig. 2, the vehicle system 2 includes a vehicle control unit 3, a front sensor module 4, a rear sensor module 5, a left infrared camera module 6L, and a right infrared camera module 6R. The vehicle system 2 includes an HMI (Human Machine Interface) 8, a GPS (Global Positioning System) 9, a wireless communication unit 10, and a storage device 11. The vehicle system 2 includes a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an acceleration actuator 16, and an accelerator device 17.
The vehicle control unit 3 is configured to control the traveling of the vehicle 1. The vehicle control unit 3 is constituted by, for example, at least one electronic control unit (ECU). The electronic control unit includes a computer system (for example, an SoC (System on a Chip)) including one or more processors and one or more memories, and a circuit including active elements such as transistors and passive elements. The processor includes, for example, at least one of a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and a TPU (Tensor Processing Unit). The CPU may be configured by a plurality of CPU cores. The GPU may also be configured by a plurality of GPU cores. The memory includes a ROM (Read Only Memory) and a RAM (Random Access Memory). The ROM may store a vehicle control program. For example, the vehicle control program may include an artificial intelligence (AI) program for automatic driving. The AI program is a program (trained model) constructed by supervised or unsupervised machine learning (in particular, deep learning) using a multilayer neural network. The RAM may temporarily store the vehicle control program, vehicle control data, and/or surrounding environment information indicating the surrounding environment of the vehicle. The processor may be configured to load a program designated from among the various vehicle control programs stored in the ROM onto the RAM and execute various processes in cooperation with the RAM. The computer system may be a non-von Neumann computer such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array). The computer system may be a combination of a von Neumann computer and a non-von Neumann computer.
The front sensor module 4 is provided with a LiDAR unit 41 and a camera 42. The LiDAR unit 41 is configured to acquire 3D mapping data (point cloud data) indicating the surrounding environment in the area ahead of the vehicle 1 and transmit the acquired 3D mapping data to the vehicle control unit 3. The vehicle control unit 3 is configured to specify information indicating the surrounding environment in the front area (hereinafter, referred to as "surrounding environment information") based on the transmitted 3D map data. The ambient environment information may include information related to an object existing outside the vehicle 1. For example, the ambient environment information may include information related to the attribute of an object existing outside the vehicle 1 and information related to the distance and position of the object with respect to the vehicle 1. The camera 42 is configured to acquire image data indicating the surrounding environment in the area in front of the vehicle 1 and transmit the acquired image data to the vehicle control unit 3. The vehicle control unit 3 is configured to determine the surrounding environment information in the front area based on the transmitted image data.
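The surrounding environment information described above can be sketched as a simple data structure. All field and class names here are hypothetical illustrations; the text only states that the information may include an object's attribute and its distance and position relative to the vehicle.

```python
# Hypothetical sketch of the "surrounding environment information" that
# the vehicle control unit 3 derives from sensor data. Names and values
# are assumptions for illustration, not taken from the patent.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DetectedObject:
    attribute: str             # e.g. "pedestrian", "vehicle"
    distance_m: float          # distance of the object from the vehicle
    position: Tuple[float, float]  # (x, y) position relative to the vehicle

@dataclass
class SurroundingEnvironmentInfo:
    region: str                            # e.g. "front", "rear", "left", "right"
    objects: List[DetectedObject] = field(default_factory=list)

# Example: one pedestrian detected in the front area.
info = SurroundingEnvironmentInfo(
    region="front",
    objects=[DetectedObject("pedestrian", 12.5, (10.0, 7.5))],
)
print(info.objects[0].attribute)   # pedestrian
```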
The detection area of the LiDAR unit 41 and the detection area of the camera 42 may also at least partially overlap one another. The front sensor module 4 is disposed at a predetermined position in front of the vehicle 1. For example, the front sensor module 4 may be disposed in a front grille or a front bumper, or may be disposed in the left headlight 30L and/or the right headlight 30R. When the vehicle 1 includes two front sensor modules 4, one front sensor module 4 may be disposed in the left headlight 30L, and the other front sensor module 4 may be disposed in the right headlight 30R.
The rear sensor module 5 is provided with a LiDAR unit 51 and a camera 52. The LiDAR unit 51 is configured to acquire 3D mapping data (point cloud data) indicating the surrounding environment in the rear area of the vehicle 1 and transmit the acquired 3D mapping data to the vehicle control unit 3. The vehicle control unit 3 is configured to determine the surrounding environment information in the rear area based on the transmitted 3D map data. The camera 52 is configured to acquire image data indicating the surrounding environment in the rear area of the vehicle 1 and transmit the acquired image data to the vehicle control unit 3. The vehicle control unit 3 is configured to determine the surrounding environment information in the rear area based on the transmitted image data.
The detection area of the LiDAR unit 51 and the detection area of the camera 52 may also at least partially overlap one another. The rear sensor module 5 is disposed at a predetermined position behind the vehicle 1. For example, the rear sensor module 5 may be disposed in a rear grill or a rear bumper, or may be disposed in the left side rear lamp 40L and/or the right side rear lamp 40R. When the vehicle 1 includes two rear sensor modules 5, one rear sensor module 5 may be disposed in the left side backlight 40L, and the other rear sensor module 5 may be disposed in the right side backlight 40R.
The left infrared camera module 6L (an example of a left infrared camera system) is mounted on a left C pillar 50L (an example of a left rear pillar) of the vehicle 1, and includes a left infrared irradiation unit 22L, a left camera 23L, and a control unit 21L. The left camera 23L has a left infrared camera 26L and a left visible light camera 27L.
The left infrared irradiation unit 22L is configured to emit infrared rays (particularly, near infrared rays) toward a left region of the vehicle 1. The wavelength band of the infrared rays emitted from the left infrared irradiation unit 22L is, for example, in the range of 700nm to 2500 nm. The peak wavelength of the infrared ray is, for example, 850nm, 940nm, or 1050 nm. The left infrared camera 26L is configured to acquire infrared image data indicating the surrounding environment of the left area of the vehicle 1. The left visible-light camera 27L is configured to acquire visible-light image data indicating the surrounding environment of the left area of the vehicle 1. For example, in the case where the RGB elements of one pixel of the visible-light image data each have a data amount of 8 bits, the visible-light image data has a data amount of 24 bits per pixel. The vehicle control unit 3 or the control unit 21L is configured to determine the surrounding environment information in the left side area based on the infrared image data and/or the visible light image data. As shown in fig. 4, the optical axis AL1 of the left infrared irradiation unit 22L and the optical axis AL2 of the left infrared camera 26L may be substantially parallel to each other. In this case, the shortage of the amount of infrared light in the detection area of the left infrared camera 26L can be appropriately prevented.
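The per-pixel data amount mentioned above is simple arithmetic, sketched below. The frame resolution in the second half is a hypothetical extension for illustration, not a value from the patent.

```python
# Arithmetic behind the data-amount example: with 8 bits per RGB element,
# one pixel of visible-light image data carries 24 bits.
bits_per_element = 8
elements_per_pixel = 3          # R, G, B
bits_per_pixel = bits_per_element * elements_per_pixel
print(bits_per_pixel)           # 24

# Illustrative extension (not in the patent): size of one uncompressed
# frame at a hypothetical 1280x720 resolution, in bytes.
width, height = 1280, 720
frame_bytes = width * height * bits_per_pixel // 8
print(frame_bytes)              # 2764800
```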
Further, as shown in fig. 5(a), the irradiation distance DL1 of the infrared rays emitted from the left infrared irradiation unit 22L is longer than the detection distance DL2 of the left infrared camera 26L. For example, the irradiation distance DL1 is 20 m, and the detection distance DL2 is 15 m. The irradiation range SL1 of the infrared rays emitted from the left infrared irradiation unit 22L at least partially overlaps the detection range SL2 of the left infrared camera 26L, and the irradiation range SL1 is larger than the detection range SL2. Thus, the left infrared camera 26L can reliably capture an object (another vehicle, a pedestrian, etc.) existing within the detection range SL2 of the left infrared camera 26L. In other words, the vehicle 1 can reliably specify, based on the infrared image data acquired by the left infrared camera 26L, the surrounding environment information within the detection range SL2 of the left infrared camera 26L. The upper limit of the irradiation angle of the infrared rays emitted from the left infrared irradiation unit 22L with respect to the horizontal direction may be about 10°.
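The range relation of Fig. 5(a) can be sketched in a simplified 2-D model. This is an assumption for illustration only: both ranges are modeled as flat circular sectors sharing an apex at the module; the 20 m and 15 m distances come from the embodiment, but the half-angles are hypothetical, chosen so that SL1 is larger than SL2 as the text requires.

```python
import math

def in_sector(x, y, max_dist, half_angle_deg):
    """Is point (x, y) inside a sector with apex at the origin,
    opening along the +x axis (the module's optical axis)?"""
    dist = math.hypot(x, y)
    if dist > max_dist:
        return False
    angle = math.degrees(math.atan2(y, x))
    return abs(angle) <= half_angle_deg

# Embodiment distances (20 m irradiation, 15 m detection); hypothetical
# half-angles making the irradiation range wider than the detection range.
IRR_DIST, IRR_HALF = 20.0, 50.0
DET_DIST, DET_HALF = 15.0, 40.0

def reliably_captured(x, y):
    """An object is reliably captured when it lies in the detection range,
    which by construction is contained in the illuminated range."""
    return (in_sector(x, y, DET_DIST, DET_HALF)
            and in_sector(x, y, IRR_DIST, IRR_HALF))

print(reliably_captured(10.0, 3.0))   # True: well inside both sectors
print(reliably_captured(18.0, 0.0))   # False: beyond the 15 m detection range
```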
The left infrared camera 26L and the left visible light camera 27L may be integrally configured or may be separately configured. When the left infrared camera 26L and the left visible light camera 27L are integrally configured, a Color Filter Array (CFA) in which RGB color filters and infrared filters are arranged in an array may be used for the left camera 23L. In the case where the left infrared camera 26L is integrally configured with the left visible light camera 27L, the number of components constituting the left infrared camera module 6L can be reduced.
The control unit 21L is configured to control the operation of the left infrared irradiation unit 22L and the operations of the left infrared camera 26L and the left visible light camera 27L. The control unit 21L includes a computer system (for example, SoC) having one or more processors and one or more memories, and a circuit including active elements such as transistors and passive elements. The processor includes at least one of a CPU, MPU, GPU, and TPU. The memory comprises ROM and RAM. The computer system may be a non-von neumann computer such as an ASIC or an FPGA.
For example, the control unit 21L may operate only the left visible-light camera 27L of the left camera 23L when it is determined that the running environment of the vehicle 1 is bright (specifically, when the measured illuminance is greater than the threshold illuminance) based on the illuminance data acquired from an illuminance sensor (not shown) mounted on the vehicle 1. On the other hand, when determining that the running environment of the vehicle 1 is dark based on the illuminance data (specifically, when the measured illuminance is equal to or less than the threshold illuminance), the control unit 21L may operate only the left infrared camera 26L of the left cameras 23L. In this case, the control unit 21L may turn on the left infrared irradiation unit 22L. In this way, the ambient environment information in the left side area of the vehicle 1 can be specified without depending on the brightness of the running environment of the vehicle 1.
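The camera-selection behavior of the control unit 21L can be sketched as below. This is a hedged illustration: the patent specifies only a comparison against a threshold illuminance, so the threshold value and all names here are assumptions.

```python
# Hypothetical sketch of the illuminance-based switching described above:
# bright environment -> visible-light camera only; dark environment ->
# infrared camera only, with the infrared irradiation unit turned on.
THRESHOLD_ILLUMINANCE_LUX = 50.0  # hypothetical threshold value

def select_left_cameras(measured_illuminance_lux):
    """Return which devices control unit 21L should operate,
    based on illuminance data from the vehicle's illuminance sensor."""
    if measured_illuminance_lux > THRESHOLD_ILLUMINANCE_LUX:
        # Running environment judged bright.
        return {"visible_camera": True, "infrared_camera": False,
                "infrared_irradiation": False}
    # Measured illuminance <= threshold: running environment judged dark.
    return {"visible_camera": False, "infrared_camera": True,
            "infrared_irradiation": True}

print(select_left_cameras(120.0))  # daytime: visible-light camera only
print(select_left_cameras(5.0))    # night: infrared camera + irradiation
```

Either way, exactly one camera is active, which is how the surrounding environment information can be specified regardless of the brightness of the running environment.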
The left infrared camera module 6L may be disposed on a left rear pillar (e.g., a left D pillar) other than the left C pillar 50L. In the present embodiment, the left infrared camera system is configured as the left infrared camera module 6L, but the present embodiment is not limited thereto. In this regard, the left infrared camera system may not be packaged as a single module. In this case, the configuration of the left infrared ray camera system is not particularly limited as long as the left infrared ray irradiation unit 22L and the left infrared ray camera 26L are disposed on the left rear pillar (for example, the left C pillar 50L) of the vehicle 1. The control unit 21L may be integrated with the control unit 21R and/or the vehicle control unit 3.
The right infrared camera module 6R (an example of a right infrared camera system) is mounted on a right C pillar 50R (an example of a right rear pillar) of the vehicle 1, and includes a right infrared irradiation unit 22R, a right camera 23R, and a control unit 21R. The right camera 23R has a right infrared camera 26R and a right visible light camera 27R.
The right infrared radiation unit 22R is configured to emit infrared rays (near infrared rays) toward a right region of the vehicle 1. The wavelength band of the infrared rays emitted from the right infrared irradiation unit 22R is, for example, in the range of 700nm to 2500 nm. The peak wavelength of the infrared ray is, for example, 850nm, 940nm, or 1050 nm. The right infrared camera 26R is configured to acquire infrared image data indicating the surrounding environment of the right area of the vehicle 1. The right visible-light camera 27R is configured to acquire visible-light image data indicating the surrounding environment of the right area of the vehicle 1. The vehicle control unit 3 or the control unit 21R is configured to determine the surrounding environment information in the right side area based on the infrared image data and/or the visible light image data. As shown in fig. 4, the optical axis AR1 of the right infrared irradiation unit 22R and the optical axis AR2 of the right infrared camera 26R may be substantially parallel to each other. In this case, the shortage of the amount of infrared light in the detection area of the right infrared camera 26R can be appropriately prevented.
Further, as shown in fig. 5(b), the irradiation distance DR1 of the infrared rays emitted from the right infrared irradiation unit 22R is longer than the detection distance DR2 of the right infrared camera 26R. For example, the irradiation distance DR1 is 20 m, and the detection distance DR2 is 15 m. The irradiation range SR1 of the infrared rays emitted from the right infrared irradiation unit 22R at least partially overlaps the detection range SR2 of the right infrared camera 26R, and the irradiation range SR1 is larger than the detection range SR2. Thus, the right infrared camera 26R can reliably capture an object (another vehicle, a pedestrian, etc.) existing within the detection range SR2 of the right infrared camera 26R. In other words, the vehicle 1 can reliably specify, based on the infrared image data acquired by the right infrared camera 26R, the surrounding environment information within the detection range SR2 of the right infrared camera 26R. The upper limit of the irradiation angle of the infrared rays emitted from the right infrared irradiation unit 22R with respect to the horizontal direction may be about 10°.
The right infrared camera 26R and the right visible light camera 27R may be integrally configured or may be separately configured. When the right infrared camera 26R and the right visible light camera 27R are integrally configured, a Color Filter Array (CFA) in which RGB color filters and infrared filters are arranged in an array may be used for the right camera 23R. When the right infrared camera 26R is integrally configured with the right visible light camera 27R, the number of components constituting the right infrared camera module 6R can be reduced.
The control unit 21R is configured to control the operation of the right infrared irradiation unit 22R and the operations of the right infrared camera 26R and the right visible light camera 27R. The control unit 21R includes a computer system (for example, SoC) having one or more processors and one or more memories, and a circuit including active elements such as transistors and passive elements. The processor includes at least one of a CPU, MPU, GPU, and TPU. The memory comprises ROM and RAM. The computer system may be a non-von neumann computer such as an ASIC or FPGA.
For example, when it is determined that the running environment of the vehicle 1 is bright (specifically, when the measured illuminance is greater than the threshold illuminance) based on the illuminance data acquired from an illuminance sensor (not shown) mounted on the vehicle 1, the control unit 21R may operate only the right visible light camera 27R of the right camera 23R. On the other hand, the control unit 21R may operate only the right infrared camera 26R of the right camera 23R when it is determined based on the illuminance data that the running environment of the vehicle 1 is dark (specifically, when the measured illuminance is equal to or less than the threshold illuminance). In this case, the control unit 21R may turn on the right infrared irradiation unit 22R. In this way, the surrounding environment information in the right side area of the vehicle 1 can be specified without depending on the brightness of the running environment of the vehicle 1.
The right infrared camera module 6R may be disposed on a right rear pillar (for example, a right D pillar) other than the right C pillar 50R. In the present embodiment, the right infrared camera system is configured as the right infrared camera module 6R, but the present embodiment is not limited thereto. In this regard, the right infrared camera system may not be packaged as a single module. In this case, the configuration of the right infrared camera system is not particularly limited as long as the right infrared irradiation unit 22R and the right infrared camera 26R are disposed on the right rear pillar (for example, the right C pillar 50R) of the vehicle 1. The control unit 21R may be integrally configured with the control unit 21L and/or the vehicle control unit 3.
The HMI 8 includes an input unit that receives input operations from the driver and an output unit that outputs travel information and the like to the driver. The input unit includes a steering wheel, an accelerator pedal, a brake pedal, a driving mode switching switch for switching the driving mode of the vehicle 1, and the like. The output unit includes a display (for example, a head-up display (HUD)) that displays various kinds of travel information. The GPS 9 is configured to acquire current position information of the vehicle 1 and output the acquired current position information to the vehicle control unit 3.
The wireless communication unit 10 is configured to receive information (for example, travel information) on other vehicles located around the vehicle 1 from those vehicles and to transmit information (for example, travel information) on the vehicle 1 to them (vehicle-to-vehicle communication). The wireless communication unit 10 is also configured to receive infrastructure information from infrastructure equipment such as a traffic signal or a marker lamp and to transmit the travel information of the vehicle 1 to the infrastructure equipment (road-to-vehicle communication). Further, the wireless communication unit 10 is configured to receive information on a pedestrian from a portable electronic device (a smartphone, a tablet, a wearable device, or the like) carried by the pedestrian and to transmit the travel information of the vehicle 1 to the portable electronic device (pedestrian-to-vehicle communication). The vehicle 1 may communicate with other vehicles, infrastructure equipment, or portable electronic devices directly in a peer-to-peer mode, or via a communication network such as the Internet.
The storage device 11 is an external storage device such as a hard disk drive (HDD) or a solid-state drive (SSD). The storage device 11 may store two-dimensional or three-dimensional map information and/or a vehicle control program. For example, the three-dimensional map information may consist of 3D map data (point cloud data). The storage device 11 is configured to output the map information and the vehicle control program to the vehicle control unit 3 in response to a request from the vehicle control unit 3. The map information and the vehicle control program may be updated via the wireless communication unit 10 over a communication network.
When the vehicle 1 travels in the automatic driving mode, the vehicle control unit 3 automatically generates at least one of a steering control signal, an acceleration control signal, and a braking control signal based on the travel state information, the surrounding environment information, the current position information, the map information, and the like. The steering actuator 12 is configured to receive a steering control signal from the vehicle control unit 3 and control the steering device 13 based on the received steering control signal. The brake actuator 14 is configured to receive a brake control signal from the vehicle control unit 3 and control the brake device 15 based on the received brake control signal. The acceleration actuator 16 is configured to receive an acceleration control signal from the vehicle control unit 3 and control the acceleration device 17 based on the received acceleration control signal. In this way, the vehicle control unit 3 automatically controls the travel of the vehicle 1 based on the travel state information, the surrounding environment information, the current position information, the map information, and the like. That is, in the automatic driving mode, the travel of the vehicle 1 is automatically controlled by the vehicle system 2.
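The signal flow described above — the vehicle control unit generating control signals and each actuator driving its device — can be sketched as follows. All class names, function names, and numeric values are illustrative assumptions, not part of the patent:

```python
# Toy sketch of the automatic-driving control flow: a control unit emits
# steering/acceleration/braking signals, and each actuator forwards its
# signal to its device. Real signal generation would use travel state,
# surrounding environment, position, and map information.

from dataclasses import dataclass

@dataclass
class ControlSignals:
    steering: float      # assumed: target steering angle (degrees)
    acceleration: float  # assumed: throttle fraction, 0..1
    braking: float       # assumed: brake fraction, 0..1

def generate_control_signals(surroundings_clear: bool) -> ControlSignals:
    """Stand-in for the vehicle control unit's signal generation."""
    if surroundings_clear:
        return ControlSignals(steering=0.0, acceleration=0.3, braking=0.0)
    # Obstacle detected in the surrounding environment: brake instead.
    return ControlSignals(steering=0.0, acceleration=0.0, braking=0.8)

def dispatch(signals: ControlSignals) -> list[str]:
    """Each actuator receives its signal and controls its device."""
    return [
        f"steering actuator -> steering device: {signals.steering}",
        f"acceleration actuator -> acceleration device: {signals.acceleration}",
        f"brake actuator -> brake device: {signals.braking}",
    ]
```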
On the other hand, when the vehicle 1 travels in the manual driving mode, the vehicle control unit 3 generates a steering control signal, an acceleration control signal, and a braking control signal in accordance with manual operations of an accelerator pedal, a brake pedal, and a steering wheel by the driver. In this way, in the manual driving mode, the steering control signal, the acceleration control signal, and the braking control signal are generated by the manual operation of the driver, and thus the travel of the vehicle 1 is controlled by the driver.
Next, the driving modes of the vehicle 1 will be explained. The driving modes include an automatic driving mode and a manual driving mode. The automatic driving mode consists of a full-automatic driving mode, a high driving assistance mode, and a driving assistance mode. In the full-automatic driving mode, the vehicle system 2 automatically performs all travel control, that is, steering control, braking control, and acceleration control, and the driver is not in a state capable of driving the vehicle 1. In the high driving assistance mode, the vehicle system 2 likewise automatically performs all of the steering, braking, and acceleration control, and the driver, although in a state capable of driving the vehicle 1, does not drive it. In the driving assistance mode, the vehicle system 2 automatically performs part of the steering, braking, and acceleration control, and the driver drives the vehicle 1 with the driving assistance of the vehicle system 2. In the manual driving mode, on the other hand, the vehicle system 2 does not automatically perform travel control, and the driver drives the vehicle 1 without driving assistance from the vehicle system 2.
The driving mode of the vehicle 1 may be switched by operating the driving mode switching switch. In this case, the vehicle control unit 3 switches the driving mode of the vehicle 1 among the four driving modes (the full-automatic driving mode, the high driving assistance mode, the driving assistance mode, and the manual driving mode) in accordance with the driver's operation of the driving mode switching switch. In addition, the driving mode of the vehicle 1 may be switched automatically based on information on travelable sections in which autonomous vehicles are allowed to travel and travel-prohibited sections in which autonomous travel is prohibited, or on information on the outside weather state. In this case, the vehicle control unit 3 switches the driving mode of the vehicle 1 based on such information. Further, the driving mode of the vehicle 1 may be switched automatically by using a seating sensor, a face direction sensor, or the like. In this case, the vehicle control unit 3 switches the driving mode of the vehicle 1 based on the output signals from the seating sensor and the face direction sensor.
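The four driving modes and the switch-driven mode change described above can be sketched as follows. The enum, the switch-position mapping, and all names are assumptions made for illustration only:

```python
# Illustrative sketch of the four driving modes and a 4-position driving
# mode switching switch. The position-to-mode mapping is assumed.

from enum import Enum

class DrivingMode(Enum):
    FULL_AUTO = "full-automatic driving mode"
    HIGH_ASSIST = "high driving assistance mode"
    ASSIST = "driving assistance mode"
    MANUAL = "manual driving mode"

def switch_mode(switch_position: int) -> DrivingMode:
    """Map an assumed 4-position switch to a driving mode."""
    mapping = {
        0: DrivingMode.MANUAL,
        1: DrivingMode.ASSIST,
        2: DrivingMode.HIGH_ASSIST,
        3: DrivingMode.FULL_AUTO,
    }
    return mapping[switch_position]  # KeyError for invalid positions
```

In a real system the mode could also be overridden by travelable-section information or sensor outputs, as the paragraph above notes.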
Next, a specific configuration of the left infrared camera module 6L will be described with reference to fig. 3. Fig. 3(a) shows a side view of the vehicle 1 and an enlarged front view of the left infrared camera module 6L. Fig. 3(b) is a sectional view of the left infrared camera module 6L.
As shown in fig. 3, the left infrared camera module 6L is mounted at a predetermined position of the left C pillar 50L of the vehicle 1. In particular, the left infrared camera module 6L is fitted into a through hole 56L formed at a predetermined position of the left C-pillar 50L. The left infrared camera module 6L includes a left camera 23L including a left infrared camera 26L and a left visible light camera 27L, a left infrared irradiation unit 22L, a control unit 21L, a housing 61L having an opening, and a cover 62L covering the opening of the housing 61L. In the figure, the left infrared camera 26L and the left visible light camera 27L are integrally configured.
The left camera 23L, the left infrared irradiation unit 22L, and the control unit 21L are disposed in a space formed by the housing 61L and the cover 62L. Specifically, the space formed by the housing 61L and the cover 62L has a first space S1 and a second space S2 separated from the first space S1. The left camera 23L is disposed in the first space S1, and the left infrared irradiation unit 22L and the control unit 21L are disposed in the second space S2. With this configuration, since the first space S1 and the second space S2 are separated from each other, the infrared rays emitted from the left infrared irradiation unit 22L are appropriately prevented from directly entering the left infrared camera 26L. In this way, the reliability of the infrared image data acquired by the left infrared camera 26L can be improved.
The left infrared irradiation unit 22L includes a heat sink 34L, a sub-mount 33L disposed on the heat sink 34L, an infrared light source 32L disposed on the sub-mount 33L, and a reflector 35L disposed on the heat sink 34L. The heat sink 34L is configured to release the heat generated by the infrared light source 32L to the outside. That is, the heat generated by the infrared light source 32L is released to the air in the second space S2 via the sub-mount 33L and the heat sink 34L.
The infrared light source 32L is, for example, an infrared LED configured to emit infrared light. The reflector 35L is configured to reflect the infrared light emitted from the infrared light source 32L toward the outside. The reflector 35L is configured, for example, as a parabolic reflector. In this case, the infrared light source 32L may be disposed near the focal point of the reflector 35L so that the infrared light emitted from the infrared light source 32L is converted into substantially parallel light by the reflector 35L.
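The collimation property relied on above — a source at the focal point of a parabolic reflector is reflected into rays parallel to the optical axis — can be checked numerically with a short geometry sketch. This is textbook parabola geometry, not code from the patent:

```python
# Numerical check of parabolic-reflector collimation: reflect a ray from
# the focus (f, 0) off the parabola x = y^2 / (4f) at height y. By the
# defining property of the parabola, every reflected ray is parallel to
# the +x optical axis, i.e. has direction (1, 0).

import math

def reflect_from_focus(f: float, y: float) -> tuple[float, float]:
    """Unit direction of a focal ray after reflection at height y."""
    px, py = y * y / (4.0 * f), y        # reflection point on the parabola
    dx, dy = px - f, py                  # incident direction (focus -> point)
    d = math.hypot(dx, dy)
    dx, dy = dx / d, dy / d
    nx, ny = 1.0, -y / (2.0 * f)         # surface normal: grad(x - y^2/4f)
    n = math.hypot(nx, ny)
    nx, ny = nx / n, ny / n
    dot = dx * nx + dy * ny
    # Specular reflection: r = d - 2 (d . n) n
    return dx - 2.0 * dot * nx, dy - 2.0 * dot * ny

# Every reflected ray comes out along the +x axis, regardless of height.
for h in (0.5, 1.0, 3.0):
    rx, ry = reflect_from_focus(1.0, h)
    assert abs(rx - 1.0) < 1e-9 and abs(ry) < 1e-9
```

This is why placing the infrared LED near the focal point yields substantially parallel output light, which in turn supports a long irradiation distance.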
The control unit 21L is disposed in the second space S2, but the position at which the control unit 21L is disposed is not particularly limited. For example, the control unit 21L may be disposed outside the left infrared camera module 6L. In particular, the control unit 21L may be disposed at a predetermined position of the vehicle 1.
The housing 61L has a pair of lances 67L provided on an outer side surface 166L of the housing 61L. The cover 62L functions, for example, as an infrared-transmitting filter: it is configured to transmit the infrared rays (in particular, near-infrared rays) emitted from the left infrared irradiation unit 22L while blocking at least visible light having a wavelength of 600 nm or less. The cover 62L therefore appears black when viewed from outside the vehicle 1. The cover 62L has a through hole 63L at a position facing the left camera 23L. In the case where the left camera 23L consists only of the left infrared camera 26L, the through hole 63L need not be formed in the cover 62L. When the left infrared camera 26L and the left visible light camera 27L are separate from each other, a through hole 63L or a visible-light-transmitting filter may be formed at a position facing the left visible light camera 27L.
The cover 62L has a flange portion 162L projecting outward from an outer side surface 163L of the cover 62L. A seal sheet 68L is provided between the flange 162L and the left C-pillar 50L. Further, an emblem 65L of the vehicle 1 is formed on an outer surface 164L of the cover 62L. In this regard, when the infrared light (near-infrared light) emitted from the left infrared irradiation unit 22L includes light in the red wavelength band, the emitted infrared light is perceived as reddish light by occupants of other vehicles, pedestrians, and the like present outside the vehicle 1. However, pedestrians and the like pay more attention to the emblem 65L than to the infrared light (reddish light) emitted from the left infrared irradiation unit 22L. Therefore, the emblem 65L can appropriately prevent pedestrians and the like from feeling a large sense of discomfort with respect to the infrared light (reddish light). In the present embodiment, "a" is shown as an example of the shape of the emblem 65L, but the shape of the emblem 65L is not particularly limited.
Further, a coating layer 66L is formed on the outer surface 164L. The coating layer 66L may be either a water-repellent coating layer or a hydrophilic coating layer. When the coating layer 66L is a water-repellent coating layer, water droplets can be suitably prevented from adhering to the outer surface 164L of the cover 62L. Water droplets are thus appropriately prevented from appearing in the images captured by the left camera 23L, which improves the reliability of the image data acquired by the left camera 23L. When the coating layer 66L is a hydrophilic coating layer, on the other hand, the formation of water spots on the outer surface 164L of the cover 62L can be appropriately prevented. Water spots are thus kept out of the captured images, and the reliability of the image data acquired by the left camera 23L can likewise be improved.
In the present embodiment, only the specific structure of the left infrared camera module 6L has been described, but the right infrared camera module 6R has the same configuration as the left infrared camera module 6L. That is, the right infrared camera module 6R is fitted into a through hole formed at a predetermined position of the right C-pillar 50R of the vehicle 1. The right infrared camera module 6R includes a right camera 23R including a right infrared camera 26R and a right visible light camera 27R, a right infrared irradiation unit 22R, a control unit 21R, a housing (not shown) having an opening, and a cover (not shown) covering the opening of the housing. The right infrared camera module 6R includes an emblem (not shown) of the vehicle 1 formed on the outer surface of the cover and a coating layer (not shown) formed on the outer surface of the cover. The right infrared irradiation unit 22R has the same configuration as the left infrared irradiation unit 22L shown in fig. 3.
As described above, according to the present embodiment, even when the running environment of the vehicle 1 is dark, the surrounding environment in the left side area of the vehicle 1 can be detected using the left infrared camera 26L mounted on the left C-pillar 50L of the vehicle 1, and the surrounding environment in the right side area of the vehicle 1 can be detected using the right infrared camera 26R mounted on the right C-pillar 50R of the vehicle 1. Further, since the infrared irradiation units emit infrared light, rather than visible light, to the outside, other vehicles, pedestrians, and the like present outside the vehicle 1 are prevented from being given a large sense of discomfort. In this way, the surrounding environment in the side areas (the left side area and the right side area) of the vehicle 1 can be detected without giving a large sense of discomfort to other vehicles, pedestrians, and the like.
Further, since the left infrared camera module 6L (in particular, the left camera 23L) is mounted on the left C-pillar 50L, dirt is unlikely to adhere to the left camera 23L. A separate cleaner device for cleaning the left camera 23L is therefore unnecessary, and the manufacturing cost of the left infrared camera module 6L can be kept down. Moreover, since malfunctions of the left camera 23L caused by adhering dirt are unlikely, the reliability of the left camera 23L can be improved.
Likewise, since the right infrared camera module 6R (in particular, the right camera 23R) is mounted on the right C-pillar 50R, dirt is unlikely to adhere to the right camera 23R. A separate cleaner device for cleaning the right camera 23R is therefore unnecessary, and the manufacturing cost of the right infrared camera module 6R can be kept down. Moreover, since malfunctions of the right camera 23R caused by adhering dirt are unlikely, the reliability of the right camera 23R can be improved.
In the present embodiment, the left infrared irradiation unit 22L, the left camera 23L, and the control unit 21L are housed in a space formed by the case 61L and the cover 62L, and the left infrared camera system is packaged as one module. Therefore, the left infrared camera module 6L can be easily attached to the vehicle 1.
Although embodiments of the present invention have been described above, the technical scope of the present invention should of course not be construed as being limited by the description of these embodiments. The embodiments are merely examples, and those skilled in the art will understand that various modifications of the embodiments are possible within the scope of the invention described in the claims. The technical scope of the present invention should be determined based on the scope of the invention described in the claims and its range of equivalents.
In the present embodiment, the driving modes of the vehicle have been described as including the full-automatic driving mode, the high driving assistance mode, the driving assistance mode, and the manual driving mode, but the driving modes of the vehicle are not limited to these four modes. The classification of the driving modes of the vehicle may be changed as appropriate according to the laws or regulations on automated driving in each country. Similarly, the definitions of the "full-automatic driving mode", the "high driving assistance mode", and the "driving assistance mode" given in the description of the present embodiment are merely examples, and these definitions may be changed as appropriate according to the laws or regulations on automated driving in each country.

Claims (4)

1. An infrared camera system provided in a vehicle, the infrared camera system comprising:
an infrared irradiation unit configured to emit infrared rays toward a side area of the vehicle;
an infrared camera configured to acquire infrared image data representing a surrounding environment of a side area of the vehicle;
wherein the irradiation distance of the infrared rays emitted from the infrared irradiation unit is longer than the detection distance of the infrared camera.
2. The infrared camera system as set forth in claim 1,
the infrared camera is mounted on a rear pillar of the vehicle.
3. The infrared camera system as set forth in claim 1 or 2,
the infrared irradiation unit includes:
a left infrared irradiation unit mounted on a left rear pillar of the vehicle;
a right infrared irradiation unit mounted on a right rear pillar of the vehicle;
the infrared camera includes:
a left infrared camera mounted on the left rear pillar;
a right infrared camera mounted on the right rear pillar;
the irradiation distance of the infrared rays emitted from the left infrared ray irradiation unit is longer than the detection distance of the left infrared ray camera,
the irradiation distance of the infrared rays emitted from the right infrared ray irradiation unit is longer than the detection distance of the right infrared ray camera.
4. A vehicle provided with the infrared camera system according to any one of claims 1 to 3.
CN201921964211.1U 2018-11-14 2019-11-13 Infrared camera system and vehicle Active CN211468304U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018213869 2018-11-14
JP2018-213869 2018-11-14

Publications (1)

Publication Number Publication Date
CN211468304U true CN211468304U (en) 2020-09-11

Family

ID=72375028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201921964211.1U Active CN211468304U (en) 2018-11-14 2019-11-13 Infrared camera system and vehicle

Country Status (1)

Country Link
CN (1) CN211468304U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111186377A (en) * 2018-11-14 2020-05-22 株式会社小糸制作所 Infrared camera system, infrared camera module, and vehicle


Similar Documents

Publication Publication Date Title
US10479269B2 (en) Lighting apparatus for vehicle and vehicle having the same
CN109843676B (en) Automatic parking assist device and vehicle comprising same
US11242068B2 (en) Vehicle display device and vehicle
EP3663134B1 (en) Vehicular lighting system and vehicle
EP3888965B1 (en) Head-up display, vehicle display system, and vehicle display method
JP7187291B2 (en) Infrared camera system and vehicle
EP3929041B1 (en) Dirt detection system and vehicle
US10882465B2 (en) Vehicular camera apparatus and method
CN111186377B (en) Infrared camera system, infrared camera module, and vehicle
US11699250B1 (en) System and method for low visibility driving
US20220365345A1 (en) Head-up display and picture display system
US20220126792A1 (en) Sensing system for vehicle and vehicle
CN211468304U (en) Infrared camera system and vehicle
CN211468307U (en) Infrared camera module and vehicle
CN211468306U (en) Infrared camera module and vehicle
CN211468305U (en) Infrared camera system and vehicle
CN211468308U (en) Infrared camera system and vehicle
CN211468303U (en) Infrared camera system and vehicle
CN211468302U (en) Infrared camera system and vehicle
US20220206153A1 (en) Vehicular sensing system and vehicle
JP7382344B2 (en) Infrared camera systems and vehicles
CN113557386A (en) Vehicle lamp and vehicle
US20230184902A1 (en) Vehicular light source system, vehicular sensing system, and vehicle
KR20210100345A (en) Electronic device of vehicle for obtaining an image by controlling a plurality of light sources and operating method thereof

Legal Events

Date Code Title Description
GR01 Patent grant