CN112514361A - Vehicle-mounted camera and drive control system using same - Google Patents

Info

Publication number
CN112514361A
CN112514361A (Application No. CN201980050640.4A)
Authority
CN
China
Prior art keywords
external space
drive control
visible light
infrared light
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201980050640.4A
Other languages
Chinese (zh)
Other versions
CN112514361B (en)
Inventor
重松一真
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Publication of CN112514361A publication Critical patent/CN112514361A/en
Application granted granted Critical
Publication of CN112514361B publication Critical patent/CN112514361B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/208Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00Filters or other obturators specially adapted for photographic purposes
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00Filters or other obturators specially adapted for photographic purposes
    • G03B11/04Hoods or caps for eliminating unwanted light from lenses, viewfinders or focusing aids
    • G03B11/045Lens hoods or shields
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/006Apparatus mounted on flying objects
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B3/00Focusing arrangements of general interest for cameras, projectors or printers
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B3/00Focusing arrangements of general interest for cameras, projectors or printers
    • G03B3/10Power-operated focusing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/52Elements optimising image sensor operation, e.g. for electromagnetic interference [EMI] protection or temperature control by heat transfer or cooling elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R2011/0001Arrangements for holding or mounting articles, not otherwise provided for characterised by position
    • B60R2011/0003Arrangements for holding or mounting articles, not otherwise provided for characterised by position inside the vehicle
    • B60R2011/0026Windows, e.g. windscreen
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2217/00Details of cameras or camera bodies; Accessories therefor
    • G03B2217/002Details of arrangement of components in or on camera body

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Blocking Light For Cameras (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)

Abstract

To provide an in-vehicle camera capable of capturing a high-quality image. [Solution] A vehicle-mounted camera includes an imaging element, a housing, and an optical system. The housing includes: an accommodating portion that accommodates the imaging element; an outer surface exposed to an external space; an opening that communicates the accommodating portion with the external space; and a functional portion that forms at least a part of the outer surface and that, of the light incident from the external space, absorbs the visible light and reflects the infrared light. The optical system images light incident on the opening from the external space onto the imaging element.

Description

Vehicle-mounted camera and drive control system using same
Technical Field
The present technology relates to an in-vehicle camera and a drive control system using the same.
Background
A technique is known in which a front camera captures an image of the scenery in front of an automobile and the image is used to control the driving of the automobile. In this technique, the driving of the automobile is controlled based on, for example, the position and movement of an object detected in an image taken by the front camera. Accurate drive control therefore requires that the front camera capture a high-quality image.
An imaging element is used in an image capturing apparatus such as a front camera. When the imaging element moves out of the depth of focus because a structural member thermally expands with a rise in temperature, the image capturing apparatus captures a low-resolution image. Therefore, in order to capture a high-resolution image, the image capturing apparatus needs a configuration in which the temperature is less likely to rise.
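A back-of-the-envelope calculation illustrates why a temperature rise matters here. All numbers below are illustrative assumptions, not values from the patent: an aluminum structural member of assumed length 20 mm between lens and sensor, and an assumed 20 K temperature rise.

```python
# Illustrative sketch: how far thermal expansion can shift the imaging
# element relative to the lens. Every numeric value is an assumption.
ALPHA_AL = 23e-6        # 1/K, linear expansion coefficient of aluminium
spacing_m = 20e-3       # assumed lens-to-sensor structural length: 20 mm
delta_t_k = 20          # assumed temperature rise: 20 K

shift_m = ALPHA_AL * spacing_m * delta_t_k
print(f"focal-plane shift: {shift_m * 1e6:.1f} um")  # -> 9.2 um
```

A shift of this magnitude easily exceeds a depth of focus of a few micrometers, which is why a temperature rise translates directly into a loss of resolution.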
Patent document 1 discloses a technique that can suppress a temperature rise of an image capturing apparatus. According to the technique disclosed in patent document 1, the image capturing apparatus is connected to the metal holder using a heat transfer member. Therefore, heat generated by the internal chip is discharged into the holder through the heat transfer member, and this makes it possible to suppress a temperature rise of the image capturing apparatus.
CITATION LIST
Patent document
Patent Document 1: Japanese Patent Application Laid-open No. 2016-
Disclosure of Invention
Technical problem
Sunlight passes through the windshield and strikes, unobstructed, the housing of a front camera arranged on the inner side of the windshield. The temperature inside the front camera is therefore particularly likely to rise. Further, when sunlight reflected off the housing of the front camera enters the lens, lens flare such as ghosting is easily caused in the front camera.
As described above, with a front camera on which sunlight is easily incident, the quality of a captured image is likely to be degraded by a loss of resolution caused by a temperature rise, or by lens flare. The front camera therefore needs to be able to capture a high-quality image even in an environment where sunlight easily enters it.
In view of the above, it is an object of the present technology to provide an in-vehicle camera capable of capturing a high-quality image and a drive control system using the same.
Solution to the problem
In order to achieve the above object, an in-vehicle camera according to an embodiment of the present technology includes an imaging element, a housing, and an optical system.
The housing includes: an accommodating portion accommodating the imaging element; an outer surface exposed to an external space; an opening communicating the accommodating portion and the external space with each other; and a functional portion that forms at least a part of the outer surface, the functional portion absorbing visible light of light incident from the external space and reflecting infrared light of the light incident from the external space.
The optical system images light incident on the opening from the external space onto the imaging element.
In the vehicle-mounted camera, visible light incident on the functional portion, which forms at least a part of the outer surface of the housing, is absorbed. This suppresses the occurrence of reflected visible light on the outer surface. Reflected visible light is therefore less likely to enter the lens, and lens flare is less likely to appear in a captured image.
In addition, since infrared light incident on the functional portion is reflected off it, a temperature increase caused by the housing absorbing infrared light is less likely to occur. The structural members are thus less likely to expand thermally, so the position of the optical system relative to the imaging element is less likely to shift, and the resolution of a captured image is less likely to be reduced.
As described above, the vehicle-mounted camera is capable of capturing high-quality images.
The functional portion may have a stacked structure including an infrared light reflecting layer that reflects infrared light and a visible light absorbing layer that absorbs visible light.
The visible light absorbing layer may be situated outside the infrared light reflecting layer, with infrared light transmitted through the visible light absorbing layer.
Alternatively, the infrared light reflecting layer may be situated outside the visible light absorbing layer, with visible light transmitted through the infrared light reflecting layer.
Either of these in-vehicle cameras provides a configuration in which visible light is absorbed into the functional portion having the stacked structure and infrared light is reflected off it.
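Idealized, the behavior of the functional portion reduces to a band decision over wavelength. The following toy model (a sketch, treating both layers as ideal band filters; the band limits follow the 380–780 nm and 780–2500 nm ranges given later in the description) shows the intended fate of incident light:

```python
# Toy spectral model of the functional portion: the visible-light
# absorbing layer and the infrared reflecting layer are idealized as
# perfect band filters over wavelength.
VISIBLE = (380, 780)     # nm, visible band per the description
INFRARED = (780, 2500)   # nm, infrared band per the description

def functional_portion(wavelength_nm):
    """Return the fate of light hitting the outer surface:
    'absorbed', 'reflected', or 'passed' (outside both bands)."""
    if VISIBLE[0] <= wavelength_nm < VISIBLE[1]:
        return "absorbed"   # visible light absorbed -> no lens flare
    if INFRARED[0] <= wavelength_nm <= INFRARED[1]:
        return "reflected"  # infrared reflected -> no heat absorbed
    return "passed"

print(functional_portion(550))   # green light -> absorbed
print(functional_portion(1000))  # near-infrared -> reflected
```

The two branches correspond directly to the two effects claimed: suppression of reflected visible light (flare) and rejection of infrared heating.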
The optical system may have a fixed focus.
As described above, the temperature is less likely to rise in the in-vehicle camera. Therefore, even if the optical system has a fixed focus, it is unlikely that the resolution of the captured image is reduced.
The housing may include a plurality of openings.
The in-vehicle camera may further include a plurality of imaging elements corresponding to the plurality of openings and a plurality of optical systems.
The optical system may include a plastic lens.
In the vehicle-mounted camera, since the temperature is less likely to rise, a plastic lens with low heat resistance can be used. This makes it possible to reduce the manufacturing cost of the in-vehicle camera.
The drive control system according to the embodiment of the present technology is capable of controlling the drive of a movable body including a windshield, and includes an imaging element, a housing, an optical system, a processing unit, an information generator, and a drive controller.
The imaging element takes an original image.
The housing includes: an accommodating portion accommodating the imaging element; an outer surface exposed to an external space; an opening communicating the accommodating portion and the external space with each other; and a functional portion that forms at least a part of the outer surface, the functional portion absorbing visible light of light incident from the external space and reflecting infrared light of the light incident from the external space.
The optical system images light incident on the opening from the external space onto the imaging element.
The processing unit includes: an image processor that performs image processing on the original image to generate a processed image; a recognition processor that performs a recognition process on the processed image to recognize an object; and a calculation processor that calculates object information about the object using the processed image.
The information generator generates drive control information related to control of drive of the movable body based on a result of the processing performed by the processing unit.
The drive controller controls driving of the movable body based on the drive control information.
In the drive control system, a high-quality image can be captured using an in-vehicle camera. Therefore, the drive of the movable body can be controlled more accurately.
The processing unit may further include a mapping processor that creates a digital map using the processed image and the object information.
The processing unit may further include a path planning section that determines a travel route of the movable body using the digital map.
In the drive control system, a high-quality image can be captured using an in-vehicle camera. Therefore, more complicated drive control can be performed on the movable body.
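The processing chain the description outlines — image processing, recognition, calculation of object information, generation of drive control information, and drive control — can be sketched as follows. All class and function names here are hypothetical stand-ins, not identifiers from the patent; the distance threshold is an invented illustration.

```python
# Minimal sketch (all names and values hypothetical) of the flow:
# raw image -> image processor -> recognition processor ->
# calculation processor -> information generator -> drive controller.
from dataclasses import dataclass

@dataclass
class DriveControlInfo:
    brake: bool
    reason: str

def image_processor(raw):
    # image processing on the original image (e.g. demosaic, gamma)
    return {"frame": raw}

def recognition_processor(processed):
    # recognition processing on the processed image (stubbed)
    return ["preceding_vehicle"]

def calculation_processor(processed, objects):
    # calculate object information, e.g. following distance in metres
    return {"preceding_vehicle": {"distance_m": 18.0}}

def information_generator(object_info, min_gap_m=25.0):
    # generate drive control information from the processing results
    gap = object_info["preceding_vehicle"]["distance_m"]
    return DriveControlInfo(brake=gap < min_gap_m,
                            reason=f"gap {gap} m vs minimum {min_gap_m} m")

def drive_controller(info):
    # control the driving of the movable body from the information
    return "apply brake" if info.brake else "maintain speed"

raw_image = object()  # stands in for the imaging element's output
processed = image_processor(raw_image)
objects = recognition_processor(processed)
info = calculation_processor(processed, objects)
ctrl = information_generator(info)
print(drive_controller(ctrl))  # -> apply brake
```

The point of the sketch is the dependency order: each stage consumes only the previous stage's output, so a higher-quality input image improves every downstream decision.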
Drawings
FIG. 1 is a perspective view of an automobile including an onboard camera in accordance with embodiments of the present technology.
Fig. 2 is a perspective view of the in-vehicle camera.
Fig. 3 is a sectional view of the in-vehicle camera taken along line A-A' of fig. 2.
Fig. 4 is a set of partial sectional views for describing a functional section of the in-vehicle camera.
Fig. 5 is a set of partial sectional views showing a configuration example of the functional portion.
Fig. 6 is a set of partial sectional views showing a configuration example of the functional portion.
Fig. 7 is a perspective view showing another embodiment of the in-vehicle camera.
Fig. 8 is a block diagram showing a configuration that makes it possible to implement a driving assistance function in a drive control system according to an embodiment of the present technology.
Fig. 9 is a flowchart showing a drive control method executed by the drive control system.
Fig. 10 is a diagram for describing an example of a method for calculating a following distance to a preceding vehicle, which is executed by a calculation processor of the drive control system.
Fig. 11 is a block diagram showing a configuration that makes it possible to implement an automatic driving function in the drive control system.
Fig. 12 is a flowchart showing a drive control method executed by the drive control system.
Detailed Description
Embodiments of the present technology will now be described below with reference to the accompanying drawings.
[ vehicle-mounted camera 1]
(Overall Structure)
Fig. 1 is a perspective view of an automobile M including the in-vehicle camera 1 according to an embodiment of the present technology. The automobile M includes a windshield (front window) M01 disposed at the front as a transparent glass window, a rear window M02 disposed at the rear, and side windows M03 disposed on opposite sides.
The in-vehicle camera 1 is a front sensing camera attached to the inner side of the windshield M01. The in-vehicle camera 1 is disposed at an upper portion of a central region in the width direction of the windshield M01. This enables the onboard camera 1 to successfully capture an image of the scenery in front of the automobile M without obstructing the driver's sight.
In order to realize its driving functions, the automobile M including the vehicle-mounted camera 1 is provided with a driving-force generation mechanism M11 (including, for example, an engine or a motor), a brake mechanism M12, a steering mechanism M13, and the like. Further, the automobile M may include, for example, a surrounding-information detector for detecting surrounding information and a positioning section for generating position information.
Fig. 2 is a perspective view of the in-vehicle camera 1 before it is attached to the windshield M01. Fig. 3 is a sectional view of the vehicle-mounted camera 1, attached to the windshield M01, taken along line A-A' of fig. 2. In other words, fig. 3 shows a longitudinal section of the onboard camera 1 in the front-rear direction, taken at the central portion in the width direction of the onboard camera 1.
The in-vehicle camera 1 includes a housing 10 forming an outer appearance of the in-vehicle camera 1. The housing 10 includes a hollow portion 11, an extending portion 12, and side wall portions 13, the hollow portion 11 being a hollow rectangular parallelepiped, the extending portion 12 extending forward from a lower portion of the hollow portion 11, the side wall portions 13 being arranged on opposite sides in a width direction of the extending portion 12. The upper surface of the side wall portion 13 of the vehicle-mounted camera 1 is joined to the inner surface of the windshield M01.
In the hollow portion 11, an accommodating portion 14 is formed as the internal space of the hollow portion 11. As shown in fig. 3, a shielding portion 15 is formed above the extending portion 12; the shielding portion 15 is an external space closed off by the windshield M01. An opening 16 that faces the windshield M01 and communicates the accommodating portion 14 and the shielding portion 15 with each other is formed in the hollow portion 11.
The shielding portion 15 is surrounded by the front surface of the hollow portion 11, the upper surface of the extending portion 12, and the inner side surfaces of the side wall portions 13, and is shielded from light everywhere except through the windshield M01. The housing 10 thus admits to the opening 16, which connects the shielding portion 15 to the accommodating portion 14, only light that has been transmitted through the windshield M01.
Further, the onboard camera 1 includes a circuit board 20 and an imaging element 21. The circuit board 20 is disposed on the bottom surface of the accommodating portion 14. The imaging element 21 is oriented forward via a connection board 21a standing vertically on the circuit board 20. Note that the imaging element 21 may instead be mounted directly on the circuit board 20.
The imaging element 21 is not limited to a specific type. For example, a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), or the like may be used as the imaging element 21. Various ceramic substrates and plastic substrates may be used as the circuit board 20 and the connection board 21 a.
Further, various components for realizing functions required for the in-vehicle camera 1 may be mounted on the circuit board 20 in addition to the imaging element 21. For example, a built-in communication portion for transmitting a captured image to another structural component included in the automobile M, an image processor for performing image processing on the captured image, or the like may be mounted on the circuit board 20.
The onboard camera 1 includes an optical system 30, and the optical system 30 includes a lens 31 and has a fixed focal point. The lens 31 is attached to the front side of the peripheral portion of the opening 16 in the hollow portion 11 by a frame 31a that holds the outer periphery of the lens 31. Therefore, only light transmitted through the lens 31 adjacent to the front of the opening 16 is incident on the opening 16.
The optical system 30 is configured such that light incident on the opening 16 is imaged onto the light-receiving surface of the imaging element 21. In addition to lenses, the optical system 30 may include optical components such as mirrors or prisms. This makes it possible to guide light incident on the lens 31 to the imaging element 21 regardless of how the imaging element 21 is arranged.
The housing 10 includes a functional portion 40, and the functional portion 40 forms at least a portion of an outer surface of the housing 10 exposed to the external space. Specifically, in the housing 10, the front surface of the hollow portion 11 surrounding the shielding portion 15, the upper surface of the extending portion 12, and the inner side surface of the side wall portion 13 are formed by the functional portion 40. The functional portion 40 has a function of suppressing the occurrence of reflected light and suppressing a temperature rise.
Note that the present technology is particularly advantageous for a configuration in which the imaging element 21 measures 4.32 mm in height and 8.64 mm in width (the 1/1.7 type), the imaging element 21 has several million pixels or more (in particular, seven million pixels or more), and the allowable deviation of the focal position of the optical system 30 is several micrometers. It is likewise particularly advantageous for a configuration in which the imaging element 21 has a higher pixel density than a 1/1.7-type imaging element 21 with seven million pixels, with the same few-micrometer allowance on the focal position of the optical system 30.
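To see why those figures make the few-micrometer focal tolerance so tight, one can estimate the pixel pitch. This is illustrative arithmetic only, assuming the pixels fill the sensor in its 2:1 aspect ratio (an assumption, since the patent does not state the pixel grid):

```python
import math

# Sensor dimensions and pixel count from the 1/1.7-type example in the text.
width_mm, height_mm = 8.64, 4.32
pixels = 7_000_000

# Assume pixels tile the full 2:1 sensor area (assumption for illustration).
cols = math.sqrt(pixels * width_mm / height_mm)   # ~3742 columns
rows = pixels / cols                              # ~1871 rows
pitch_um = width_mm * 1000 / cols                 # pitch in micrometres

print(f"{pitch_um:.2f} um pixel pitch")  # -> 2.31 um pixel pitch
```

With a pixel pitch of roughly 2.3 µm, a focal-plane shift of even a few micrometers spreads a point image over multiple pixels, which is why the allowable deviation of the focal position is only several micrometers.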
Fig. 4 is a set of partial sectional views of the in-vehicle camera 1 showing a part around the shielding portion 15. As shown in (a) of fig. 4, the functional portion 40 is configured such that visible light among incident light is absorbed into the functional portion 40. In other words, the occurrence of reflected light of visible light is suppressed in the functional portion 40. Therefore, the reflected light of the visible light is less likely to be incident on the lens 31 in the in-vehicle camera 1.
In particular, in the vehicle-mounted camera 1, since the outer surface of the housing 10 surrounding the shielding portion 15 to which the lens 31 is exposed is formed as the functional portion 40, it is possible to effectively prevent reflected light of visible light incident on the lens 31. Therefore, it is less likely that lens flare is caused in the captured image in the onboard camera 1.
Further, as shown in (B) of fig. 4, the functional portion 40 is configured such that infrared light in the incident light is reflected off the functional portion 40. In other words, the in-vehicle camera 1 can release infrared light incident on the functional portion 40 back into the external space. As a result, a temperature rise of the housing 10 due to absorption of infrared light can be suppressed.
Therefore, in the in-vehicle camera 1, the position of the imaging element 21 is maintained within the focal depth of the optical system 30. Therefore, the resolution of the captured image is less likely to be reduced. In addition, a component with low heat resistance may be used in the in-vehicle camera 1, and, for example, an inexpensive plastic lens may be used as the lens 31.
Further, as described above, in the vehicle-mounted camera 1, light does not enter the shielding portion 15 from anywhere other than the windshield M01. Light reflected from below or from the sides by the surrounding environment therefore does not enter the shielding portion 15, which further suppresses the occurrence of lens flare in a captured image.
(details of the functional section 40)
Figs. 5 and 6 are sets of enlarged partial sectional views of the functional portion 40 in the housing 10 of the vehicle-mounted camera 1. They schematically show configuration examples that realize the function of absorbing visible light into the functional portion 40 and reflecting infrared light away from it. Note that the configuration of the functional portion 40 is not limited to the examples shown in figs. 5 and 6, and various modifications may be made.
The functional portion 40 shown in (A) of fig. 5 has a stacked structure including a visible light absorbing layer 41a and an infrared light reflecting layer 42a. In this functional portion 40, the infrared light reflecting layer 42a is stacked on the housing 10, and the visible light absorbing layer 41a is stacked on the infrared light reflecting layer 42a. In other words, the visible light absorbing layer 41a is arranged outside the infrared light reflecting layer 42a.
The visible light absorbing layer 41a is configured such that visible light (light with wavelengths of about 380 nm to 780 nm) is absorbed in it and infrared light (light with wavelengths of about 780 nm to 2500 nm) is transmitted through it. A known configuration may be used for the visible light absorbing layer 41a; for example, it may be formed of a black paint that is transmissive in the infrared region.
The infrared light reflecting layer 42a is configured such that infrared light is reflected off it. A known configuration may be used; for example, the infrared light reflecting layer 42a may be a mirror-finished metal plate or a metal film vapor-deposited onto the housing 10. The metal plate or deposited film may be formed of aluminum, for example.
With this configuration, in the functional portion 40 shown in (A) of fig. 5, visible light is absorbed into the visible light absorbing layer 41a, and infrared light transmitted through the visible light absorbing layer 41a is reflected off the infrared light reflecting layer 42a. This functional portion 40 thus realizes the function of absorbing visible light and reflecting infrared light.
The functional portion 40 shown in (B) of fig. 5 omits the infrared light reflecting layer 42a shown in (A) of fig. 5; the housing 10 itself functions as the infrared light reflecting layer. That is, when the housing 10 is made of mirror-finished aluminum, for example, the housing 10 itself can reflect infrared light away.
In the functional sections 40 shown in (a) and (b) of fig. 5, the infrared light radiated by the black visible light absorbing layer 41a itself is also reflected off the infrared light reflecting layer 42a and released into the external space. Therefore, the temperature rise of the in-vehicle camera 1 is further suppressed by the radiative cooling effect of the visible light absorbing layer 41a itself.
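The magnitude of such a radiative cooling effect can be estimated with the standard gray-body radiation formula; the numbers below (surface area, emissivity, temperatures) are purely illustrative assumptions, not values from the patent.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def net_radiated_power(area_m2, emissivity, t_surface_k, t_ambient_k):
    """Net thermal power radiated by the camera's outer surface into the
    surroundings (gray-body approximation)."""
    return emissivity * SIGMA * area_m2 * (t_surface_k**4 - t_ambient_k**4)

# e.g. a 100 cm^2 black surface (emissivity ~0.95) at 350 K in a 300 K cabin
p_watts = net_radiated_power(0.01, 0.95, 350.0, 300.0)
```

This yields a few watts for the assumed values, which is why a black, infrared-radiating outer surface meaningfully helps shed heat from a small sealed camera housing.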
The functional section 40 shown in (a) of fig. 6 has a stacked structure including a visible light absorbing layer 41b and an infrared light reflecting layer 42b. In this functional section 40, the visible light absorbing layer 41b is stacked on the case 10, and the infrared light reflecting layer 42b is stacked on the visible light absorbing layer 41b. In other words, the visible light absorbing layer 41b is disposed inward of the infrared light reflecting layer 42b.
The infrared light reflecting layer 42b is configured such that infrared light is reflected off it and visible light is transmitted through it. A known configuration may be used for the infrared light reflecting layer 42b; for example, it may be a dielectric multilayer film that is transmissive in the visible region and reflective in the infrared region.
The visible light absorbing layer 41b is configured such that visible light is absorbed in it. A known configuration may be used for the visible light absorbing layer 41b; for example, it may be formed of a black paint or a black plastic film. Further, the visible light absorbing layer 41b may be a black layer formed by surface-treating the case 10.
Due to this configuration, in the functional section 40 shown in (a) of fig. 6, infrared light is reflected off the infrared light reflecting layer 42b, and visible light transmitted through the infrared light reflecting layer 42b is absorbed by the visible light absorbing layer 41b. As described above, this functional section 40 makes it possible to realize a function of absorbing visible light and reflecting infrared light.
The functional section 40 shown in (b) of fig. 6 does not include the visible light absorbing layer 41b shown in (a) of fig. 6; instead, the case 10 itself serves as the visible light absorbing layer. In other words, when the case 10 is made of, for example, black plastic, the case 10 itself provides the function of absorbing visible light.
(Another configuration example of the vehicle-mounted camera 1)
The configuration of the in-vehicle camera 1 is not limited to the above configuration, and various modifications may be made thereto. For example, as shown in fig. 7, the housing 10 may include a plurality of openings 16, and the optical system 30 and the imaging element 21 may be provided for each of the plurality of openings 16. In this case, for example, the angle of view may be different for each optical system 30 in the in-vehicle camera 1.
Further, the functional section 40 is not limited to the above arrangement. The above-described effect is provided as long as the functional section 40 is disposed on at least a part of the outer surface of the housing 10. In particular, the functional section 40 may be disposed over the entire outer surface of the housing 10. This gives the vehicle-mounted camera 1 an excellent appearance and also provides a cooling effect due to radiation from the entire outer surface of the housing 10, which makes it possible to suppress a temperature rise of the in-vehicle camera 1 more effectively.
Note that, although the functional section 40 need not be arranged over the entire outer surface of the housing 10, it is advantageous for the entire outer surface of the housing 10 of the onboard camera 1 to be black. This gives the in-vehicle camera 1 an excellent appearance and makes it less likely to be affected by incident sunlight, which in particular makes it possible to capture high-quality images.
In addition, the vehicle-mounted camera 1 may be attached not only to the windshield M01 but also to the rear window M02 as a rear sensing camera. Further, the in-vehicle camera 1 may be used for viewing, for example, instead of sensing. In this case, high-quality video can be displayed and recorded using the in-vehicle camera 1.
Further, the in-vehicle camera 1 is not necessarily directly joined to the inner surface of the windshield M01, and for example, the in-vehicle camera 1 may be fixed to the ceiling of the automobile M by a bracket or the like. Further, the vehicle-mounted camera 1 may have a configuration in which the shielding portion 15 is not formed, and for example, the vehicle-mounted camera 1 may be integrated with a rear view mirror.
In addition, the onboard camera 1 is applicable not only to the automobile M but also to various movable bodies. Examples of the movable body to which the onboard camera 1 is applied include an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal moving body, an airplane, an unmanned aerial vehicle, a ship, a robot, a construction machine, and an agricultural machine (tractor).
[ drive control System 100]
(brief description)
The drive control system 100 according to the embodiment of the present disclosure is a system for controlling the drive of the automobile M using the above-described in-vehicle camera 1. Specifically, the drive control system 100 controls the driving force generation mechanism M11, the brake mechanism M12, the steering mechanism M13, and the like of the automobile M using images captured with the onboard camera 1.
The drive control system 100 may have a configuration corresponding to a function required for the automobile M. Specifically, examples of the functions that can be realized by the drive control system 100 include a driving assist function and an automatic driving function. The configuration of the drive control system 100 capable of realizing the driving assist function and the automatic driving function is described below.
(Driving assistance function)
The driving assist function is generally a function of an Advanced Driver-Assistance System (ADAS), including collision avoidance, impact mitigation, follow-up traveling (maintaining a following distance), vehicle-speed-maintaining traveling, collision warning, and lane departure warning. The drive control system 100 may be configured such that these driving assist functions can be realized.
Fig. 8 is a block diagram showing the configuration of the drive control system 100 that makes it possible to implement the driving assistance function. The drive control system 100 includes the in-vehicle camera 1, a processor 110, an information generator 120, and a drive controller 130. The processor 110 includes an image processor 111, a recognition processor 112, and a calculation processor 113.
The respective structural components of the drive control system 100 are connected to each other through a communication network. The communication network may be, for example, an in-vehicle communication network conforming to any standard, such as a Controller Area Network (CAN), a Local Interconnect Network (LIN), a Local Area Network (LAN), or FlexRay (registered trademark).
Fig. 9 is a flowchart illustrating a drive control method performed by the drive control system 100 illustrated in fig. 8. The drive control method shown in fig. 9 includes step ST11 of image capturing, step ST12 of image processing, step ST13 of recognition processing, step ST14 of object information calculation, step ST15 of drive control information generation, and step ST16 of drive control signal output.
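The flow of steps ST11 through ST16 can be sketched as a chain of stages passing data forward; the stage functions below are stubs standing in for the actual camera and processors, and all names and values are illustrative assumptions.

```python
# Stub stages standing in for steps ST11-ST16 of fig. 9.
def capture():            return {"raw": "scene"}                        # ST11
def process_image(d):     return {**d, "processed": True}                # ST12
def recognize(d):         return {**d, "objects": ["vehicle", "lane"]}   # ST13
def calc_object_info(d):  return {**d, "info": {"gap_m": 25.0}}          # ST14
def generate_control(d):  return {**d, "control": "keep_speed"}          # ST15
def output_signal(d):     return d["control"]                            # ST16

def run_cycle():
    """One drive-control cycle: each stage consumes the accumulated data."""
    d = capture()
    for stage in (process_image, recognize, calc_object_info, generate_control):
        d = stage(d)
    return output_signal(d)
```

In the real system these stages are split across the in-vehicle camera 1, the processor 110, the information generator 120, and the drive controller 130, connected by the in-vehicle network; the sketch only shows the data dependency order.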
In step ST11 of the image capturing, the in-vehicle camera 1 captures an image of the scene in front of the automobile M through the windshield M01 to generate an original image of the scene. As described above, due to the function of the functional section 40, a high-quality original image is obtained with the in-vehicle camera 1. The in-vehicle camera 1 transmits the original image to the processor 110 using, for example, a built-in communication part mounted on the circuit board 20.
The processor 110 typically includes an Electronic Control Unit (ECU), and processes an original image generated by the in-vehicle camera 1. More specifically, in the processor 110, the image processor 111 performs step ST12 of the image processing, the recognition processor 112 performs step ST13 of the recognition processing, and the calculation processor 113 performs step ST14 of the object information calculation.
In step ST12 of the image processing, the image processor 111 performs image processing on the original image to generate a processed image. The image processing performed by the image processor 111 is generally processing performed to make it easy to recognize an object in an original image, and examples of the image processing performed by the image processor 111 include automatic exposure control, automatic white balance adjustment, and high dynamic range combination.
Note that, in step ST12 of the image processing, at least a part of the image processing may be performed by an image processor mounted on the circuit board 20 of the in-vehicle camera 1. Note that, when the image processor of the in-vehicle camera 1 performs all the image processing of step ST12 of the image processing, the processor 110 does not necessarily include the image processor 111.
In step ST13 of the recognition process, the recognition processor 112 performs a recognition process on the processed image to recognize an object in the processed image. Note that the objects recognized by the recognition processor 112 are not limited to three-dimensional objects, and examples of recognized objects include a vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a road lane, and a roadside.
In step ST14 of the object information calculation, the calculation processor 113 calculates object information about the object in the processed image. Examples of the object information calculated by the calculation processor 113 include the shape of the object, the distance to the object, and the moving direction and moving speed of the object. The calculation processor 113 calculates dynamic object information using a plurality of temporally successive processed images.
As an example of the method for calculating the object information executed by the calculation processor 113, a method for calculating the following distance to the preceding automobile MF is described. Fig. 10 shows an example of a processed image G generated by the image processor 111. In the processed image G shown in fig. 10, a preceding automobile MF and two lanes L1 and L2 defining a driving lane appear.
First, a vanishing point V at which the two lanes L1 and L2 intersect is obtained in the processed image G. Note that the vanishing point V can also be obtained from objects other than the lanes L1 and L2. For example, the calculation processor 113 may obtain the vanishing point V using the motion trajectories, across a plurality of processed images, of fixed objects such as roadside features or traffic signs.
Next, a distance D0 (a size in the up-down direction of the image) from the lower edge G1 of the processed image to the vanishing point V and a distance D1 (likewise a size in the up-down direction of the image) from the lower edge G1 of the processed image to the preceding automobile MF are obtained. The following distance to the preceding automobile MF can then be obtained using the distances D0 and D1; for example, it can be calculated using the ratio of the distance D0 to the distance D1.
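As a sketch, the D0/D1 measurement above can be turned into a distance estimate under a flat-road pinhole-camera assumption, in which the ground distance is inversely proportional to the pixel offset of the preceding car below the vanishing point. The calibration constant k (absorbing camera height and focal length) and the exact formula are assumptions on my part; the patent states only that the ratio of D0 to D1 is used.

```python
def follow_distance(d0, d1, k):
    """Estimate the following distance to the preceding car MF.

    d0: image distance from lower edge G1 to vanishing point V (pixels)
    d1: image distance from lower edge G1 to the preceding car (pixels)
    k : calibration constant (assumed; depends on camera height/focal length)
    """
    if not 0 <= d1 < d0:
        raise ValueError("car must appear between lower edge and vanishing point")
    # k / (1 - d1/d0) is proportional to 1 / (d0 - d1): the closer the car
    # appears to the vanishing point, the farther away it is on the road.
    return k / (1.0 - d1 / d0)
```

Note the monotonicity this encodes: a smaller d1 (car lower in the image) gives a shorter estimated following distance, matching everyday driving intuition.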
The processor 110 transfers data including the processed images and the object information obtained in steps ST12 to ST14 to the information generator 120. Note that the processor 110 is not limited to the above configuration, and for example, the processor 110 may include structural components other than the image processor 111, the recognition processor 112, and the calculation processor 113.
In step ST15 of the drive control information generation, the information generator 120 generates drive control information including details of driving required of the automobile M. More specifically, the information generator 120 determines the driving details to be executed by the automobile M based on the data transmitted by the processor 110, and generates drive control information including those driving details.
Examples of the driving details of the automobile M include a change in speed (acceleration or deceleration) and a change in traveling direction. As specific examples: when the following distance from the automobile M to the preceding automobile MF is small, the information generator 120 determines that the automobile M should decelerate, and when the automobile M is likely to deviate from its lane, the information generator 120 determines that the traveling direction should be changed so that the automobile M moves toward the center of the lane.
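The two determinations in the paragraph above can be sketched as a toy rule set; the thresholds and return labels are illustrative assumptions, not values from the patent.

```python
def decide_driving_details(gap_m, lane_offset_m,
                           min_gap_m=20.0, max_offset_m=0.5):
    """Toy decision logic mirroring the two examples in the text:
    decelerate when the following gap is small, and steer back toward the
    lane center when the lateral offset suggests a lane departure."""
    details = []
    if gap_m < min_gap_m:                   # following distance too small
        details.append("decelerate")
    if abs(lane_offset_m) > max_offset_m:   # likely lane departure
        details.append("steer_to_lane_center")
    return details or ["maintain"]
```

A production information generator would of course weigh many more inputs (speed, recognized objects, road geometry); the sketch only shows how object information maps to driving details.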
The information generator 120 transmits the drive control information to the drive controller 130. Note that the information generator 120 may also generate information other than the drive control information. For example, the information generator 120 may detect the brightness of the surrounding environment from the processed image and, when the surrounding environment is dark, generate information on lighting control for turning on the headlamps of the automobile M.
In step ST16 of the drive control signal output, the drive controller 130 outputs the drive control signal based on the drive control information. For example, the drive controller 130 may accelerate the automobile M using the drive force generation mechanism M11, decelerate the automobile M using the brake mechanism M12, and change the traveling direction of the automobile M using the steering mechanism M13.
(automatic driving function)
The automatic driving function is a function that enables the vehicle M to automatically travel without an operation by the driver. In order to realize the automatic driving function, more complicated driving control is required than in the case of the driving assistance function. By using the in-vehicle camera 1 capable of generating a high-quality original image, the drive control system 100 can perform complicated drive control more accurately, thereby making it possible to realize an automatic driving function.
Fig. 11 is a block diagram showing the configuration of the drive control system 100 that makes it possible to implement the automatic driving function. In addition to the respective structural components shown in fig. 8, the drive control system 100 includes a mapping processor 114 and a path planning section 115 included in the processor 110. Description of structural components similar to those shown in fig. 8 is omitted below as appropriate.
Fig. 12 is a flowchart illustrating a drive control method performed by the drive control system 100 illustrated in fig. 11. The drive control method shown in fig. 12 includes, in addition to the respective steps shown in fig. 9, step ST21 of the mapping process performed by the mapping processor 114 and step ST22 of the path planning performed by the path planning section 115.
As shown in fig. 12, step ST21 of the mapping process and step ST22 of the path plan are performed between step ST14 of the object information calculation and step ST15 of the drive control information generation. The step ST22 of path planning is performed after the step ST21 of the mapping process.
In step ST21 of the mapping process, the mapping processor 114 performs spatial mapping using the processed image and object information to create a digital map. The digital map created by the mapping processor 114 is a three-dimensional map created by combining static information and dynamic information required to perform automated driving.
In the drive control system 100, since a high-quality original image is obtained using the in-vehicle camera 1, a high-resolution digital map can be created using the mapping processor 114. Note that the mapping processor 114 may create a digital map including more information by acquiring information other than the original image obtained using the onboard camera 1.
For example, the mapping processor 114 may acquire information from a surrounding information detector and a positioning section, for example, included in the automobile M. Further, the mapping processor 114 may acquire various information by communicating with various devices located in the external environment through an off-board communication section that may perform off-board communication.
The surrounding information detector is configured as, for example, an ultrasonic sensor, a radar device, or a LIDAR (light detection and ranging, laser imaging detection and ranging) device. From the surrounding information detector, the mapping processor 114 may also acquire information about areas, such as those behind and beside the automobile M, that is not easily obtained with the onboard camera 1.
The positioning section is capable of receiving Global Navigation Satellite System (GNSS) signals from GNSS satellites (for example, GPS signals from Global Positioning System (GPS) satellites) and performing positioning. The mapping processor 114 may acquire information about the position of the automobile M from the positioning section.
The vehicle exterior communication section may use, for example, global system for mobile communications (GSM) (registered trademark), WiMAX (registered trademark), Long Term Evolution (LTE) (registered trademark), LTE-advanced (LTE-a), wireless LAN (also referred to as Wi-Fi (registered trademark)), bluetooth (registered trademark), or the like.
In step ST22 of the path planning, the path planning section 115 performs path planning to determine the traveling route of the automobile M using the digital map. Path planning includes various processes, such as detecting free space on roads and predicting the movement of objects such as vehicles and pedestrians.
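As a minimal stand-in for the route determination in step ST22, the digital map can be reduced to an occupancy grid and searched with breadth-first search. A real planner would also weight candidate routes and predict the motion of other objects; the grid encoding and function name below are illustrative assumptions.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest drivable route on an occupancy grid (0 = free, 1 = occupied),
    found by breadth-first search; returns a list of (row, col) cells
    from start to goal, or None when no drivable route exists."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct route back to start
            route = []
            while cell is not None:
                route.append(cell)
                cell = prev[cell]
            return route[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                               # no drivable route found
```

Breadth-first search guarantees the fewest-cells route on an unweighted grid, which is why it is a common teaching proxy for the free-space search that path planning performs on the digital map.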
After step ST22 of the path plan, the processor 110 transfers data including the digital map obtained in steps ST21 and ST22 together with the result of the path plan to the information generator 120, in addition to data including the processed images and the object information obtained in steps ST12 to ST 14.
In step ST15 of the drive control information generation, the information generator 120 generates drive control information including the driving details to be executed to cause the automobile M to travel along the route determined in step ST22 of the path planning. The information generator 120 transmits the generated drive control information to the drive controller 130.
In step ST16 of the drive control signal output, the drive controller 130 outputs the drive control signal based on the drive control information. In other words, the drive controller 130 controls the driving of the driving force generation mechanism M11, the brake mechanism M12, the steering mechanism M13, and the like, so that the automobile M can safely travel along the travel route according to the route plan.
[ other examples ]
Embodiments of the present technology have been described above. However, of course, the present technology is not limited to the above-described embodiments, and various modifications may be made thereto without departing from the scope of the present technology.
Note that the present technology can also adopt the following configuration.
(1) An in-vehicle camera, comprising:
an imaging element;
a housing, the housing comprising: an accommodating portion accommodating the imaging element; an outer surface exposed to an external space; an opening that communicates the accommodating portion and the external space with each other; and a functional portion that forms at least a part of the outer surface, the functional portion being a functional portion that absorbs visible light of light incident from the external space and reflects infrared light of the light incident from the external space away; and
an optical system that images light incident on the opening from the external space in the imaging element.
(2) The in-vehicle camera according to (1), wherein
The functional portion has a stacked structure including an infrared light reflecting layer that reflects infrared light away and a visible light absorbing layer that absorbs visible light.
(3) The in-vehicle camera according to (2), wherein
The visible light absorption layer is located further to the outside than the infrared light reflection layer, and infrared light is transmitted through the visible light absorption layer.
(4) The in-vehicle camera according to (2), wherein
The infrared light reflecting layer is located further to the outside than the visible light absorbing layer, and visible light is transmitted through the infrared light reflecting layer.
(5) The in-vehicle camera according to any one of (1) to (4), wherein
The optical system has a fixed focal point.
(6) The in-vehicle camera according to any one of (1) to (5), wherein
The housing includes a plurality of the openings, and
The vehicle-mounted camera further includes: a plurality of imaging elements corresponding to the plurality of openings, and a plurality of optical systems.
(7) The in-vehicle camera according to any one of (1) to (6), wherein
The optical system includes a plastic lens.
(8) A drive control system for controlling drive of a movable body including a windshield, the drive control system comprising:
an imaging element that takes an original image;
a housing, the housing comprising: an accommodating portion accommodating the imaging element; an outer surface exposed to an external space; an opening that communicates the accommodating portion and the external space with each other; and a functional portion that forms at least a part of the outer surface, the functional portion being a functional portion that absorbs visible light in light incident from the external space and reflects infrared light in the light incident from the external space away, the housing being arranged inside the windshield such that the opening faces the windshield;
an optical system that images light incident on the opening from the external space in the imaging element;
a processing unit, the processing unit comprising: an image processor that performs image processing on the original image to generate a processed image; a recognition processor that performs a recognition process on the processed image to recognize an object; and a calculation processor that calculates object information relating to the object using the processed image;
an information generator that generates drive control information relating to control of drive of the movable body based on a result of processing performed by the processing unit; and
a drive controller that controls driving of the movable body based on the drive control information.
(9) The drive control system according to (8), wherein
The processing unit further includes a mapping processor that creates a digital map using the processed image and the object information.
(10) The drive control system according to (9), wherein
The processing unit further includes a path planning section that determines a travel route of the movable body using the digital map.
List of reference numerals
1 vehicle camera
10 casing
11 hollow part
12 extension part
13 side wall part
14 accommodating part
15 shield part
16 opening
20 circuit board
21 imaging element
30 optical system
31 lens
40 functional part
41a, 41b visible light absorption layer
42a, 42b infrared light reflecting layer
100 drive control system
110 processor
111 image processor
112 identification processor
113 calculation processor
114 mapping processor
115 route planning section
120 information generator
130 drive controller
M car
M01 windshield

Claims (10)

1. An in-vehicle camera, comprising:
an imaging element;
a housing, the housing comprising: an accommodating portion accommodating the imaging element; an outer surface exposed to an external space; an opening that communicates the accommodating portion and the external space with each other; and a functional portion that forms at least a part of the outer surface, the functional portion being a functional portion that absorbs visible light of light incident from the external space and reflects infrared light of the light incident from the external space away; and
an optical system that images light incident on the opening from the external space in the imaging element.
2. The in-vehicle camera of claim 1, wherein
The functional portion has a stacked structure including an infrared light reflecting layer that reflects infrared light away and a visible light absorbing layer that absorbs visible light.
3. The in-vehicle camera of claim 2, wherein
The visible light absorption layer is located further to the outside than the infrared light reflection layer, and infrared light is transmitted through the visible light absorption layer.
4. The in-vehicle camera of claim 2, wherein
The infrared light reflecting layer is located further to the outside than the visible light absorbing layer, and visible light is transmitted through the infrared light reflecting layer.
5. The in-vehicle camera of claim 1, wherein
The optical system has a fixed focal point.
6. The in-vehicle camera of claim 1, wherein
The housing includes a plurality of the openings, and
The vehicle-mounted camera further includes: a plurality of imaging elements corresponding to the plurality of openings, and a plurality of optical systems.
7. The in-vehicle camera of claim 1, wherein
The optical system includes a plastic lens.
8. A drive control system for controlling drive of a movable body including a windshield, the drive control system comprising:
an imaging element that takes an original image;
a housing, the housing comprising: an accommodating portion accommodating the imaging element; an outer surface exposed to an external space; an opening that communicates the accommodating portion and the external space with each other; and a functional portion that forms at least a part of the outer surface, the functional portion being a functional portion that absorbs visible light in light incident from the external space and reflects infrared light in the light incident from the external space away, the housing being arranged inside the windshield such that the opening faces the windshield;
an optical system that images light incident on the opening from the external space in the imaging element;
a processing unit, the processing unit comprising: an image processor that performs image processing on the original image to generate a processed image; a recognition processor which performs recognition processing on the processed image to recognize an object; and a calculation processor that calculates object information relating to the object using the processed image;
an information generator that generates drive control information relating to control of drive of the movable body based on a result of processing performed by the processing unit; and
a drive controller that controls driving of the movable body based on the drive control information.
9. The drive control system of claim 8, wherein
The processing unit further includes a mapping processor that creates a digital map using the processed image and the object information.
10. The drive control system of claim 9, wherein
The processing unit further includes a path planning section that determines a travel route of the movable body using the digital map.
CN201980050640.4A 2018-08-06 2019-07-19 Vehicle-mounted camera and drive control system using the same Active CN112514361B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018147409 2018-08-06
JP2018-147409 2018-08-06
PCT/JP2019/028463 WO2020031660A1 (en) 2018-08-06 2019-07-19 Vehicle-mounted camera and drive control system using same

Publications (2)

Publication Number Publication Date
CN112514361A true CN112514361A (en) 2021-03-16
CN112514361B CN112514361B (en) 2023-06-23

Family

ID=69414752

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980050640.4A Active CN112514361B (en) 2018-08-06 2019-07-19 Vehicle-mounted camera and drive control system using the same

Country Status (5)

Country Link
US (1) US11851007B2 (en)
EP (1) EP3836530A4 (en)
JP (1) JP7247201B2 (en)
CN (1) CN112514361B (en)
WO (1) WO2020031660A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115185152A (en) * 2022-06-24 2022-10-14 长春理工大学 Catadioptric panoramic vision detection device with anti-exposure system and anti-exposure method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7504387B2 (en) 2020-03-31 2024-06-24 パナソニックオートモーティブシステムズ株式会社 Imaging and display system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003241260A (en) * 2002-02-15 2003-08-27 Mitsubishi Electric Corp Infrared camera
JP2003300414A (en) * 2002-04-11 2003-10-21 Mitsubishi Electric Corp On-vehicle device
JP2005247014A (en) * 2004-03-01 2005-09-15 Denso Corp Car-mounted camera system
US20090212202A1 (en) * 2008-02-26 2009-08-27 Koichi Takahashi Imaging apparatus for taking images of objects in a plurality of directions and vehicle incorporating the same
DE102014218249A1 (en) * 2014-09-11 2015-12-17 Conti Temic Microelectronic Gmbh DEVICE FOR FIXING AN OPTICAL SENSOR ON THE INSIDE OF A VEHICLE DISK
EP2982941A1 (en) * 2014-08-07 2016-02-10 Conti Temic microelectronic GmbH Sensor device housing
WO2018025481A1 (en) * 2016-08-01 2018-02-08 ソニーセミコンダクタソリューションズ株式会社 On-vehicle camera, on-vehicle camera apparatus, and method for supporting on-vehicle camera
US20180059298A1 (en) * 2016-08-24 2018-03-01 Samsung Electronics Co., Ltd. Optical module and electronic device having the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016014564A (en) 2014-07-01 2016-01-28 株式会社リコー Imaging unit
JP2018042141A (en) 2016-09-08 2018-03-15 株式会社デンソー Imaging apparatus
JP6504145B2 (en) 2016-11-24 2019-04-24 株式会社Jvcケンウッド Vehicle recording device


Also Published As

Publication number Publication date
JP7247201B2 (en) 2023-03-28
JPWO2020031660A1 (en) 2021-08-26
CN112514361B (en) 2023-06-23
WO2020031660A1 (en) 2020-02-13
US20210291750A1 (en) 2021-09-23
US11851007B2 (en) 2023-12-26
EP3836530A4 (en) 2021-11-24
EP3836530A1 (en) 2021-06-16

Similar Documents

Publication Publication Date Title
JP6795030B2 (en) Imaging control device, imaging control method, and imaging device
CN107272168B (en) Camera for vehicle
CN107272300B (en) Vehicle driving assistance device
JP6834964B2 (en) Image processing equipment, image processing methods, and programs
JPWO2017141746A1 (en) Imaging apparatus, imaging control method, and program
CN110574357B (en) Imaging control apparatus, method for controlling imaging control apparatus, and moving body
US11942494B2 (en) Imaging device
US20200057149A1 (en) Optical sensor and electronic device
US11454723B2 (en) Distance measuring device and distance measuring device control method
US20210297589A1 (en) Imaging device and method of controlling imaging device
CN112514361B (en) Vehicle-mounted camera and drive control system using the same
WO2017195459A1 (en) Imaging device and imaging method
WO2019163315A1 (en) Information processing device, imaging device, and imaging system
WO2024024148A1 (en) On-vehicle monitoring device, information processing device, and on-vehicle monitoring system
WO2021161858A1 (en) Rangefinder and rangefinding method
US11987184B2 (en) Vehicle-mounted camera
US20220345603A1 (en) Imaging apparatus
US11523070B2 (en) Solid-state imaging element, imaging device, and method for controlling solid-state imaging element
WO2020009172A1 (en) Camera
JP7059185B2 (en) Image processing equipment, image processing method, and imaging equipment
US20230375800A1 (en) Semiconductor device and optical structure body
WO2024018812A1 (en) Solid-state imaging device
WO2021161857A1 (en) Distance measurement device and distance measurement method
Thibault Novel compact panomorph lens based vision system for monitoring around a vehicle
Thibault 360 degree vision system: opportunities in transportation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant