WO2022080147A1 - Vehicle-mounted camera device - Google Patents

Vehicle-mounted camera device

Info

Publication number
WO2022080147A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensitivity region
image
vehicle
arrangement
sensitivity
Prior art date
Application number
PCT/JP2021/036221
Other languages
English (en)
Japanese (ja)
Inventor
健 永崎
ユイビン ツーン
一雄 松浦
秀則 篠原
Original Assignee
日立Astemo株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立Astemo株式会社 filed Critical 日立Astemo株式会社
Priority to JP2022557355A priority Critical patent/JP7492599B2/ja
Priority to DE112021004185.1T priority patent/DE112021004185T5/de
Publication of WO2022080147A1 publication Critical patent/WO2022080147A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/10Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/25Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/41Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • H04N25/58Control of the dynamic range involving two or more exposures
    • H04N25/581Control of the dynamic range involving two or more exposures acquired simultaneously
    • H04N25/585Control of the dynamic range involving two or more exposures acquired simultaneously with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/702SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • The present invention relates to an in-vehicle camera device for eliminating a distance measurement error phenomenon in which the ranging of a stereo camera is disturbed by the high dynamic range of the image sensor.
  • Because a stereo camera device simultaneously measures visual information from an image and distance information to the objects in the image, it can grasp in detail various objects around the automobile (people, cars, three-dimensional objects, white lines / road surfaces, signs, etc.) and contribute to improved safety during driving support. A further characteristic of the stereo camera is its high spatial resolution and accuracy in measuring the distance to a target object: unlike a monocular camera, it can measure the distance to an arbitrary object based on triangulation.
  • As image sensors compatible with a high dynamic range (HDR) have appeared, the problem of applying such sensors to a stereo camera has also become apparent.
  • The range of distance measured by an in-vehicle stereo camera extends to 100 to 200 m at the longest, although this depends on device specifications such as the baseline length. Over such distances, the correspondence (parallax) between the images captured by the left and right cameras must be determined at the sub-pixel level, that is, to less than one pixel.
  • There are various optical disturbance factors when calculating parallax at the sub-pixel level, for example refraction of the optical path due to the windshield and blurring of the projected light due to the focal characteristics of the lens. It has now been found that a high dynamic range also affects distance measurement.
  • High dynamic range imaging is a well-known technique for expanding the imaging sensitivity range of a camera, and various processing methods and devices have been proposed; Patent Document 1 is one example.
  • Patent Document 1 relates to a single image pickup apparatus that generates an image with an expanded dynamic range based on pixel signals from low-sensitivity pixels and high-sensitivity pixels in a pixel array portion, and describes the imaging and compositing of high-dynamic-range images that make the best use of the features of the image sensor.
  • However, it does not describe the problems peculiar to a stereo camera caused by the high dynamic range, that is, the occurrence of ranging errors due to the high dynamic range and countermeasures against them.
  • The present invention has been made in view of the above problems. Its object is to reduce, by the in-vehicle stereo camera device alone and with attention to its characteristics, the influence of the high dynamic range on the distance measurement error, and to provide an in-vehicle camera device having the corresponding image correction and image pickup element arrangement.
  • The vehicle-mounted camera device of the present invention that solves the above problems has a first HDR image pickup element including, as a pair, a first sensitivity region and a second sensitivity region surrounding the first sensitivity region, and a second HDR image pickup element including, as a pair, a third sensitivity region and a fourth sensitivity region surrounding the third sensitivity region. At least one of the weight for the signals from the first and second sensitivity regions and the weight for the signals from the third and fourth sensitivity regions is changed according to the bias of the arrangement of the first and second sensitivity regions or of the third and fourth sensitivity regions; or the pair of signals to be output is changed according to that arrangement; or the arrangement of the first and second sensitivity regions and the arrangement of the third and fourth sensitivity regions are made the same.
  • The drawings include: a processing flow diagram of the inside of the in-vehicle stereo camera device (FIG. 2); a figure showing the problem of the high dynamic range (FIG. 3); an overview of the data processing according to this embodiment (FIG. 4); a processing flow diagram of the data processing inside the in-vehicle stereo camera apparatus according to this embodiment (FIG. 5); an image-composition diagram (FIG. 6); and a layout diagram of the light receiving parts on an image sensor (FIG. 7).
  • FIG. 1 is a block diagram showing a schematic configuration of an in-vehicle system (in-vehicle stereo camera system) including an in-vehicle stereo camera device 100 according to an embodiment of the present invention.
  • FIG. 2 is a processing flow diagram of the processing units driven by the recognition application inside the in-vehicle stereo camera device 100 shown in FIG. 1.
  • The in-vehicle stereo camera device 100 of the present embodiment is mounted on a vehicle and recognizes the environment outside the vehicle based on image information of a target area in front of the vehicle.
  • The in-vehicle stereo camera device 100 recognizes, for example, white lines on the road, pedestrians, vehicles, other three-dimensional objects, signals, signs, and lighting lamps, and adjusts the brake, steering, and the like of the vehicle (own vehicle) on which it is mounted. In addition, after the in-vehicle stereo camera device 100 is mounted on the vehicle, an aiming process is performed at the vehicle factory to calculate the optical axis, the correction amount of the ranging error, and the like.
  • The in-vehicle stereo camera device 100 has two cameras 101 and 102 (left camera 101 and right camera 102) arranged side by side to acquire image information, and a processing device 110 that performs recognition processing of objects in the images based on the image information acquired by the cameras 101 and 102.
  • The processing device 110 is configured as a computer including a processor such as a CPU (Central Processing Unit), memory such as ROM (Read Only Memory) and RAM (Random Access Memory), and an HDD (Hard Disk Drive). Each function of the processing device 110 is realized by the processor executing programs stored in the ROM; the RAM stores data, including intermediate results of operations performed by the programs executed by the processor.
  • the processing device 110 has an image input interface 103 for controlling the imaging of the cameras 101 and 102 and capturing the captured images.
  • An image captured through the image input interface 103 is sent through the internal bus 109 and processed by the image processing unit 104 and the arithmetic processing unit 105; intermediate results of the processing and the image data constituting the final result are stored in the storage unit 106.
  • The image processing unit 104 compares the first image (left image) obtained from the image pickup element of the camera 101 with the second image (right image) obtained from the image pickup element of the camera 102, performs on each image the correction of device-specific deviations caused by the image sensor and image corrections such as noise interpolation, and stores the results in the storage unit 106. It also calculates mutually corresponding points between the first and second images, computes the parallax information, and stores it in the storage unit 106 in the same way.
  • The arithmetic processing unit 105 uses the images and the parallax information (distance information for each point on the image) stored in the storage unit 106 to recognize the various objects necessary for perceiving the environment around the vehicle.
  • The various objects include people, cars, other obstacles, traffic lights, signs, and car tail lamps and headlights. Some of these recognition results and intermediate calculation results are recorded in the storage unit 106 as above. After performing object recognition on the captured images, the arithmetic processing unit 105 calculates the vehicle control policy using these recognition results.
  • The vehicle control policy obtained as a result of the calculation and part of the object recognition results are transmitted to the in-vehicle network CAN 111 through the CAN interface 107, whereby braking and other control of the vehicle is performed. Regarding these operations, the control processing unit 108 monitors whether each processing unit has operated abnormally, whether an error has occurred during data transfer, and so on, providing a mechanism for preventing abnormal operation.
  • Via the internal bus 109, the image processing unit 104 is connected to the control processing unit 108, the storage unit 106, the arithmetic processing unit 105, the image input interface 103 (the input/output unit to the image pickup elements of the cameras 101 and 102), and the CAN interface 107 (the input/output unit to the external vehicle-mounted network CAN 111).
  • the control processing unit 108, the image processing unit 104, the storage unit 106, the arithmetic processing unit 105, and the input / output units 103 and 107 are composed of a single computer unit or a plurality of computer units.
  • The storage unit 106 is composed of, for example, memory that stores the image information obtained by the image processing unit 104 and the image information created as a result of processing by the arithmetic processing unit 105.
  • The CAN interface 107, the input/output unit to the external vehicle-mounted network CAN 111, outputs the information produced by the vehicle-mounted stereo camera device 100 to the control system of the own vehicle via the vehicle-mounted network CAN 111.
  • Each processing unit has now been described; next, the processing flow inside the in-vehicle stereo camera device 100 will be described with reference to FIG. 2.
  • First, images are captured by the left and right cameras 101 and 102 (S201, S202), and image processing such as correction for absorbing the device-specific characteristics of each image sensor is performed on the captured image data 121 and 122 (S203).
  • Next, the left and right images obtained from the left camera 101 and the right camera 102 are compared, and a parallax calculation is performed to find the corresponding points of the left and right images (S204).
  • These results and related data are stored in the storage unit 106 of FIG. 1.
  • Object detection is performed using the result of the parallax calculation and the image captured by the left and right cameras (S205).
  • The detected object, that is, a cluster forming some three-dimensional body, is recognized (S206), and it is determined whether the object is a pedestrian, a vehicle, or the like.
  • The objects targeted by object recognition include people, cars, bicycles, motorcycles, other three-dimensional objects, signs, traffic lights, and self-luminous bodies such as tail lamps. Object recognition is performed using the captured images and the parallax calculation results, using a recognition dictionary as needed.
  • The recognition dictionary is data created in advance by means such as machine learning and is stored in the storage unit 106 of the in-vehicle stereo camera device 100.
  • Vehicle control processing is performed in consideration of the result of object recognition and the state of the own vehicle (speed, steering angle, etc.) (S207).
  • The vehicle control process (S207) consists, for example, of issuing a warning to the occupants and controlling the own vehicle, such as braking and steering angle adjustment.
  • Information regarding vehicle control is output from the vehicle-mounted stereo camera device 100 through the CAN interface 107.
  • Various object recognition processes (S206) and vehicle control processes (S207) are performed by the arithmetic processing unit 105 of FIG. 1, and output to the vehicle-mounted network CAN 111 is performed by the CAN interface 107.
  • Each of these processes and means is composed of, for example, a single computer unit or a plurality of computer units configured so that data can be exchanged among them.
  • FIG. 3 shows the effect on the stereo camera when an image sensor (HDR image sensor) corresponding to a high dynamic range is used.
  • The case considered here is an image pickup device on which elements (sensitivity regions) having different light receiving sensitivities are arranged.
  • Reference numeral 301 denotes an image of such an image sensor, on which two kinds of light receiving portions with different light receiving sensitivities are arranged: light receiving portions A (302) (first sensitivity region and third sensitivity region) and light receiving portions B (303) (second sensitivity region and fourth sensitivity region).
  • The light receiving unit A is a normal-sensitivity light receiving unit suitable for photographing scenes of normal brightness, while the light receiving unit B is a low-sensitivity light receiving unit suitable for imaging bright objects.
  • A light receiving unit A and a light receiving unit B are paired, as in 304; the amount of light received by each is accumulated as electrons, and the electronic data (signals) obtained from the two light receiving elements are combined in some form to obtain the pixel value of one pixel, as sketched below.
  • 305 and 307, shown in the lower part of FIG. 3, are an example in which the image sensors are arranged symmetrically on the left and right sides of the stereo camera.
  • 305 is the image of the image sensor of the left camera 101, and 306 is the position at which light from an object in the outside world is projected onto it.
  • 307 is the image of the image pickup element of the right camera 102; the arrangement of the light receiving portions A and B on this image sensor is line-symmetric with respect to 305. The projection position of the same object on the right camera 102 is 308.
  • The problem that arises with such an image sensor arrangement is a deviation of parallax that depends on the brightness of the object. For an object of normal brightness, the signal intensity captured by the light receiving unit A (302) becomes the luminance value of the image. For a bright object, the amount of light received by the light receiving unit B (303), which lies to the upper right on the left camera 101, is also reflected in the luminance value, whereas on the right camera 102 the mirrored arrangement means the contributing light receiving unit B (303) lies to the upper left, and the luminance received there is reflected instead. The luminance of the same object is thus sampled at physically shifted positions on the two sensors, which biases the measured parallax; the magnitude of this effect is estimated below.
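  • To see why such a sub-pixel shift matters, note the standard stereo relation (general stereo geometry, not specific to this patent): with baseline B, focal length f expressed in pixels, and parallax d in pixels, the distance is Z = B·f / d, so a parallax bias Δd produces a range error of approximately ΔZ ≈ Z²·Δd / (B·f). With illustrative values B = 0.35 m, f = 2000 px, and Z = 100 m, a bias of only Δd = 0.25 px already yields |ΔZ| ≈ (100² × 0.25) / (0.35 × 2000) ≈ 3.6 m.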
  • FIG. 4 shows the whole picture of this data processing.
  • The manufacturing flow consists of a step of assembling an image pickup module that combines an image sensor and a lens (S402), a step of assembling two image pickup modules into a stereo camera (S403), and a step of checking whether the camera has been assembled correctly and the image sensors are functioning (S404). Which image sensors are arranged on the left and right, and in what orientation, is determined by the design and process of the module assembly of S402 and the stereo camera assembly of S403; this is recorded on the stereo camera as arrangement characteristic information, as in S410.
  • The processing of the image composition S409 according to the arrangement characteristic information S410 will be described later in the explanation of FIGS. 6 to 8.
  • The basic idea is to take into consideration the placement positions of the light receiving elements (sensitivity regions) with different sensitivities and the area occupied by the light receiving elements of each sensitivity, and to obtain a new composite image from the images produced by the light receiving elements of each sensitivity by a weighted linear operation, a computationally light calculation (see the sketch below). The coefficients used in the weighted linear operation are determined in consideration of the placement positions and areas of the light receiving elements.
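  • A minimal Python sketch of such a weighted linear composition (illustrative only: the patent does not disclose concrete coefficient values here, and all names below are hypothetical):

        import numpy as np

        def compose_image(img_a, imgs_b, w_a, w_b):
            """Weighted linear composition of one A image and k neighbouring B images.

            img_a:  H x W array of light-receiving-unit-A luminances.
            imgs_b: list of k H x W arrays of the B samples around each A
                    (e.g. the four corner neighbours), already aligned.
            w_a:    scalar weight for A.
            w_b:    length-k weights for the B neighbours, derived from the
                    arrangement characteristic information.
            """
            out = w_a * img_a.astype(np.float64)
            for img_b, w in zip(imgs_b, w_b):
                out += w * img_b.astype(np.float64)
            return out

        # Evenly placed B units (case 701 below): give each corner the same share.
        # composite = compose_image(a, [b_ul, b_ur, b_ll, b_lr], 0.8, [0.05] * 4)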
  • FIG. 5 is a diagram in which the above-mentioned processing is positioned in the data processing inside the in-vehicle stereo camera device 100.
  • Images are obtained from the light receiving unit A group (the plural light receiving units A) and the light receiving unit B group (the plural light receiving units B) in the left camera 101 and the right camera 102 (S501, S502).
  • Using the image sensor arrangement characteristic information (S505), obtained as manufacturing or design information as described above, the image of the light receiving units A and the image of the light receiving units B are combined (S503, S504) to produce the output image of the left camera 101 and the output image of the right camera 102.
  • This image composition may be processed in the camera module, or the left and right images may be combined immediately before the parallax calculation, after the data has been sent to memory. In addition, in S506, image correction is performed to correct lens distortion and the like.
  • The parallax calculation (S507) is performed based on the normalized left and right images obtained from the left camera 101 and the right camera 102. Based on the parallax image obtained as a result of the parallax calculation, object detection (S508) and distance measurement / recognition (S509) are performed, a vehicle control calculation is performed using these results (S510), and the results are output through the vehicle-mounted network CAN 111 or the like.
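  • The patent text does not specify the matching algorithm used in the parallax calculation (S507). As one common approach (a sketch only, not the patent's method; it assumes rectified NumPy images and that the matching block lies inside both images), SAD block matching with parabolic sub-pixel refinement can be written as:

        import numpy as np

        def subpixel_disparity(left, right, row, col, block=5, max_d=64):
            """Disparity at (row, col) of the left image by SAD block matching
            against the right image, refined to sub-pixel accuracy by fitting
            a parabola through the three costs around the best integer match."""
            h = block // 2
            patch = left[row - h:row + h + 1, col - h:col + h + 1].astype(np.float64)
            costs = []
            for d in range(max_d):
                c = col - d                      # the match shifts left as d grows
                if c - h < 0:
                    break
                cand = right[row - h:row + h + 1, c - h:c + h + 1].astype(np.float64)
                costs.append(np.abs(patch - cand).sum())  # sum of absolute differences
            costs = np.asarray(costs)
            d0 = int(np.argmin(costs))
            if 0 < d0 < len(costs) - 1:
                c_l, c_0, c_r = costs[d0 - 1], costs[d0], costs[d0 + 1]
                denom = c_l - 2.0 * c_0 + c_r
                if denom > 0:                    # parabola opens upward
                    return d0 + 0.5 * (c_l - c_r) / denom
            return float(d0)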
  • FIG. 6 is an image of image composition.
  • the composite image 603 is created based on the light receiving unit A image 601 and the light receiving unit B image 602.
  • FIG. 7 illustrates the arrangement of the light receiving portions on the image sensor; FIG. 7 together with Formulas 1 and 2 shows the weights used at the time of image composition.
  • The image composition of this example is a process of determining the luminance value of each point of the image in consideration of the luminance value obtained by the light receiving unit A (302) and the luminance values obtained by the surrounding light receiving units B (303).
  • According to the arrangement of the image sensors of the stereo camera, that is, according to the arrangement of the light receiving units A and B, the pairing of the light receiving units A / B used to compute an output luminance value is changed. For example, the light receiving unit B (303) paired with a light receiving unit A (302) is switched from the upper left to the upper right, so that the physical light-receiving positions point in the same direction on both cameras. Alternatively, when light receiving units B (303) are arranged around the four corners of a light receiving unit A (302), weights are assigned according to that arrangement of the units A and B. The influence on parallax becomes smaller when these weights are determined in consideration of the positions and areas at which the light receiving units A (302) and B (303) are actually physically arranged on the image sensor.
  • 701 is an example in which the light receiving units B (303) are arranged evenly at the four corners around the light receiving unit A (302) (see region 702).
  • In this case, the luminance values of the light receiving units B (303) are mixed evenly, at a certain ratio, with the luminance value of the light receiving unit A (302).
  • 703 is an example in which the physical arrangement of the light receiving units B (303) is uneven with respect to the light receiving unit A (302) (see region 704).
  • In this case, it is conceivable to change the weight applied to each luminance value according to the area of each light receiving unit B (303) and its position, that is, according to the bias of the arrangement of the light receiving units B with respect to the light receiving unit A. Formula 2 below is an example.
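  • Formulas 1 and 2 themselves are not reproduced in this text. A plausible form consistent with the description (an assumption, not the patent's exact notation) is, for the even arrangement 701,

        P_out = w_A · P_A + (w_B / 4) · (P_B,1 + P_B,2 + P_B,3 + P_B,4)     … (cf. Formula 1)

    and, for the uneven arrangement 703, per-neighbour weights w_k reflecting the area and position of each light receiving unit B,

        P_out = w_A · P_A + Σ_k w_k · P_B,k,   with Σ_k w_k = w_B           … (cf. Formula 2)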
  • Various approaches can be considered for designing these weights; for example, the idea of the integral image or a weight assignment based on sampling theory can be used.
  • If the coefficient for the light receiving element chosen as the pair is set to 1 and the other coefficients are set to 0, an effect similar to changing the pairing can be obtained.
  • FIG. 8 shows the definitions used for the image sensor arrangement.
  • Here, attention is paid to the image sensing elements 802 having sensitivity B, and the image P(i, j) obtained from them is considered.
  • The luminance value of sensitivity B for the star region 803 is obtained by weighting; the weights can be derived from the idea of the integral image.
  • The integral image value S(i, j) is obtained from the image P(i, j) of the sensitivity-B elements by Formula 3 below.
  • The luminance value corresponding to the star region 803 can then be calculated by Formula 4 below as the luminance value of the rectangular area 805.
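  • Formulas 3 and 4 are likewise not reproduced in this text; the standard integral-image identities they refer to are (the patent's exact notation may differ):

        S(i, j) = Σ_{i' ≤ i} Σ_{j' ≤ j} P(i', j')                           … (Formula 3)

        sum of P over the rectangle [i1, i2] × [j1, j2]
            = S(i2, j2) − S(i1 − 1, j2) − S(i2, j1 − 1) + S(i1 − 1, j1 − 1)  … (Formula 4)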
  • Using the properties of the integral image, this luminance value can be calculated as a linear sum over the integral image evaluated at the coordinates of the four corners (upper-left, lower-left, upper-right, and lower-right points) of the rectangular area 805.
  • The coordinates of the four corners of the rectangular area 805 are derived from the position and area of the sensitivity-B elements and of the sensitivity-A elements on the image sensor, and in general they are not integer values. The integral image value S(x, y) at a non-integer coordinate (x, y) is therefore obtained by linear interpolation or the like from the surrounding integer-coordinate values.
  • Since S(x, y) at a non-integer coordinate can be expressed as a linear function of the integral image values S(i, j) at the surrounding integer coordinates, and each S(i, j) is in turn a linear sum of the original sensitivity-B image P(i, j), the estimated sensitivity-B luminance value for the star region 803 can be expressed as a linear sum of the original image P(i, j). The coefficients of this linear sum can be used as the weights at the time of image composition.
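  • A Python sketch of this procedure (assumptions: NumPy is available, the rectangle's top/left coordinates act as exclusive boundaries in the continuous four-corner identity, image-edge handling is omitted, and all names are hypothetical):

        import numpy as np

        def integral_image(p):
            """S(i, j): sum of p over all pixels up to and including (i, j)."""
            return p.astype(np.float64).cumsum(axis=0).cumsum(axis=1)

        def sample_s(s, y, x):
            """Integral-image value at non-integer (y, x) by bilinear
            interpolation from the four surrounding integer samples."""
            i0, j0 = int(np.floor(y)), int(np.floor(x))
            dy, dx = y - i0, x - j0
            return ((1 - dy) * (1 - dx) * s[i0, j0] + (1 - dy) * dx * s[i0, j0 + 1]
                    + dy * (1 - dx) * s[i0 + 1, j0] + dy * dx * s[i0 + 1, j0 + 1])

        def region_luminance(s, top, left, bottom, right):
            """Summed luminance of a (possibly non-integer) rectangle via the
            four-corner identity; a linear function of the original image P,
            whose coefficients can serve as the composition weights."""
            return (sample_s(s, bottom, right) - sample_s(s, top, right)
                    - sample_s(s, bottom, left) + sample_s(s, top, left))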
  • As described above, the vehicle-mounted stereo camera device (vehicle-mounted camera device) 100 of the present embodiment has a first HDR image pickup element that includes, as a pair, a first sensitivity region and a second sensitivity region surrounding it, and that, at the time of image composition, combines the signal (luminance value) obtained from the first sensitivity region with the signal (luminance value) obtained from the second sensitivity region and outputs the result as the pixel value of one pixel; and a second HDR image pickup element that likewise includes, as a pair, a third sensitivity region and a fourth sensitivity region surrounding it, and combines the signals (luminance values) obtained from these regions into the pixel value of one pixel of the image. At the time of image composition, at least one of the weight for the signals from the first and second sensitivity regions and the weight for the signals from the third and fourth sensitivity regions is changed according to the bias of the arrangement of the first and second sensitivity regions or of the third and fourth sensitivity regions; or the pair of signals (luminance values) to be output is changed according to the arrangement of the first and second sensitivity regions or of the third and fourth sensitivity regions; or the arrangement of the first and second sensitivity regions and the arrangement of the third and fourth sensitivity regions are made the same.
  • In other words, regarding the first and second sensitivity regions (and likewise the third and fourth), the arrangement of the sensitivity regions of the HDR image sensors is designed so that the paired sensitivity regions having different light receiving sensitivities face in the same direction.
  • Stated differently, the in-vehicle stereo camera device (vehicle-mounted camera device) 100 of the present embodiment is an in-vehicle stereo camera device that processes images captured by a pair of cameras whose image pickup elements support high-dynamic-range exposure.
  • It is characterized by having means for designing the image-composition parameters (pair characteristics, weights, etc.) according to the characteristics of the light receiving parts of the image pickup elements, and for using these parameters to correct the left and right images so that they are suitable for stereo vision before performing stereo vision.
  • Specifically, the arrangement of the light receiving portions with different light receiving sensitivities in the image sensor is matched between the left and right cameras, or the weights and the calculation method are changed in the left and right cameras based on the arrangement characteristic information of the image sensors.
  • That is, in order to reduce the influence of ranging errors caused by the high dynamic range of the image sensor: (1) when manufacturing the stereo camera, either a certain restriction is placed on the orientation of the image sensors, or correction parameters are prepared according to the arrangement characteristics of the light receiving element units of the image sensors; and (2) when the stereo camera attached to the vehicle recognizes the outside world, the input images of the left and right cameras used for stereo vision are processed according to the parameters determined at the time of manufacture of the stereo camera. High dynamic range is supported in this way.
  • The present invention is not limited to the above-described embodiment and includes various modified forms.
  • The above embodiment has been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to one including all of the described configurations.
  • Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware by designing some or all of them as, for example, an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing programs that realize the respective functions. Information such as the programs, tables, and files that realize each function can be stored in memory, a hard disk, a storage device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • The control lines and information lines shown are those considered necessary for the explanation; not all of the control lines and information lines of a product are necessarily shown. In practice, almost all configurations may be considered to be connected to one another.
  • 100 ... In-vehicle stereo camera device (in-vehicle camera device), 101 ... Left camera, 102 ... Right camera, 103 ... Image input interface, 104 ... Image processing unit, 105 ... Arithmetic processing unit, 106 ... Storage unit, 107 ... CAN interface, 108 ... Control processing unit, 109 ... Internal bus, 110 ... Processing device, 111 ... In-vehicle network CAN, 301 ... Image sensor, 302 ... Light receiving unit A (first sensitivity region, third sensitivity region), 303 ... Light receiving unit B (second sensitivity region, fourth sensitivity region), 305 ... Image sensor of left camera, 307 ... Image sensor of right camera


Abstract

Taking into consideration, for example, the characteristics of a vehicle-mounted stereo camera device, the influence of a high dynamic range on the distance measurement error is reduced by the vehicle-mounted stereo camera device alone, and a vehicle-mounted camera device provided with image correction and an arrangement of image pickup elements is disclosed. At least one of a weight applied to a signal from a first sensitivity region and a signal from a second sensitivity region, and a weight applied to a signal from a third sensitivity region and a signal from a fourth sensitivity region, is changed according to a bias in the arrangement. Alternatively, a pair of output signals (luminance values) is changed according to the arrangement, or the arrangement of the first sensitivity region and the second sensitivity region and the arrangement of the third sensitivity region and the fourth sensitivity region are made identical to each other.
PCT/JP2021/036221 2020-10-14 2021-09-30 Vehicle-mounted camera device WO2022080147A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022557355A JP7492599B2 (ja) 2020-10-14 2021-09-30 Vehicle-mounted camera device
DE112021004185.1T DE112021004185T5 (de) 2020-10-14 2021-09-30 Vehicle-mounted camera device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020173470 2020-10-14
JP2020-173470 2020-10-14

Publications (1)

Publication Number Publication Date
WO2022080147A1 true WO2022080147A1 (fr) 2022-04-21

Family

ID=81207958

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/036221 WO2022080147A1 (fr) 2020-10-14 2021-09-30 Vehicle-mounted camera device

Country Status (3)

Country Link
JP (1) JP7492599B2 (fr)
DE (1) DE112021004185T5 (fr)
WO (1) WO2022080147A1 (fr)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017216648A * 2016-06-01 2017-12-07 キヤノン株式会社 Imaging element, imaging apparatus, and imaging signal processing method
WO2019102887A1 * 2017-11-22 2019-05-31 ソニーセミコンダクタソリューションズ株式会社 Solid-state imaging element and electronic device

Also Published As

Publication number Publication date
JPWO2022080147A1 (fr) 2022-04-21
JP7492599B2 (ja) 2024-05-29
DE112021004185T5 (de) 2023-06-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21879888

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022557355

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 21879888

Country of ref document: EP

Kind code of ref document: A1