WO2024135722A1 - Imaging device and image data generation method - Google Patents

Imaging device and image data generation method

Info

Publication number
WO2024135722A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
imaging
imaging element
element group
gain value
Prior art date
Application number
PCT/JP2023/045661
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
拓弥 片岡
大樹 角谷
Original Assignee
株式会社小糸製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社小糸製作所
Priority to JP2024566102A (JPWO2024135722A1)
Publication of WO2024135722A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10: Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11: Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50: Control of the SSIS exposure
    • H04N25/51: Control of the gain

Definitions

  • This disclosure relates to an imaging device and an image data generation method.
  • Patent Document 1 discloses a vehicle lamp that employs a variable light distribution device known as an ADB (Adaptive Driving Beam), which uses a camera to detect the presence or absence of a shading target, such as an oncoming vehicle, and can dim or turn off the light in the area corresponding to the shading target.
  • The present disclosure therefore aims to provide an imaging device and an image data generation method that are simple in configuration and make objects easy to detect.
  • An imaging device includes: a plurality of imaging elements, each of which can have a gain value set individually; and an image generating unit that generates first image data and second image data based on imaging information output from the imaging elements in one imaging operation.
  • The imaging elements include a first imaging element group set to a first gain value and a second imaging element group set to a second gain value greater than the first gain value.
  • The image generating unit generates the first image data based on first imaging information output from the first imaging element group, and generates the second image data based on second imaging information output from the second imaging element group; the second image data has a higher average luminance than the first image data.
  • An image data generating method generates image data based on imaging information output, in one imaging operation, from a plurality of imaging elements, each of which can have a gain value set individually. First image data is generated based on first imaging information output from a first imaging element group among the imaging elements, the first imaging element group being set to a first gain value. Second image data is generated based on second imaging information output from a second imaging element group among the imaging elements, the second imaging element group being set to a second gain value greater than the first gain value, and the second image data has a higher average luminance than the first image data.
  • The imaging device and image data generation method disclosed herein can generate, with a simple configuration, image data from which objects are easy to detect.
  • FIG. 1 is a block diagram of an imaging device according to an embodiment.
  • FIG. 2 is a side view illustrating an example of an imaging element according to an embodiment.
  • FIG. 3 is a front view illustrating an example of an imaging unit according to the embodiment.
  • FIG. 4 is a flowchart showing a control form of the image data generating method according to the embodiment.
  • FIG. 5 is a schematic diagram showing an image data generating method according to an embodiment.
  • FIG. 6A is a diagram comparing an actual view ahead of a vehicle with image data generated by an image data generating device according to an embodiment.
  • FIG. 6B is a diagram comparing an actual view ahead of the vehicle with image data generated by the image data generating device according to the embodiment.
  • FIG. 6C is a diagram comparing an actual view ahead of the vehicle with image data generated by the image data generating device according to the embodiment.
  • FIG. 6D is a diagram comparing an actual view ahead of the vehicle with image data generated by the image data generating device according to the embodiment.
  • FIG. 7A is a diagram showing an actual view ahead of a vehicle and image data generated by an image data generating device according to an embodiment.
  • FIG. 7B is a diagram showing an actual view ahead of the vehicle and image data generated by the image data generating device according to the embodiment.
  • FIG. 7C is a diagram showing an actual view ahead of the vehicle and image data generated by the image data generating device according to the embodiment.
  • FIG. 7D is a diagram showing an actual view ahead of the vehicle and image data generated by the image data generating device according to the embodiment.
  • Fig. 1 is a block diagram of the imaging device 1 according to this embodiment.
  • the vehicle X has a vehicle control unit X1 and a vehicle lamp 3, and the imaging device 1 is provided in the vehicle lamp 3.
  • the imaging device 1 has an imaging unit 2, a signal processing unit 20, an image generation unit 30, and an image recognition unit 40.
  • the imaging unit 2 has a plurality of imaging elements 10 and converts light from the outside into a signal.
  • the signal processing unit 20 converts the signal received from the imaging unit 2 into imaging information 100 (see Fig. 5) required to generate an image.
  • the image generation unit 30 converts the imaging information 100 received from the signal processing unit 20 into image data.
  • the image recognition unit 40 detects an object from the image data generated by the image generation unit 30. Note that, although the image recognition unit 40 is illustrated as a part of the imaging device 1 in Fig. 1, the image recognition unit 40 does not have to be a part of the imaging device 1. For example, it may be a part of the vehicle control unit X1 mounted on the vehicle X.
  • the image sensor 10 has an on-chip lens 11, a color filter 12, a photodiode 13, and an amplifier 14.
  • the on-chip lens 11 is a lens directly mounted on the photodiode 13 on which the color filter 12 is mounted.
  • An optical image of a subject is incident on the photodiode 13 through the on-chip lens 11.
  • the color filter 12 blocks light other than a specific wavelength range and transmits only light in a specific wavelength range. As a result, only light in the specific wavelength range is incident on the photodiode 13.
  • each of the image sensors 10 has an amplifier 14 that can amplify or attenuate the intensity of a signal when converting photons incident on the photodiode 13 into a signal.
  • one amplifier 14 is provided for each photodiode 13, and each amplifier 14 can set the degree of amplification and attenuation (gain) of the signal of the photodiode 13 individually.
  • the imaging unit 2 is configured with a plurality of imaging elements 10 arranged in a lattice pattern, and one color filter 12 is arranged for each imaging element 10.
  • the imaging element 10 has a red imaging element 10r that receives red light, a blue imaging element 10b that receives blue light, and a green imaging element 10g that receives green light.
  • the green imaging element 10g that receives green light has a first green imaging element 10g1 and a second green imaging element 10g2.
  • the red imaging element 10r is provided with a color filter R that transmits red light
  • the blue imaging element 10b is provided with a color filter B that transmits blue light
  • the first green imaging element 10g1 and the second green imaging element 10g2 are provided with a color filter G that transmits green light.
  • the imaging section 2 in this embodiment has a Bayer array in which one unit formed of four imaging elements 10, namely a red imaging element 10r, a blue imaging element 10b, a first green imaging element 10g1, and a second green imaging element 10g2, is repeatedly arranged.
  • the imaging element 10 is divided into a first imaging element group 10A and a second imaging element group 10B.
  • the first imaging element group 10A is composed of the second green imaging element 10g2.
  • the second imaging element group 10B is composed of imaging elements that are not included in the first imaging element group 10A, and in this embodiment, is composed of a red imaging element 10r, a blue imaging element 10b, and a first green imaging element 10g1.
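As a rough illustration (not taken from the publication), the Bayer unit and the two element groups described above can be sketched in Python with NumPy; the 4x4 array size, the color labels, and the concrete gain values below are assumptions for illustration only - the disclosure only requires that the second group's gain exceed the first group's.

```python
import numpy as np

# Illustrative 4x4 Bayer mosaic; each 2x2 unit is
#   G1 R
#   B  G2
# where G2 (the second green imaging element 10g2) forms the first
# imaging element group 10A, and G1, R, B form the second group 10B.
H, W = 4, 4
colors = np.empty((H, W), dtype="<U2")
colors[0::2, 0::2] = "G1"
colors[0::2, 1::2] = "R"
colors[1::2, 0::2] = "B"
colors[1::2, 1::2] = "G2"

group_A = colors == "G2"   # first imaging element group (lower gain)
group_B = ~group_A         # second imaging element group (higher gain)

# Per-element gain map, possible because each amplifier 14 is set
# individually; 1.0 and 4.0 are invented example values.
first_gain, second_gain = 1.0, 4.0
gain_map = np.where(group_A, first_gain, second_gain)
```

Because every photodiode carries its own amplifier 14, the gain map could in principle hold any per-element value; the grouping above is the simplest split consistent with the described embodiment.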
  • FIG. 4 is a flowchart showing the processing of the signal processing unit 20 and the image generating unit 30 according to this embodiment.
  • When the imaging unit 2 receives light from an object, it outputs imaging information 100 corresponding to the light.
  • the signal processing unit 20 receives the imaging information 100 from the imaging unit 2 (STEP 1). Thereafter, the signal processing unit 20 extracts the first imaging information 100A acquired by the first imaging element group 10A from the imaging information 100 received from the imaging unit 2 (STEP 2) and transmits it to the image generating unit 30.
  • the image generating unit 30 generates first image data 200A based on the first imaging information 100A received from the signal processing unit 20 (STEP 3).
  • the signal processing unit 20 extracts the second imaging information 100B acquired by the second imaging element group 10B from the imaging information 100 received from the imaging element 10 (STEP 4) and transmits it to the image generating unit 30.
  • the image generation unit 30 generates second image data 200B based on the second imaging information 100B (STEP 5).
  • FIG. 5 is a schematic diagram showing a process in which the first image data 200A and the second image data 200B are generated from the imaging information 100.
  • The imaging information 100 transmitted from the imaging unit 2 is shown as data arranged in four rows and four columns in FIG. 5, but in reality the amount of data in the imaging information 100 corresponds to the number of imaging elements 10.
  • Each piece of data is given a letter representing the color of the color filter provided on the imaging element 10 that output the data, and a two-digit number corresponding to its row and column.
  • the imaging information 100 corresponds to the arrangement of the imaging elements 10 shown in FIG. 3, and the grouping of the first imaging element group 10A and the second imaging element group 10B is the same as in FIG. 3.
  • The imaging information 100 based on the signal emitted by the imaging element 10 arranged in the first row and first column in FIG. 3 corresponds to G11 in the imaging information 100 shown in FIG. 5. Likewise, the imaging information 100 based on the signals emitted by the imaging elements 10 in the first row, first column; first row, second column; and second row, first column, which belong to the second imaging element group 10B, corresponds to G11, R12, and B21, respectively, in FIG. 5.
  • When the signal processing unit 20 extracts the first imaging information 100A from the imaging information 100, it extracts G22, G24, G42, and G44, which correspond to the data received from the imaging elements 10 belonging to the first imaging element group 10A.
  • The first imaging information 100A generated in this manner is output as data of two rows and two columns in the example of FIG. 5.
  • the image generating unit 30 generates first image data 200A based on the first imaging information 100A generated by the signal processing unit 20.
  • the image generating unit 30 generates a single pixel from each piece of data in the first imaging information 100A.
  • the first imaging information 100A is composed only of signals received from the second green imaging element 10g2, which is provided with the same green color filter, so the first image data 200A is output as a grayscale image.
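The extraction and grayscale-image steps above can be sketched as follows (an illustrative NumPy version, not the publication's implementation; the 4x4 layout of FIG. 5 is assumed to be stored as a plain array):

```python
import numpy as np

# Sketch of STEPs 2-3: pull the G2 sites (rows/columns 2 and 4 in the
# publication's 1-based notation, i.e. odd 0-based indices) out of the
# 4x4 imaging information, then treat each datum as one grayscale pixel.
raw = np.arange(16, dtype=float).reshape(4, 4)  # stand-in imaging information 100

first_info = raw[1::2, 1::2]     # G22, G24, G42, G44 -> 2x2 block
first_image = first_info.copy()  # each datum becomes a single grayscale pixel
```

Because all G2 sites share the same green filter, no color reconstruction is needed; the 2x2 result is already the grayscale first image data 200A of the example.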
  • To generate the second imaging information 100B, the signal processing unit 20 extracts G11, R12, and B21, which correspond to the data received from the second imaging element group 10B.
  • The G11, R12, and B21 extracted from the imaging information 100 are combined to generate GRB11, one piece of data constituting the second imaging information 100B.
  • Similarly, G13, R14, and B23 are extracted to generate GRB12; G31, R32, and B41 to generate GRB21; and G33, R34, and B43 to generate GRB22.
  • The signal processing unit 20 thus generates second imaging information 100B consisting of GRB11, GRB12, GRB21, and GRB22, arranged in two rows and two columns.
  • the image generating unit 30 uses the above-mentioned method to generate the second image data 200B based on the second imaging information 100B generated by the signal processing unit 20.
  • the data GRB11 can include color information because it is generated based on the data G11 of the first green imaging element 10g1 provided with a green color filter, the data R12 of the red imaging element 10r provided with a red color filter, and the data B21 of the blue imaging element 10b provided with a blue color filter.
  • the second image data 200B generated by such pixels is output as a color image or a grayscale image. In this embodiment, the second image data 200B is output as a color image.
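The assembly of GRB pixels from each 2x2 unit can be sketched as below (an illustrative NumPy version; the channel ordering and array representation are assumptions, since the publication does not specify an implementation):

```python
import numpy as np

# Sketch of STEPs 4-5: for each 2x2 Bayer unit, combine the G1, R and B
# readings into one colour pixel (G11 + R12 + B21 -> GRB11, and so on).
raw = np.arange(16, dtype=float).reshape(4, 4)  # stand-in imaging information 100

g = raw[0::2, 0::2]   # G11, G13, G31, G33
r = raw[0::2, 1::2]   # R12, R14, R32, R34
b = raw[1::2, 0::2]   # B21, B23, B41, B43
second_image = np.stack([r, g, b], axis=-1)  # 2x2 colour image (RGB order assumed)
```

Each output pixel carries one sample per color channel, which is why the second image data 200B can be output as a color image.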
  • The gain set for the imaging elements 10 belonging to the second imaging element group 10B is higher than the gain set for those belonging to the first imaging element group 10A. As a result, the second image data 200B generated from the signals of the second imaging element group 10B has a higher average luminance, and tends to be a brighter image, than the first image data 200A generated from the signals of the first imaging element group 10A.
  • The image recognition unit 40 identifies a light spot in the image data as a head lamp (hereinafter, HL) or a tail lamp (hereinafter, TL).
  • The image recognition unit 40 identifies a group of pixels having a brightness equal to or greater than a predetermined value as an HL when the group is white, and as a TL when the group is red.
  • the method of identifying HL or TL from an image by the image recognition unit 40 is not limited to the above embodiment.
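One possible form of such a classifier is sketched below. The thresholds and the white/red tests are invented example values; as noted above, the publication deliberately leaves the identification method open:

```python
import numpy as np

# Illustrative classifier for a bright pixel cluster, following the rule
# described: bright and white -> head lamp (HL); bright and red -> tail
# lamp (TL). All numeric thresholds are made-up example values.
BRIGHTNESS_TH = 200.0  # "predetermined value" (assumed, 0..255 scale)

def classify_light_spot(rgb: np.ndarray) -> str:
    """rgb: mean (R, G, B) of a pixel cluster, values in 0..255."""
    r, g, b = float(rgb[0]), float(rgb[1]), float(rgb[2])
    if max(r, g, b) < BRIGHTNESS_TH:
        return "none"            # not bright enough to be a lamp
    if min(r, g, b) > 180.0:     # all channels high -> white -> HL
        return "HL"
    if r > 1.5 * max(g, b):      # red channel dominates -> TL
        return "TL"
    return "unknown"
```

In practice the clustering of bright pixels and the color test would both be tuned to the sensor and scene; this sketch only shows the brightness-then-color decision order implied by the text.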
  • the imaging device 1 of this embodiment generates first image data 200A and second image data 200B with different average luminances in a single shooting. How the HL of an oncoming vehicle and the TL of a leading vehicle are detected using such an imaging device 1 will be explained using Figures 6A to 6D and 7A to 7D.
  • the light spot that appears in image data 200 and corresponds to HL is referred to as light spot Z1
  • the light spot that appears in image data 200 and corresponds to TL is referred to as light spot Z2.
  • FIGS. 6A to 6C show a scene where there is an oncoming vehicle ahead of vehicle X.
  • FIG. 6A is a diagram showing the state ahead of vehicle X.
  • FIG. 6B is a diagram showing first image data 200A1 generated in the situation shown in FIG. 6A.
  • FIG. 6C is a diagram showing second image data 200B1 generated in the situation shown in FIG. 6A.
  • the first image data 200A1 is a grayscale image
  • the second image data 200B1 is a color image.
  • The average luminance of the second image data 200B1 is higher than the average luminance of the first image data 200A1. Note that average luminance means the average luminance of all the pixels contained in the image.
  • the imaging device 1 of this embodiment generates the first image data 200A1 in Fig. 6B at the same time as generating the second image data 200B1.
  • This first image data 200A1 is generated based on data obtained from the imaging elements 10 with a low gain setting. Therefore, when generating the first image data 200A1, the light emitted from the HL is not too bright, and the shape of the HL appears clearly in the first image data 200A1, so the image recognition unit 40 can identify the HL from FIG. 6B. As shown in FIGS. 6B and 6C, the average luminance of the second image data 200B1 is higher than that of the first image data 200A1 even though the images were taken at the same moment. The image recognition unit 40 can identify the light spot Z1 from each of the first image data 200A1 and the second image data 200B1.
  • FIGS. 7A to 7C show a scene where there is a preceding vehicle ahead of vehicle X.
  • FIG. 7A is a diagram showing the state ahead of vehicle X.
  • FIG. 7B is a diagram showing first image data 200A2 generated in the situation shown in FIG. 7A.
  • FIG. 7C is a diagram showing second image data 200B2 generated in the situation shown in FIG. 7A.
  • the image recognition unit 40 cannot identify the TL of the vehicle in front from the first image data 200A2 shown in Figure 7B, but can identify the TL from the second image data 200B2 shown in Figure 7C.
  • the imaging device 1 of this embodiment generates the second image data 200B2 of Fig. 7C at the same time as generating the first image data 200A2.
  • This second image data 200B2 is generated based on data obtained from the imaging element 10 with a high gain setting. Therefore, when generating the second image data 200B2, the light emitted from the TL is not too dark, and the shape of the TL appears clearly as a light spot Z2 in the second image data 200B2.
  • the image recognition unit 40 can identify the TL from Fig. 7C.
  • the image recognition unit 40 in the present disclosure recognizes HL and TL in the above-mentioned manner.
  • the image recognition unit 40 transmits the recognition result to a vehicle control unit X1 provided in the vehicle X and a lamp control unit 3b provided in the vehicle lamp 3, and the vehicle control unit X1 and the lamp control unit 3b change the lighting state of the vehicle lamp 3 based on the recognition result of the image recognition unit 40.
  • "changing the lighting state of the vehicle lamp 3" means changing the light distribution pattern to be formed.
  • FIGS. 6D and 7D show how the lighting state of the light source 3a is changed based on the result of the image recognition unit 40 recognizing the HL and TL. In FIGS. 6D and 7D, the light distribution pattern formed by the vehicle lamp 3 is shown as a hatched area.
  • the image recognition unit 40 is able to identify the light spot Z1 (HL) from the first image data 200A1, so the lamp control unit 3b controls the light source 3a to block or dim the portion where the light spot Z1 is located, thereby forming the light distribution pattern Hi1 as shown in the figure.
  • the image recognition unit 40 is able to identify the light spot Z2 (TL) from the second image data 200B2, so the lamp control unit 3b controls the light source 3a to block or dim the area where the light spot Z2 is located, forming the light distribution pattern Hi2 as shown in the figure.
  • the lighting state of the vehicle lamp 3 incorporating the imaging device 1 according to the present disclosure can be changed as described above.
  • In the present disclosure, first image data is generated based on the first imaging information output from the first imaging element group, and second image data with a higher average luminance is generated based on the second imaging information output from the second imaging element group.
  • high average luminance here can also be rephrased as a high average pixel illuminance or a high average pixel gamma value.
  • The inventors therefore considered generating image data that makes objects easy to recognize without using HDR (High Dynamic Range) technology, by generating two types of image data with different brightness from the imaging information produced by a single imaging operation.
  • the imaging element 10 of the imaging device 1 is divided into a first imaging element group 10A and a second imaging element group 10B, and first image data 200A is generated from a signal generated by the first imaging element group 10A, and second image data 200B is generated from a signal generated by the second imaging element group 10B.
  • Since the second imaging element group 10B is set to a higher gain than the first imaging element group 10A, the second image data 200B generated from the second imaging element group 10B can capture dark objects more brightly than the first image data 200A.
  • first image data and second image data with different average luminance can be generated simultaneously with one shooting.
  • With two pieces of image data having different average luminances, the luminance distribution covered can be expanded across the two images. This effectively expands the dynamic range, making it possible to recognize bright objects while also recognizing dark objects.
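A toy numeric sketch of this effective dynamic-range expansion (the gain values and the 8-bit full scale are assumptions, not taken from the publication):

```python
import numpy as np

# The same scene radiance digitized at two gains. Values clip at full
# scale, so the low-gain image keeps bright detail (e.g. an HL) while the
# high-gain image lifts dark detail (e.g. a TL); together the two images
# cover a wider radiance range than either alone.
def capture(radiance: np.ndarray, gain: float, full_scale: float = 255.0) -> np.ndarray:
    return np.clip(radiance * gain, 0.0, full_scale)

radiance = np.array([10.0, 100.0, 1000.0])  # dark, mid, very bright scene points
low_gain = capture(radiance, gain=0.2)      # first image: bright point not clipped
high_gain = capture(radiance, gain=4.0)     # second image: dark point lifted
```

The high-gain capture also has the higher average luminance, matching the relation between the second and first image data described above.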
  • the first image data 200A and the second image data 200B may both be grayscale images. Even if the first image data 200A and the second image data 200B are both grayscale images, the average luminance of the first image data 200A and the second image data 200B is different, so that bright objects can be recognized as bright objects, and it is also possible to recognize dark objects that were previously difficult to recognize using image data generated by a single capture.
  • the image sensor 10 has a color filter 12, and the color filter 12 has a Bayer array composed of three colors, red, green, and blue, but the present disclosure is not limited to this.
  • a white filter may be included, or a filter that enables the image sensor to detect light other than visible light may be included.
  • a configuration in which no color filter is provided at all may also be used.
  • the signal processing unit 20 and the image generating unit 30 create the first image data 200A from the imaging information 100 and then create the second image data 200B, but the present disclosure is not limited to this. In the present disclosure, it is sufficient that the first image data 200A and the second image data 200B are generated from imaging information obtained by a single imaging operation, and the first image data and the second image data may be generated sequentially or simultaneously.

PCT/JP2023/045661 2022-12-21 2023-12-20 Imaging device and image data generation method WO2024135722A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2024566102A JPWO2024135722A1 2022-12-21 2023-12-20

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022204744 2022-12-21
JP2022-204744 2022-12-21

Publications (1)

Publication Number Publication Date
WO2024135722A1 (ja) 2024-06-27

Family

ID=91588735

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/045661 WO2024135722A1 (ja) 2022-12-21 2023-12-20 撮像装置及び画像データ生成方法

Country Status (2)

Country Link
JP (1) JPWO2024135722A1
WO (1) WO2024135722A1

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014165528A (ja) * 2013-02-21 2014-09-08 Clarion Co., Ltd. Imaging device
JP2017046204A (ja) * 2015-08-27 2017-03-02 Clarion Co., Ltd. Imaging device
JP2018117309A (ja) * 2017-01-20 2018-07-26 Sony Semiconductor Solutions Corporation Imaging device, image processing method, and image processing system

Also Published As

Publication number Publication date
JPWO2024135722A1 2024-06-27


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 23907082; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 2024566102; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)