WO2021229983A1 - Imaging device and program - Google Patents

Imaging device and program

Info

Publication number
WO2021229983A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
signal
detection unit
flicker
vehicle
Prior art date
Application number
PCT/JP2021/015371
Other languages
English (en)
Japanese (ja)
Inventor
大祐 片岡
Original Assignee
Sony Semiconductor Solutions Corporation (ソニーセミコンダクタソリューションズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2021229983A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • H04N23/60 Control of cameras or camera modules
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 Control of the SSIS exposure
    • H04N25/57 Control of the dynamic range
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise

Description

  • This disclosure relates to an imaging device and a program.
  • An image sensor having a plurality of pixels with different sensitivities can generate an HDR (High Dynamic Range) image by combining a plurality of pixel values.
  • However, when pixels affected by the blinking of an LED (Light Emitting Diode) light source, such as a traffic light, are combined with pixels not affected by the blinking, LED flicker can become a problem.
  • If, instead, the exposure time of all pixels is controlled to be sufficiently long, for example about 11 msec for a 50 Hz source, blurring of moving subjects due to the long exposure becomes a problem.
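  • As a rough check of that figure (assuming the LED is driven by full-wave rectified 50 Hz mains and therefore blinks at 100 Hz; the patent does not show this derivation), the exposure must span at least one full blink period, as in the following sketch:

    # Python sketch; the constants are assumptions for illustration.
    MAINS_HZ = 50.0
    blink_period_ms = 1000.0 / (2 * MAINS_HZ)   # 100 Hz blinking -> 10 ms
    min_exposure_ms = blink_period_ms * 1.1     # one full period plus ~10% margin
    print(f"{min_exposure_ms:.1f} ms")          # -> 11.0 ms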
  • Therefore, the present disclosure provides an imaging device that appropriately detects the state of the signal acquired at each pixel.
  • The imaging device includes a first pixel, a second pixel, and a detection unit.
  • The first pixel has a first sensitivity.
  • The second pixel has a second sensitivity that is lower than the first sensitivity.
  • The detection unit detects the pixel state based on the luminance information and saturation information of the first pixel signal output from the first pixel and the luminance information and saturation information of the second pixel signal output from the second pixel.
  • The imaging device may further include a control unit that controls imaging by the first pixel and the second pixel.
  • The control unit may control the exposure time of the second pixel to be longer than the exposure time of the first pixel.
  • The detection unit may detect pixels in which flicker has been captured, based on information in the first pixel signal and the second pixel signal.
  • The detection unit may detect a pixel as one in which flicker has been captured when the luminance of the first pixel signal is lower than the luminance of the second pixel signal.
  • The detection unit may further detect a pixel as one in which flicker has been captured when the saturation of the first pixel signal is lower than the saturation of the second pixel signal.
  • The detection unit may detect an area in which an LED appears as the subject, based on the pixels in which flicker has been captured.
  • The detection unit may detect the area in which the LED is used as the area of a traffic light.
  • The imaging device may further include a signal processing unit that restores the pixel values of pixels in which the detection unit has detected flicker from the pixel values of pixels in which flicker has not been detected.
  • The signal processing unit may restore the pixel value of a pixel in which flicker has been detected based on the ratio between the luminance value of the first pixel signal and the luminance value of the second pixel signal.
  • The signal processing unit may enhance the saturation of a pixel in which flicker has been detected and restore its pixel value.
  • The signal processing unit may execute signal processing on a pixel in which flicker has been detected by using the restored pixel value.
  • The signal processing unit may execute signal processing using the restored pixel value as the pixel value of the first pixel signal.
  • The signal processing unit may execute HDR (High Dynamic Range) synthesis based on the restored pixel values.
  • The detection unit may detect that the pixel value of the first pixel is saturated when the saturation of the first pixel signal is lower than the saturation of the second pixel signal.
  • The control unit may control the exposure time of the first pixel based on the state in which saturation occurs.
  • FIG. 1 is a block diagram schematically showing an imaging device according to an embodiment.
  • The imaging device 1 includes at least pixels 10, a detection unit 12, a state detection unit 14, a signal processing unit 16, an object detection unit 18, and a control unit 20.
  • The imaging device 1 may further include, as appropriate, a storage unit for storing data, an output unit for outputting an image reconstructed from the signals acquired from the pixels, and the like.
  • The pixel 10 is the sensor unit that receives light in the imaging device 1.
  • The pixel 10 includes, for example, a photoelectric conversion element such as a photodiode, and acquires the intensity of the received light as an analog signal.
  • An A/D conversion unit (not shown) that converts the analog signal into a digital signal may be connected to the pixel 10, and the digital signal converted by the A/D conversion unit or the like is output as the pixel signal.
  • FIG. 2 is a diagram showing an arrangement of the pixels 10 according to an embodiment.
  • The pixels 10 may be arranged, for example, in two or more dimensions as shown in the figure, forming a pixel array 100.
  • The arrangement and shape of the pixels are not limited to those shown in FIG. 2; any configuration may be used as long as an image of the subject can be appropriately acquired.
  • The shape of the pixel 10 is not limited to a square; it may be an arbitrary rectangle, a hexagon, an octagon, or the like.
  • The pixels 10 need not all be the same size across the pixel array 100; pixels 10 of different sizes may be mixed in the pixel array 100, and the arrangement need not be a rectangular grid.
  • For example, the pixels may be hexagonal, and the pixel array 100 may be laid out in a honeycomb pattern.
  • The pixel 10 includes a first pixel 10A and a second pixel 10B. The first pixel 10A and the second pixel 10B may have different imaging characteristics; for example, as described below, they may have different sensitivities.
  • The first pixel 10A is a pixel having a first sensitivity, which is higher than the sensitivity of the second pixel 10B.
  • The first pixel 10A has a higher sensitivity than the second pixel 10B, for example, but is exposed for a shorter time when shooting.
  • With color filters or the like of the same color, the first pixel 10A can acquire a higher luminance value than the second pixel 10B. As a result, the first pixel 10A may also obtain a higher saturation value than the second pixel 10B.
  • The first pixel 10A outputs the first pixel signal.
  • This first pixel signal contains at least luminance information.
  • The first pixel signal may also include a saturation signal obtained by mixing colors with the surrounding first pixel signals.
  • Alternatively, the saturation may be output as a signal received through a single-color filter.
  • The second pixel 10B is a pixel having a second sensitivity, which is lower than the first sensitivity.
  • The second pixel 10B has a lower sensitivity than the first pixel 10A, but because its exposure time is longer, it can obtain a good signal even in a region where the first pixel 10A is saturated.
  • The second pixel 10B outputs the second pixel signal.
  • The second pixel signal basically includes the same types of information as the first pixel signal; that is, it may be a signal including the luminance information and the saturation information acquired by the second pixel 10B.
  • The detection unit 12 performs detection on the pixel signals output from the pixels 10.
  • The detection unit 12 acquires, for example, a luminance value and a saturation value from the first pixel signal and the second pixel signal output from each pixel, and performs detection.
  • The detection is, for example, a process of comparing the luminance value and the saturation value of the pixel signal with predetermined threshold values to determine whether these signal values are high or low.
  • The detection unit 12 may include a first detection unit 12A and a second detection unit 12B.
  • The first detection unit 12A performs detection on the first pixel signal output from the first pixel 10A, and the second detection unit 12B performs detection on the second pixel signal output from the second pixel 10B.
  • The determination result of the detection unit 12 is output to the state detection unit 14 and the control unit 20.
  • The state detection unit 14 detects the state of each pixel 10, which includes the first pixel 10A and the second pixel 10B, based on the detection results acquired from the detection unit 12.
  • The state detection unit 14 determines, for example, whether each pixel 10 has acquired information on a subject in which flicker is occurring, based on the outputs of the first detection unit 12A and the second detection unit 12B. The state detection unit 14 also detects, for example, whether the amount of light received by the first pixel 10A is saturated in each pixel 10.
  • The signal processing unit 16 executes appropriate processing on the output from the pixels 10 based on the output of the state detection unit 14.
  • The signal processing unit 16 may generate HDR-combined signal values based on, for example, the first pixel signal and the second pixel signal. This synthesis may be performed pixel by pixel or over the entire image.
  • The signal processing unit 16 may further correct the luminance value or saturation value of the first pixel 10A based on the output of the state detection unit 14.
  • The signal processing unit 16 may perform HDR synthesis using the first pixel signal, which is a high-sensitivity signal, and the second pixel signal, which is a low-sensitivity signal, as in general processing. The signal processing unit 16 can also correct the first pixel signal based on the relationship between the first pixel signal and the second pixel signal before performing the HDR synthesis.
  • The signal processing unit 16 may acquire the first pixel signal and the second pixel signal via the state detection unit 14 and generate an image based on those signals and the detection result of the state detection unit 14. In this way, the signal processing unit 16 may perform the processing that generates an image from the pixel signals. Alternatively, the image itself may be generated by an image generation unit (not shown), and the signal processing unit 16 may, for example, combine luminance or saturation values it has corrected into the generated image.
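  • A minimal sketch of such HDR synthesis follows (Python; the gain ratio, bit depth, and blending rule are illustrative assumptions, as the patent leaves the synthesis method open):

    import numpy as np

    GAIN_RATIO = 16.0    # assumed sensitivity*exposure ratio between the two pixels
    FULL_SCALE = 1023.0  # assumed 10-bit pixel values

    def hdr_combine(p_high, p_low):
        """Combine co-located high- and low-sensitivity pixel values."""
        p_high = np.asarray(p_high, dtype=np.float64)
        p_low = np.asarray(p_low, dtype=np.float64)
        # Where the high-sensitivity pixel is near saturation, use the
        # low-sensitivity pixel scaled up to the high-sensitivity range.
        use_low = p_high >= 0.95 * FULL_SCALE
        return np.where(use_low, p_low * GAIN_RATIO, p_high)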
  • The object detection unit 18 detects objects based on the image signal processed by the signal processing unit 16.
  • The object detection unit 18 may, for example, detect a region of the image in which LED flicker occurs, based on the result of the processing by the signal processing unit 16. More specifically, this region may be detected as a region in which an LED may be in use.
  • The region in which LED flicker occurs may be detected as, for example, the illumination region of a traffic signal. These detections may instead be performed by the state detection unit 14.
  • The control unit 20 controls the pixels 10, that is, the first pixel 10A and the second pixel 10B.
  • The control of the pixel 10 concerns, for example, the imaging process: the pixel 10 receives light and outputs a pixel signal according to the exposure time and frame rate set by the control unit 20.
  • The control unit 20 may execute this control using, for example, the detection result output by the detection unit 12. As another example, it may execute control using the data output by the state detection unit 14, the signal processing unit 16, or the object detection unit 18.
  • For example, the control unit 20 controls imaging so that the exposure time of the second pixel 10B is longer than that of the first pixel 10A.
  • FIG. 3 is a diagram schematically showing the pixels 10 according to the embodiment. As an example, nine pixels 10 are shown; the pixel array 100 is provided with combinations of pixels as shown in FIG. 3 arranged in an array. The areas surrounded by solid lines indicate individual pixels 10, and the dotted lines indicate the boundary between the first pixel 10A and the second pixel 10B.
  • G is a pixel that receives green light,
  • R is a pixel that receives red light, and
  • B is a pixel that receives blue light.
  • The color received may be determined by a color filter, or the color of the light acquired by the photoelectric conversion element may be selected by an organic photoelectric conversion film or the like.
  • The color arrangement is assumed to be a Bayer array as an example, but it is not limited to the Bayer array; any other arrangement, such as RGB-W, may be used as long as information such as luminance and saturation can be appropriately acquired.
  • One pixel 10 includes, for example, a substantially regular octagonal first pixel 10A and a substantially square second pixel 10B. Each pixel 10 may include the first pixel 10A and the second pixel 10B.
  • The first pixel 10A has a wider light-receiving area than the second pixel 10B.
  • The first pixel 10A can therefore function as a pixel with higher sensitivity than the second pixel 10B.
  • Although the sensitivity of the second pixel 10B is lower than that of the first pixel 10A, it does not saturate even under an exposure long enough to saturate the first pixel 10A, and can therefore function as a pixel covering a wider luminance range than the first pixel 10A.
  • HDR synthesis can be performed using the first pixel 10A and the second pixel 10B provided at substantially the same position.
  • FIG. 4 is a diagram showing another example of the pixels 10. As in FIG. 3, the areas shown by solid lines indicate individual pixels 10, and the dotted lines indicate the boundary between the first pixel 10A and the second pixel 10B. The meaning and arrangement of R, G, and B are the same as in FIG. 3.
  • As shown in FIG. 4, the second pixels 10B may be arranged so that their orientation alternates relative to the first pixels 10A.
  • The arrangement is not limited to this; any arrangement may be used as long as the first pixels 10A and the second pixels 10B are appropriately arranged.
  • There are cases in which the first pixel 10A cannot receive light while the second pixel 10B can, for example when an LED lights up outside the exposure time of the first pixel 10A. That is, for a subject in which such flicker occurs, an inversion phenomenon arises in which the luminance and saturation acquired by the second pixel 10B are higher than the luminance and saturation acquired by the first pixel 10A.
  • In this state, the first pixel has low luminance and low saturation, while the second pixel has high luminance and high saturation.
  • When this state is detected, it may be determined that light could not be received at the first pixel 10A but could be received at the second pixel 10B; that is, it may be determined that flicker has occurred.
  • The state detection unit 14 may determine that flicker may have occurred if at least (luminance of the first pixel signal) < (luminance of the second pixel signal). As described above, the state detection unit 14 may additionally confirm that (saturation of the first pixel signal) < (saturation of the second pixel signal) before detecting flicker.
  • In that case, the state detection unit 14 may determine that the pixel 10 has captured a flickering subject.
  • When the state detection unit 14 receives such a result from the detection unit 12, it determines that a flickering LED is the subject at that pixel and detects that flicker has occurred.
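  • As a rough illustration, the inversion test described above can be written as follows (Python; the array names are hypothetical, and the patent does not prescribe an implementation):

    import numpy as np

    def detect_flicker(lum_high, sat_high, lum_low, sat_low):
        """Per-pixel flicker test: the high-sensitivity (first) pixel should
        normally read brighter; an inversion in both luminance and saturation
        suggests the LED lit only during the second pixel's longer exposure."""
        luminance_inverted = np.asarray(lum_high) < np.asarray(lum_low)
        saturation_inverted = np.asarray(sat_high) < np.asarray(sat_low)
        return luminance_inverted & saturation_inverted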
  • On receiving this determination, the signal processing unit 16 may correct the luminance value of the first pixel signal and perform signal processing, for example HDR synthesis, using the corrected luminance and saturation values.
  • The signal processing unit 16 may instead attach a flag to the first pixel signal indicating that the pixel contains flicker, and output the signal.
  • The signal processing unit 16 may also output the signal with enhanced saturation instead of correcting the luminance value of the first pixel signal.
  • FIG. 6 is a diagram showing an example of luminance acquisition by the first pixel 10A and the second pixel 10B. Using this figure, some examples of correction of the first pixel signal are described.
  • In the absence of flicker, the luminance value of the first pixel signal is higher than the luminance value of the second pixel signal, reflecting the difference in sensitivity.
  • When flicker occurs, however, the luminance value of the first pixel signal may be lower than the luminance value of the second pixel signal, depending on the exposure timing.
  • The signal processing unit 16 may correct the luminance value of the first pixel signal using the ratio α between the luminance value of the first pixel signal and the luminance value of the second pixel signal in the absence of flicker. That is, when flicker is detected, the signal processing unit 16 may multiply the luminance value of the second pixel signal by α to obtain a corrected luminance value for the first pixel signal.
  • The correction is not limited to this; for example, the signal value may be corrected based on the signal values of the pixels 10 around the pixel 10 of interest.
  • For example, the signal processing unit 16 may perform the correction using the signal values of pixels 10 in the preceding and following lines, which are captured at different timings.
  • Other correction methods using the surrounding pixel values can also be applied.
  • For example, the signal processing unit 16 may select, from the outputs of the detection unit 12 for the surrounding pixels 10, pixels 10 in which flicker is not detected or is unlikely, and use their signal values to perform the correction.
  • The signal processing unit 16 may also correct the signal value by enhancing the saturation instead of the luminance value of the first pixel signal. For example, as with α above, the ratio β between the saturation values of the first pixel signal and the second pixel signal in the absence of flicker may be used: the saturation of the second pixel signal multiplied by β may be adopted as the saturation of the first pixel signal.
  • In these corrections, the signal processing unit 16 may correct based on the signal values of the pixel 10 of interest, or further based on the signal values of the surrounding pixels 10.
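  • A minimal sketch of the restoration using α and β described above (Python; the calibration values are illustrative assumptions, as the patent defines α and β only as flicker-free ratios):

    import numpy as np

    ALPHA = 4.0  # assumed flicker-free luminance ratio (first/second pixel)
    BETA = 1.0   # assumed flicker-free saturation ratio (first/second pixel)

    def restore_first_pixel(lum_high, sat_high, lum_low, sat_low, flicker_mask):
        """Replace flicker-affected first-pixel values with estimates scaled
        from the co-located second-pixel values."""
        lum_restored = np.where(flicker_mask, ALPHA * np.asarray(lum_low), lum_high)
        sat_restored = np.where(flicker_mask, BETA * np.asarray(sat_low), sat_high)
        return lum_restored, sat_restored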
  • Based on the pixels in which flicker is detected, the imaging device 1 may determine that the area including those pixels is, for example, an area containing the illumination of a traffic signal.
  • When saturation of the first pixel 10A is detected, the control unit 20 may determine that the exposure time of the first pixel 10A is too long and control it to be shorter.
  • The signal processing unit 16 may also treat the detection result of such a pixel as containing noise and correct its signal value using the signal values of the surrounding pixels 10.
  • The exposure time could be set for each pixel individually, but this may be costly. Therefore, for example, when the number of pixels 10 in the pixel array 100 whose first pixel 10A is saturated reaches a predetermined number or more, the control unit 20 may shorten the exposure time of the first pixels 10A. If, after this control, the luminance value of the first pixel signal does not exceed a predetermined value within a predetermined period, the control unit 20 may lengthen the exposure time of the first pixels 10A again. Controlling the exposure in this way makes it possible to receive light appropriately at the first pixel 10A even when moving into a brighter or a darker environment.
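  • The array-wide exposure control described above might look like the following sketch (Python; all thresholds, step sizes, and names are assumptions, as the patent describes only the qualitative behavior):

    SATURATED_PIXEL_LIMIT = 1000   # shorten exposure at or above this count
    DIM_LUMINANCE_THRESHOLD = 0.1  # lengthen exposure if signal stays below this
    DIM_FRAME_LIMIT = 30           # ...for this many consecutive frames

    def update_exposure(exposure_us, saturated_count, mean_lum_high, dim_frames):
        """Return the next first-pixel exposure time and dim-frame counter."""
        if saturated_count >= SATURATED_PIXEL_LIMIT:
            return exposure_us * 0.5, 0        # too bright: halve the exposure
        if mean_lum_high < DIM_LUMINANCE_THRESHOLD:
            dim_frames += 1
            if dim_frames >= DIM_FRAME_LIMIT:  # persistently dark: back off
                return exposure_us * 2.0, 0
        else:
            dim_frames = 0
        return exposure_us, dim_frames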
  • By providing LED detection information, which is useful for ADAS (Advanced Driver-Assistance Systems), the accuracy of object detection processing for automated driving and the like can be improved.
  • As described above, since LED flicker can be detected by comparing the luminance of the first pixel and the second pixel, which have different sensitivities, LED detection is possible with a simple circuit, and an image sensor with an in-pixel LED detection function can be realized. Furthermore, by setting different exposure times for each pixel, the dynamic range of the HDR output can be further expanded.
  • The processing described above may be implemented by analog circuits or digital circuits at appropriate locations. When information processing by software is concretely realized using hardware resources, an appropriate program may be stored in a storage unit and executed by a processor or the like. Part or all of the processing may be executed by a program, or it may be realized by appropriately dividing the processing between programs and dedicated analog or digital circuits.
  • The entire implementation may be realized by a first substrate including the pixel array 100 and a circuit formed on a second substrate stacked on the first substrate.
  • These substrates may form a semiconductor device (image sensor) stacked by a CoC (Chip on Chip), CoW (Chip on Wafer), or WoW (Wafer on Wafer) method. As another example, a semiconductor device in which these functions are formed on a single chip may be used.
  • The technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 7 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • In the example shown in FIG. 7, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various calculations, and a drive circuit that drives the devices to be controlled.
  • Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication.
  • In FIG. 7, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
  • The other control units likewise include a microcomputer, a communication I/F, a storage unit, and the like.
  • The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 7100 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • The drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • A vehicle state detection unit 7110 is connected to the drive system control unit 7100.
  • The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel angle, the engine speed, the wheel rotation speed, and the like.
  • The drive system control unit 7100 performs arithmetic processing using the signals input from the vehicle state detection unit 7110, and controls the internal combustion engine, the drive motor, the electric power steering device, the brake device, and the like.
  • The body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps.
  • In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, may be input to the body system control unit 7200.
  • The body system control unit 7200 accepts the input of these radio waves or signals, and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
  • The battery control unit 7300 controls the secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature regulation of the secondary battery 7310 or a cooling device or the like provided in the battery device.
  • The vehicle exterior information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
  • An image pickup unit 7410 and a vehicle exterior information detection section 7420 are connected to the vehicle exterior information detection unit 7400.
  • The image pickup unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The vehicle exterior information detection section 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • The environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • The ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • The image pickup unit 7410 and the vehicle exterior information detection section 7420 may be provided as independent sensors or devices, or as a device in which a plurality of sensors or devices are integrated.
  • FIG. 8 shows an example of the installation positions of the image pickup unit 7410 and the vehicle exterior information detection section 7420.
  • Image pickup units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the following positions: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • The image pickup unit 7910 provided on the front nose and the image pickup unit 7918 provided on the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 7900.
  • The image pickup units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900.
  • The image pickup unit 7916 provided on the rear bumper or the back door mainly acquires images behind the vehicle 7900.
  • The image pickup unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • Note that FIG. 8 also shows an example of the shooting ranges of the image pickup units 7910, 7912, 7914, and 7916. The imaging range a indicates the imaging range of the image pickup unit 7910 provided on the front nose; the imaging ranges b and c indicate the imaging ranges of the image pickup units 7912 and 7914 provided on the side mirrors, respectively; and the imaging range d indicates the imaging range of the image pickup unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the image pickup units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 can be obtained.
  • The vehicle exterior information detection sections 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners, and upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, ultrasonic sensors or radar devices.
  • The vehicle exterior information detection sections 7920, 7926, and 7930 provided on the front nose, rear bumper, back door, and upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These vehicle exterior information detection sections 7920 to 7930 are mainly used for detecting preceding vehicles, pedestrians, obstacles, and the like.
  • The vehicle exterior information detection unit 7400 causes the image pickup unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. The vehicle exterior information detection unit 7400 also receives detection information from the connected vehicle exterior information detection section 7420.
  • When the vehicle exterior information detection section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves received.
  • The vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like based on the received information.
  • The vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like based on the received information.
  • The vehicle exterior information detection unit 7400 may calculate the distance to objects outside the vehicle based on the received information.
  • The vehicle exterior information detection unit 7400 may also perform image recognition processing or distance detection processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, and the like based on the received image data.
  • The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different image pickup units 7410 to generate a bird's-eye view image or a panoramic image.
  • The vehicle exterior information detection unit 7400 may also perform viewpoint conversion processing using image data captured by different image pickup units 7410.
  • The in-vehicle information detection unit 7500 detects information about the interior of the vehicle.
  • For example, a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500.
  • The driver state detection unit 7510 may include a camera that images the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
  • The biosensor is provided, for example, on the seat surface or the steering wheel, and detects the biological information of a passenger sitting in the seat or the driver gripping the steering wheel.
  • The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
  • The in-vehicle information detection unit 7500 may also perform processing such as noise cancellation on the collected audio signal.
  • The integrated control unit 7600 controls the overall operation of the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • The input unit 7800 is realized by a device that a passenger can operate for input, such as a touch panel, buttons, a microphone, switches, or levers. Data obtained by voice recognition of speech input through the microphone may be input to the integrated control unit 7600.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 7000.
  • The input unit 7800 may also be, for example, a camera, in which case a passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information entered by a passenger or the like using the input unit 7800 and outputs it to the integrated control unit 7600. By operating the input unit 7800, passengers and others input various data to the vehicle control system 7000 and instruct it to perform processing operations.
  • The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect, for example, to a device (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. The general-purpose communication I/F 7620 may also connect to a terminal in the vicinity of the vehicle (for example, a terminal of the driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol formulated for use in vehicles.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which combines IEEE 802.11p for the lower layer and IEEE 1609 for the upper layer, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
  • The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • Note that the positioning unit 7640 may determine the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, PHS, or smartphone.
  • The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from radio stations or the like installed along the road, and acquires information such as the current position, traffic congestion, road closures, or required travel time.
  • Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
  • The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable, if necessary) (not shown).
  • The in-vehicle devices 7760 include, for example, at least one of a mobile device or wearable device possessed by a passenger, and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination.
  • The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generator, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and people based on the information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information about the current position of the vehicle. The microcomputer 7610 may also predict dangers such as a vehicle collision, a pedestrian or the like coming close, or entry into a closed road based on the acquired information, and generate a warning signal.
  • The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • The audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the passengers of the vehicle or to the outside of the vehicle.
  • In the example of FIG. 7, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
  • The display unit 7720 may include, for example, at least one of an on-board display and a head-up display.
  • The display unit 7720 may have an AR (Augmented Reality) display function.
  • The output device may be a device other than these, such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp.
  • When the output device is a display device, it visually displays the results obtained by the various processes performed by the microcomputer 7610, or the information received from other control units, in various formats such as text, images, tables, and graphs.
  • When the output device is an audio output device, it converts an audio signal composed of reproduced voice data, acoustic data, or the like into an analog signal and outputs it audibly.
  • Note that at least two control units connected via the communication network 7010 may be integrated into one control unit.
  • Alternatively, each control unit may be composed of a plurality of control units.
  • The vehicle control system 7000 may also include another control unit (not shown).
  • In the above description, some or all of the functions performed by any control unit may be given to another control unit. That is, as long as information is transmitted and received via the communication network 7010, the predetermined arithmetic processing may be performed by any of the control units.
  • Similarly, a sensor or device connected to one control unit may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.
  • The imaging device 1 described above can be applied to the image pickup unit 7410, or to a part of the vehicle exterior information detection section 7420, in the application example shown in FIG. 7.
  • For example, the image pickup unit 7410 may include the imaging device 1 shown in FIG. 1, its output may be processed by the vehicle exterior information detection section 7420, and signals may be transmitted to and received from the power supply device 22 by the microcomputer 7610 or the like of the integrated control unit 7600.
  • In this way, the imaging device 1 of the present embodiment can be used as an imaging device in a moving body.
  • (1) An imaging device comprising: a first pixel having a first sensitivity; a second pixel having a second sensitivity lower than the first sensitivity; and a detection unit that detects the pixel state based on luminance information and saturation information of a first pixel signal output from the first pixel and luminance information and saturation information of a second pixel signal output from the second pixel.
  • (2) The imaging device according to (1), further comprising a control unit that controls imaging by the first pixel and the second pixel.
  • (3) The imaging device according to (2), wherein the control unit controls the exposure time of the second pixel to be longer than the exposure time of the first pixel.
  • (4) The imaging device according to any one of (1) to (3), wherein the detection unit detects pixels in which flicker has been captured based on information in the first pixel signal and the second pixel signal.
  • (5) The imaging device according to (4), wherein the detection unit detects a pixel as one in which flicker has been captured when the luminance of the first pixel signal is lower than the luminance of the second pixel signal.
  • (6) The imaging device according to (5), wherein the detection unit further detects a pixel as one in which flicker has been captured when the saturation of the first pixel signal is lower than the saturation of the second pixel signal.
  • (7) The imaging device according to any one of (4) to (6), wherein the detection unit detects an area in which an LED appears as the subject based on the pixels in which flicker has been captured.
  • (8) The imaging device according to (7), wherein the detection unit detects the area in which the LED is used as the area of a traffic light.
  • (9) The imaging device according to any one of (4) to (8), further comprising a signal processing unit that restores the pixel values of pixels in which the detection unit has detected flicker from the pixel values of pixels in which flicker has not been detected.
  • (10) The imaging device according to (9), wherein the signal processing unit restores the pixel value of a pixel in which flicker has been detected based on the ratio between the luminance value of the first pixel signal and the luminance value of the second pixel signal.
  • (11) The imaging device according to (9), wherein the signal processing unit enhances the saturation of a pixel in which flicker has been detected and restores its pixel value.
  • (12) The imaging device according to any one of (9) to (11), wherein the signal processing unit executes signal processing on a pixel in which flicker has been detected using the restored pixel value.
  • (13) The imaging device according to (12), wherein the signal processing unit executes signal processing using the restored pixel value as the pixel value of the first pixel signal.
  • (14) The imaging device according to any one of (9) to (13), wherein the signal processing unit executes HDR (High Dynamic Range) synthesis based on the restored pixel values.
  • (15) The imaging device according to any one of (3) to (14), wherein the detection unit detects that the pixel value of the first pixel is saturated when the saturation of the first pixel signal is lower than the saturation of the second pixel signal.
  • (16) The imaging device according to (15), wherein the control unit controls the exposure time of the first pixel based on the state in which the saturation occurs.
  • The aspects of the present disclosure are not limited to the embodiments described above and include various conceivable modifications, and the effects of the present disclosure are not limited to the contents described above.
  • The components of the embodiments may be applied in appropriate combinations. That is, various additions, changes, and partial deletions are possible without departing from the conceptual idea and spirit of the present disclosure derived from the contents defined in the claims and their equivalents.
  • 1: Imaging device, 10: Pixel, 10A: First pixel, 10B: Second pixel, 100: Pixel array, 12: Detection unit, 12A: First detection unit, 12B: Second detection unit, 14: State detection unit, 16: Signal processing unit, 18: Object detection unit, 20: Control unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The present invention appropriately detects the states of signals acquired at pixels. An image capture device according to the present invention comprises first pixels, second pixels, and a detection unit. The first pixels have a first sensitivity. The second pixels have a second sensitivity, which is lower than the first sensitivity. The detection unit detects the states of the pixels on the basis of luminance information and saturation information of first pixel signals output by the first pixels and on the basis of luminance information and saturation information of second pixel signals output by the second pixels.
PCT/JP2021/015371 2020-05-15 2021-04-14 Imaging device and program WO2021229983A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020086148A JP2021180459A (ja) 2020-05-15 2020-05-15 撮像装置及びプログラム
JP2020-086148 2020-05-15

Publications (1)

Publication Number Publication Date
WO2021229983A1 (fr)

Family

ID=78510558

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015371 WO2021229983A1 (fr) 2020-05-15 2021-04-14 Imaging device and program

Country Status (2)

Country Link
JP (1) JP2021180459A (fr)
WO (1) WO2021229983A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012113248A (ja) * 2010-11-26 2012-06-14 Canon Inc Imaging device
JP2013066145A (ja) * 2011-08-31 2013-04-11 Sony Corp Image processing device, image processing method, and program
JP2016213740A (ja) * 2015-05-12 2016-12-15 Canon Inc Imaging device and imaging system
US20170251153A1 * 2016-02-26 2017-08-31 Stmicroelectronics (Grenoble 2) Sas Image sensor adapted to blinking sources

Also Published As

Publication number Publication date
JP2021180459A (ja) 2021-11-18

Similar Documents

Publication Publication Date Title
US10957029B2 (en) Image processing device and image processing method
US10880498B2 (en) Image processing apparatus and image processing method to improve quality of a low-quality image
JP6977722B2 (ja) Imaging device and image processing system
US20190335079A1 (en) Signal processing apparatus, imaging apparatus, and signal processing method
US10704957B2 (en) Imaging device and imaging method
JP7226440B2 (ja) Information processing device, information processing method, imaging device, lighting device, and moving body
WO2018034209A1 (fr) Imaging device and imaging method
JP2018207453A (ja) Control device, imaging device, control method, program, and imaging system
US11585898B2 (en) Signal processing device, signal processing method, and program
JP2018064007A (ja) Solid-state imaging element and electronic device
WO2019116746A1 (fr) Image processing device, image processing method, and image capturing device
JP6981416B2 (ja) Image processing device and image processing method
WO2018008408A1 (fr) Solid-state imaging device, correction method, and electronic device
WO2021229983A1 (fr) Imaging device and program
WO2018034157A1 (fr) Image processing device, image processing method, and image capturing device
WO2022181265A1 (fr) Image processing device, image processing method, and image processing system
JP7173056B2 (ja) Recognition device, recognition method, and program
WO2022190801A1 (fr) Information processing device, information processing system, information processing method, and recording medium
US10791287B2 (en) Imaging control apparatus and method, and vehicle
WO2018135208A1 (fr) Imaging device and imaging system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21803383

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21803383

Country of ref document: EP

Kind code of ref document: A1