WO2017217177A1 - Image processing device, imaging device, and image processing system - Google Patents
Image processing device, imaging device, and image processing system
- Publication number
- WO2017217177A1 (PCT/JP2017/018538)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- unit
- image data
- pixel
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- H04N23/662—Transmitting camera control signals through networks, e.g. control via the Internet by using master/slave camera arrangements for affecting the control of camera image capture, e.g. placing the camera in a desirable condition to capture a desired image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/745—Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
Definitions
- the present disclosure relates to an image processing device, an imaging device, and an image processing system suitable for a target tracking system, for example.
- a target tracking system that tracks a moving object of a specific color as a target.
- the image data from the imaging apparatus is binarized directly, by applying a threshold value to the RGB color information of the target to be tracked, and the result is summed to estimate the target position.
- in the target tracking system described above, it is difficult to dynamically control the threshold value in an environment where the appropriate threshold changes with illuminance or ambient light, or for subjects whose RGB ratio changes with the light source, such as human skin color. In such cases, the target cannot be extracted, and tracking may be lost.
- An image processing apparatus according to an embodiment of the present disclosure includes: a multiplier that receives image data from a pixel unit including pixels of a plurality of colors and multiplies the image data by an adjustment parameter for adjusting a color level for each pixel; an adjustment unit that calculates the ratio of each color for each pixel in the image data and adjusts the value of the adjustment parameter based on the ratio of each color; and a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- An imaging apparatus according to an embodiment of the present disclosure includes: a pixel unit including pixels of a plurality of colors; a multiplier that multiplies image data from the pixel unit by an adjustment parameter for adjusting a color level for each pixel; a color adjustment unit that calculates the ratio of each color for each pixel in the image data and adjusts the value of the adjustment parameter based on the ratio of each color; and a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- An image processing system according to an embodiment of the present disclosure includes an imaging device and an actuator that causes the imaging device to track and shoot a target of a specific color. The imaging device includes: a pixel unit including pixels of a plurality of colors; a multiplier that multiplies the image data by an adjustment parameter for adjusting the color level for each pixel; an adjustment unit that calculates the ratio of each color for each pixel in the image data and adjusts the value of the adjustment parameter based on the ratio of each color; and a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- the ratio of each color for each pixel in the image data is calculated, and the value of the adjustment parameter is adjusted based on the ratio of each color.
- a target image of a specific color is extracted based on the image data after being multiplied by the adjustment parameter.
- since the target image of the specific color is extracted based on the image data multiplied by the adjustment parameter, extraction accuracy can be improved. Note that the effects described here are not necessarily limiting, and may be any of the effects described in the present disclosure.
- FIG. 1 is a configuration diagram schematically illustrating a configuration example of an image processing system according to a first embodiment of the present disclosure. FIG. 2 is a block diagram schematically showing a configuration example of the binarization processing unit in the image processing system according to the first embodiment. FIG. 3 is an explanatory diagram showing an overview of target tracking.
- <Comparative Example> As an example of an image processing system of a comparative example, there is a sensing system in which an image sensor and an image processing circuit are integrated into one chip, and various functions can be realized by high-speed frame-rate operation. For example, target tracking systems using a high frame rate, represented by the Self Window method, have been developed.
- Fig. 3 shows an overview of target tracking.
- a target tracking system for example, there is a system that tracks a moving object of a specific color as a target.
- Such a target tracking system includes an imaging device and an actuator that causes the imaging device to track and shoot a target of a specific color.
- In FIG. 3, the imaging apparatus is controlled by the actuator so that a target (target image 20) that moves within the screen 30 always comes to the center of the screen 30.
- a voltage threshold is set directly on the data from the sensor pixels, and binarization into “1” and “0” is performed depending on whether the level of the target image 20 to be extracted is above or below that threshold.
- Japanese Patent Laid-Open No. 07-086936; Japanese Patent Laid-Open No. 01-173269.
- the target tracking system of the comparative example has no path for feeding back the image status of the current frame or past frames to this threshold setting. If a large change occurs in illuminance or ambient light, tracking cannot be maintained, and target extraction failure, that is, a tracking error of the target image 20, may occur.
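The fixed-threshold approach of the comparative example can be sketched as follows. This is an illustrative Python sketch, not code from the publication; the array layout and threshold values are assumptions:

```python
import numpy as np

def binarize_fixed_threshold(frame, thresholds):
    """Comparative example: a fixed threshold is applied directly to the
    sensor data, with no feedback from current or past frames."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    r_th, g_th, b_th = thresholds
    # "1" where every color level clears its fixed threshold, else "0"
    return ((r >= r_th) & (g >= g_th) & (b >= b_th)).astype(np.uint8)

# The weakness: if ambient light dims, the same target drops below the
# fixed threshold and extraction fails.
bright = np.full((2, 2, 3), 200)          # target under original lighting
dim = (bright * 0.4).astype(int)          # same target, 60% darker
mask_bright = binarize_fixed_threshold(bright, (120, 120, 120))
mask_dim = binarize_fixed_threshold(dim, (120, 120, 120))
```

Under the dimmed frame every pixel falls below the fixed threshold and the target is lost — the failure mode that the per-frame feedback of the embodiments addresses.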
- FIG. 1 schematically illustrates a configuration example of an image processing system according to the first embodiment of the present disclosure.
- FIG. 2 schematically shows a configuration example of the binarization processing unit 5.
- the image processing system includes a pixel unit 1, an AD conversion unit 2, a memory unit 3, a color-specific multiplier 4, a binarization processing unit 5, an adjustment unit 40, a drive information generation unit 41, and an actuator 10.
- the adjustment unit 40 includes a color integrator 6 and a color adjustment ratio calculator 7. As shown in FIG. 2, the binarization processing unit 5 includes a specific color extraction unit 51 and a binarization output unit 52.
- the drive information generation unit 41 generates information for driving the actuator 10 and includes an image moment extraction unit 8 and a centroid calculation unit 9.
- the image processing system includes an imaging device and the actuator 10 that causes the imaging device to track and shoot a target of a specific color, and may perform target tracking as shown in FIG. 3.
- the imaging apparatus may include the pixel unit 1, the AD conversion unit 2, the memory unit 3, the color-specific multiplier 4, the binarization processing unit 5, the adjustment unit 40, and the drive information generation unit 41. Alternatively, the pixel unit 1 may be one component of the imaging apparatus, and the other circuit parts (the AD conversion unit 2, the memory unit 3, the color-specific multiplier 4, the binarization processing unit 5, the adjustment unit 40, and the drive information generation unit 41) may be configured as an image processing device.
- At least the image processing apparatus may be configured as one chip.
- a portion other than the actuator 10 may be configured as a single chip.
- the pixel unit 1 may be configured by a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
- the pixel unit 1 includes, for example, a light receiving unit in which a plurality of photoelectric conversion elements are two-dimensionally arranged at a predetermined interval, and a color filter array arranged on the light incident surface of the light receiving unit.
- the pixel unit 1 has, for example, a structure in which three color pixels of R (red), G (green), and B (blue) are arranged in a Bayer array.
- the pixel unit 1 includes one R color pixel, one B color pixel, and two G color pixels.
- the pixel unit 1 may have an array different from the Bayer array.
- the AD conversion unit 2, the memory unit 3, the color-by-color multiplier 4, and the binarization processing unit 5 may be provided in column parallel.
- the image data from the pixel unit 1, in which one pixel includes a plurality of color pixels, is input to the color-specific multiplier 4 via the AD conversion unit 2 and the memory unit 3.
- the color multiplier 4 has a function of multiplying image data by an adjustment parameter for adjusting a color level for each pixel.
- the adjustment unit 40 has a function of calculating the ratio of each color for each pixel in the image data and adjusting the value of the adjustment parameter based on the ratio of each color.
- the binarization processing unit 5 has a function of extracting a target image 20 of a specific color based on the image data after the adjustment parameter is multiplied by the color-specific multiplier 4.
- the specific color extraction unit 51 has a function of comparing the color components for each pixel in the image data after being multiplied by the adjustment parameter and extracting pixel data of a specific color.
- the binarization output unit 52 has a function of performing a binarization process on the image data multiplied by the adjustment parameter, setting 1 for the pixel data of the specific color extracted by the specific color extraction unit 51 and 0 for pixel data other than the specific color, and of outputting the binarized image data to the drive information generation unit 41.
- the color-specific integrator 6 has a function of integrating the image data for each pixel for a predetermined period (for example, one frame period).
- the color integrator 6 includes an RGB color integrator that integrates the RGB colors and a luminance calculator that calculates the luminance Y.
- the color adjustment ratio calculator 7 has a function of calculating a ratio of each color for each pixel in the image data after being integrated by the color integrator 6.
- the column-unit image data photoelectrically converted by the pixel unit 1 is output to the AD conversion unit 2.
- the AD conversion unit 2 performs AD conversion of each pixel in column parallel.
- the memory unit 3 latches the image data AD-converted by the AD conversion unit 2 in a column parallel manner.
- the color-specific multiplier 4 reads the image data latched in the memory unit 3, multiplies the image data by the adjustment parameter corresponding to each color for each pixel, and outputs the multiplied image data to the specific color extraction unit 51 of the binarization processing unit 5.
- the adjustment parameter to be multiplied by the color multiplier 4 is calculated by the color adjustment ratio calculator 7 and fed back to the color multiplier 4.
- the specific color extraction unit 51 sets specific color extraction parameters for a specific color, and uses those parameters to extract pixel data of the specific color from the image data multiplied by the adjustment parameter.
- the specific color extraction unit 51 sets a threshold value for each color level and extracts pixel data of a specific color.
- the binarization output unit 52 performs binarization processing in which the pixel data of the specific color extracted by the specific color extraction unit 51 is 1 and the pixel data other than the specific color is 0.
- the binarized image data is output to the drive information generation unit 41.
- FIG. 5 shows a calculation example of specific color extraction.
- FIG. 5 shows the color space corresponding to the HSV model.
- the HSV model is a color space model composed of three components of hue (Hue), saturation (Saturation / Chroma), and lightness (Value / Lightness / Brightness).
- FIG. 5 shows an example (1) of extracting and detecting pixel data of a specific color using R as the reference color (skin color detection), an example (2) using G as the reference color, and an example (3) using B as the reference color.
- α, β, and γ are coefficients that determine the threshold values used by the binarization output unit 52 to decide whether pixel data belongs to the specific color or not.
- the binarization output unit 52 receives a ratio threshold value for colors other than the basic color and an absolute threshold value for the basic color.
- the basic color is a color corresponding to the specific color.
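As a concrete illustration of extraction with R as the reference color, the following sketch applies one plausible form of these thresholds. The exact inequalities and coefficient values of FIG. 5 are not reproduced in this text, so the rule below (an absolute threshold on R plus ratio thresholds α and β on G and B) is an assumption:

```python
import numpy as np

def extract_specific_color(frame, abs_th=100, alpha=0.8, beta=0.6):
    """Hypothetical R-reference rule: a pixel counts as the specific color
    when R clears an absolute threshold and G and B stay below
    coefficient-scaled fractions of R (abs_th, alpha, beta are assumed)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    mask = (r > abs_th) & (g < alpha * r) & (b < beta * r)
    return mask.astype(np.uint8)  # 1 = specific color, 0 = otherwise
```

Because the rule is expressed as ratios against the reference color, it pairs naturally with the white-balance feedback: once the color levels are normalized per frame, the same coefficients keep working as the light source changes.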
- the drive information generation unit 41 calculates the image moment, the centroid, and the like from the binarized image data, and generates information for driving the actuator 10.
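The moment and centroid computation can be sketched as follows. The publication does not give formulas here, so the standard zeroth and first image moments are assumed:

```python
import numpy as np

def centroid_from_binary(mask):
    """Zeroth and first image moments of the binarized image give the
    target position used to generate actuator drive information."""
    m00 = int(mask.sum())              # zeroth moment: target area
    if m00 == 0:
        return None                    # no target extracted this frame
    ys, xs = np.nonzero(mask)
    m10, m01 = xs.sum(), ys.sum()      # first moments about x and y
    return (m10 / m00, m01 / m00)      # (x, y) center of gravity
```

The actuator would then be driven so that this centroid moves toward the center of the screen 30.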
- the integrator for each color 6 integrates the image data latched in the memory unit 3 for each pixel for a predetermined period (for example, one frame period) for each color.
- the color integrator 6 integrates the image data of the entire screen for each of R, G, and B colors.
- the color adjustment ratio calculator 7 calculates the R, G, B ratio from the R, G, B integration results, and adjusts (feeds back) the value of the adjustment parameter multiplied by the color-specific multiplier 4 based on the ratio of each color.
- white balance is calculated so that R:G:B becomes a ratio of 1:1:1, and the result is reflected in the value of the adjustment parameter of the color-specific multiplier 4.
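The per-frame white balance feedback can be sketched as a gray-world style computation. The patent does not fix a particular gain formula, so anchoring the gains on the G channel is an assumption:

```python
import numpy as np

def awb_gains(frame):
    """Integrate each color over the frame (the role of the color
    integrator 6) and derive per-color gains that pull the R:G:B
    integration results toward 1:1:1, anchored on G."""
    sums = frame.reshape(-1, 3).sum(axis=0).astype(float)  # R, G, B integrals
    return sums[1] / sums   # gain_c = sum_G / sum_c, so the G gain is 1.0
```

These gains would be fed back as the adjustment parameters of the color-specific multiplier 4 for the next frame, so that multiplying the image data by them equalizes the color integrals.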
- the color integrator 6 calculates the luminance Y from the R, G, and B colors for each pixel based on the image data latched in the memory unit 3.
- the color adjustment ratio calculator 7 calculates (adjusts) the auto exposure (AE, automatic exposure) level so that the integration result of the luminance Y becomes an appropriate level, and the result may be reflected in the accumulation time (shutter) of each pixel in the pixel unit 1 or in the color-specific multiplier 4.
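A per-frame AE update of this kind can be sketched as a simple proportional loop. The target luminance level, loop gain, and shutter limits below are assumed values, not figures from the publication:

```python
def ae_adjust(shutter_s, mean_luma, target_luma=118.0, loop_gain=0.5):
    """Nudge the accumulation (shutter) time each frame so the integrated
    luminance Y approaches an appropriate level."""
    error = target_luma - mean_luma
    new_shutter = shutter_s * (1.0 + loop_gain * error / target_luma)
    # clamp to an assumed valid shutter range for the sensor
    return min(max(new_shutter, 1e-5), 3.3e-2)
```

Applied every frame at a high frame rate, the shutter converges toward the target luminance fast enough to follow per-frame changes in scene brightness.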
- the above processing may be performed every frame.
- the target image 20 of a specific color is extracted based on the image data after being multiplied by the adjustment parameter, so that the target extraction accuracy can be improved.
- the accuracy of extracting the target image 20 is improved, and as a result, the object tracking accuracy is improved.
- the image processing system according to the present embodiment suppresses subject tracking loss due to changes in ambient environment light.
- in a high-speed frame-rate camera, frame-to-frame changes in the brightness and color of the entire screen can be suppressed, and improved image quality can therefore be expected.
- FIG. 6 schematically illustrates a configuration example of an image processing system according to the second embodiment of the present disclosure.
- an image processing system that increases the accuracy of target extraction and enables countermeasures against flicker.
- the image processing system according to the present embodiment further includes an image I / F (interface) 11 and an external device 12 with respect to the configuration of the image processing system according to the first embodiment.
- the external device 12 may be another image processing device or an image output device such as a display.
- the output from the color-specific multiplier 4 is input to the color integrator 6 and the image I/F 11.
- FIG. 7 shows a flicker occurrence image.
- FIG. 7 shows the cycle of the power supply, the blinking cycle of the fluorescent lamp, and the vertical synchronization signal VD. The same applies to FIGS. 8 and 9 below.
- FIG. 8 shows an image of color rolling. Color rolling occurs because the ratio of the color components varies from frame to frame.
- the operation of feeding back the output from the color adjustment ratio calculator 7 to the color multiplier 4 can be performed every frame. Therefore, an auto white balance (AWB) operation and an AE operation at a high frame rate are possible.
- FIG. 9 shows an example of the sampling period at the time of shooting at a high frame rate.
- the influence of flicker shown in FIG. 7 can be almost eliminated by performing the AE operation for each frame at a high frame rate in accordance with the sampled brightness. Thereby, the occurrence of flicker in the image data output to the external device 12 can be suppressed.
- the influence of color rolling can be minimized by performing the AWB operation for each frame at a high frame rate. Thereby, occurrence of color rolling in the image data output to the external device 12 can be suppressed.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, robot, construction machine, or agricultural machine (tractor).
- FIG. 10 is a block diagram illustrating a schematic configuration example of a vehicle control system 7000 that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
- the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, a vehicle interior information detection unit 7500, and an integrated control unit 7600.
- the communication network 7010 connecting the plurality of control units may be an in-vehicle communication network compliant with an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
- Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various calculations, and a drive circuit that drives the various devices to be controlled.
- Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating by wired or wireless communication with devices or sensors inside and outside the vehicle.
- In FIG. 10, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
- other control units include a microcomputer, a communication I / F, a storage unit, and the like.
- the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
- the drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, and a braking device that generates the braking force of the vehicle.
- the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
- a vehicle state detection unit 7110 is connected to the drive system control unit 7100.
- the vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the rotational movement of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like.
- the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, or the like.
- the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the body system control unit 7200.
- the body system control unit 7200 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
- the battery control unit 7300 controls the secondary battery 7310 that is a power supply source of the drive motor according to various programs. For example, information such as battery temperature, battery output voltage, or remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature adjustment of the secondary battery 7310 or the cooling device provided in the battery device.
- the outside information detection unit 7400 detects information outside the vehicle on which the vehicle control system 7000 is mounted.
- the outside information detection unit 7400 is connected to at least one of the imaging unit 7410 and the outside information detection unit 7420.
- the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
- the vehicle exterior information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting the current weather or meteorological conditions and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
- the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects sunlight intensity, and a snow sensor that detects snowfall.
- the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
- the imaging unit 7410 and the outside information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
- FIG. 11 shows an example of installation positions of the imaging unit 7410 and the vehicle outside information detection unit 7420.
- the imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the front nose, the side mirror, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
- An imaging unit 7910 provided in the front nose and an imaging unit 7918 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 7900.
- Imaging units 7912 and 7914 provided in the side mirror mainly acquire an image of the side of the vehicle 7900.
- An imaging unit 7916 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 7900.
- the imaging unit 7918 provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 11 shows an example of shooting ranges of the respective imaging units 7910, 7912, 7914, and 7916.
- the imaging range a indicates the imaging range of the imaging unit 7910 provided in the front nose
- the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided in the side mirrors, respectively
- the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, an overhead image of the vehicle 7900 viewed from above is obtained.
- the vehicle outside information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners of the vehicle 7900 and the upper part of the windshield in the vehicle interior may be, for example, an ultrasonic sensor or a radar device.
- the vehicle outside information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
- These outside information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, and the like.
- the vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the vehicle exterior information detection unit 7420 connected thereto.
- When the vehicle exterior information detection section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves that are received.
- Based on the received information, the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for a person, a car, an obstacle, a sign, a character on a road surface, or the like.
- the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, or the like based on the received information.
- the vehicle outside information detection unit 7400 may calculate a distance to an object outside the vehicle based on the received information.
- the outside information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
- The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different imaging units 7410 to generate an overhead image or a panoramic image.
- the vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
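Viewpoint conversion of this kind ultimately reduces to remapping pixel coordinates between camera views. As a hedged illustration only (the disclosure does not specify the math; the single 3x3 homography per camera, the function name, and the coordinates below are assumptions), a minimal sketch:

```python
import numpy as np

def warp_points(h_matrix, points):
    """Apply a 3x3 homography to an array of (x, y) pixel coordinates."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    mapped = homogeneous @ h_matrix.T
    return mapped[:, :2] / mapped[:, 2:3]  # back to Cartesian coordinates

# An identity homography leaves coordinates unchanged; a real system would
# use per-camera matrices obtained from calibration.
corners = np.array([[0.0, 0.0], [639.0, 479.0]])
print(warp_points(np.eye(3), corners))
```

A full overhead-image pipeline would warp every camera's pixels onto a common ground plane with such matrices and then blend the overlapping regions.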
- the vehicle interior information detection unit 7500 detects vehicle interior information.
- a driver state detection unit 7510 that detects the driver's state is connected to the in-vehicle information detection unit 7500.
- Driver state detection unit 7510 may include a camera that captures an image of the driver, a biosensor that detects biometric information of the driver, a microphone that collects sound in the passenger compartment, and the like.
- the biometric sensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of an occupant sitting on the seat or a driver holding the steering wheel.
- The vehicle interior information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
- the vehicle interior information detection unit 7500 may perform a process such as a noise canceling process on the collected audio signal.
- the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
- An input unit 7800 is connected to the integrated control unit 7600.
- the input unit 7800 is realized by a device that can be input by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
- Data obtained by performing voice recognition on sound input through a microphone may be input to the integrated control unit 7600.
- The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 7000.
- the input unit 7800 may be, for example, a camera.
- the passenger can input information using a gesture.
- data obtained by detecting the movement of the wearable device worn by the passenger may be input.
- the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600.
- A passenger or the like operates the input unit 7800 to input various data to, or instruct a processing operation of, the vehicle control system 7000.
- the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like.
- The storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- General-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
- The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (Global System for Mobile communications) (registered trademark), WiMAX, LTE (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
- The general-purpose communication I/F 7620 may connect, for example via a base station or an access point, to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network).
- The general-purpose communication I/F 7620 may also connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
- the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol formulated for use in vehicles.
- The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of IEEE 802.11p in the lower layer and IEEE 1609 in the upper layer, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
- The dedicated communication I/F 7630 typically performs V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, vehicle-to-infrastructure (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
- The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
- Note that the positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
- The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station installed on a road, and acquires information such as the current position, traffic jams, closed roads, or the required travel time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
- the in-vehicle device I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
- the in-vehicle device I / F 7660 may establish a wireless connection using a wireless communication protocol such as a wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
- The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface) (registered trademark), or MHL (Mobile High-definition Link) via a connection terminal (and a cable if necessary).
- the in-vehicle device 7760 may include, for example, at least one of a mobile device or a wearable device that a passenger has, or an information device that is carried into or attached to the vehicle.
- In-vehicle device 7760 may include a navigation device that searches for a route to an arbitrary destination.
- In-vehicle device I / F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
- the in-vehicle network I / F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
- the in-vehicle network I / F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
- The microcomputer 7610 of the integrated control unit 7600 acquires information via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and controls the vehicle control system 7000 according to various programs based on the acquired information. For example, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100.
- The microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, vehicle speed maintenance, vehicle collision warning, vehicle lane departure warning, and the like. Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, or the like based on the acquired information on the surroundings of the vehicle.
- The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures or people based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information of the current position of the vehicle.
- The microcomputer 7610 may predict a danger such as a vehicle collision, the approach of a pedestrian or the like, or entry onto a closed road based on the acquired information, and generate a warning signal.
- the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
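As an illustration of how such a danger prediction might reduce to a simple rule (the disclosure does not specify the criterion; the time-to-collision test, the function name, and the 2-second threshold below are assumptions), a minimal sketch:

```python
def collision_warning(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Return True when time-to-collision falls below the threshold.

    distance_m: distance to the object ahead (e.g., from radar or LIDAR)
    closing_speed_mps: rate at which the gap is shrinking (> 0 means approaching)
    """
    if closing_speed_mps <= 0:  # not approaching: no warning
        return False
    ttc = distance_m / closing_speed_mps
    return ttc < ttc_threshold_s

print(collision_warning(30.0, 20.0))  # TTC = 1.5 s, below threshold -> warning
print(collision_warning(30.0, 5.0))   # TTC = 6.0 s -> no warning
```

The returned flag would then drive the warning sound or warning lamp described above.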
- The audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying an occupant of the vehicle or the outside of the vehicle of information.
- an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
- Display unit 7720 may include at least one of an on-board display and a head-up display, for example.
- the display portion 7720 may have an AR (Augmented Reality) display function.
- The output device may be another device such as headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, or a lamp.
- When the output device is a display device, the display device visually displays results obtained by the various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
- At least two control units connected via the communication network 7010 may be integrated as one control unit.
- each control unit may be configured by a plurality of control units.
- the vehicle control system 7000 may include another control unit not shown.
- Some or all of the functions performed by any of the control units may be provided to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units.
- A sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
- The image processing device, the imaging device, and the image processing system of the present disclosure can be applied to, for example, the vehicle exterior information detection unit 7400 and the imaging unit 7410, or the vehicle interior information detection unit 7500 and the driver state detection unit 7510, among the configurations described above.
- In the above embodiments, a target tracking system is shown as an example of an image processing system.
- the present technology is widely applicable to other sensing systems that use binarized image data.
- Note that the present technology may also be configured as follows.
- (1) An image processing apparatus including: a multiplier that receives image data from a pixel unit including pixels of a plurality of colors, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel; an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- (2) The image processing apparatus according to (1), in which the binarization processing unit includes a specific color extraction unit that compares the color components of each pixel in the image data multiplied by the adjustment parameter, and extracts pixel data of the specific color.
- (3) The image processing apparatus according to (2), in which the binarization processing unit further includes a binarization output unit that outputs binarized image data obtained by performing, on the image data multiplied by the adjustment parameter, binarization processing in which the pixel data of the specific color extracted by the specific color extraction unit is set to 1 and pixel data other than the specific color is set to 0.
- (4) The image processing apparatus according to (3), further including a drive information generation unit that generates, based on the binarized image data, information for driving an actuator that tracks the target of the specific color.
- (5) The image processing apparatus according to any one of (1) to (4), in which the adjustment unit includes an integrator that integrates the image data for each pixel over a predetermined period, and a color adjustment ratio calculator that calculates the ratio of each color for each pixel in the integrated image data.
- (6) An imaging device including: a pixel unit including pixels of a plurality of colors; a multiplier that receives image data from the pixel unit, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel; an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- (7) An image processing system including: an imaging device; and an actuator that causes the imaging device to track and photograph a target of a specific color, the imaging device including a pixel unit including pixels of a plurality of colors, a multiplier that receives image data from the pixel unit and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel, an adjustment unit that calculates the ratio of each color for each pixel in the image data and adjusts the value of the adjustment parameter based on the ratio of each color, and a binarization processing unit that extracts the target image of the specific color based on the image data multiplied by the adjustment parameter.
Abstract
Description
Note that the effects described here are not necessarily limiting, and any of the effects described in the present disclosure may be obtained.
0. Comparative example
1. First embodiment (an image processing system capable of improving the accuracy of target extraction) (FIGS. 1 to 5)
1.1 Configuration of the image processing system according to the first embodiment (FIGS. 1 and 2)
1.2 Operation of the image processing system according to the first embodiment (FIGS. 3 to 5)
1.3 Effects
2. Second embodiment (an image processing system that improves the accuracy of target extraction and also enables flicker countermeasures) (FIGS. 6 to 9)
3. Application examples (FIGS. 10 and 11)
4. Other embodiments
As an example of a comparative image processing system, there is a sensing system in which an image sensor and an image processing circuit are integrated into a single chip, making it possible to realize various functions with high-frame-rate operation. For example, target tracking systems utilizing a high frame rate, typified by the Self Window method, have been developed.
(1) When extracting the target of a fast-moving object at a high frame rate with a sensor structure having parallel arithmetic units (either column-parallel or plane-parallel), binarization is performed robustly against environmental changes.
(2) Against periodic changes in light-source brightness caused by surface flicker or color rolling occurring in high-frame-rate imaging, the change is quickly fed back to suppress the brightness and color changes caused by the flicker.
[1.1 Configuration of the image processing apparatus according to the first embodiment]
FIG. 1 schematically shows a configuration example of an image processing system according to the first embodiment of the present disclosure. FIG. 2 schematically shows a configuration example of the binarization processing unit 5.
In the image processing system according to the present embodiment, image data photoelectrically converted in units of columns by the pixel unit 1 is output to the AD conversion unit 2. The AD conversion unit 2 performs AD conversion of each pixel in column parallel. The memory unit 3 latches the image data AD-converted by the AD conversion unit 2 in column parallel.
Alternatively, the specific color extraction unit 51 sets a threshold for each color level and extracts the pixel data of the specific color.
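A per-color-level threshold test of this kind can be sketched as follows (the target color, the threshold values, and the function name are illustrative assumptions, not values from the disclosure):

```python
def is_specific_color(r, g, b, r_min=150, g_max=80, b_max=80):
    """Threshold each color level; the specific color here is assumed to be red,
    and the threshold values are illustrative only."""
    return r >= r_min and g <= g_max and b <= b_max

print(is_specific_color(200, 40, 30))  # strongly red pixel -> True
print(is_specific_color(90, 200, 40))  # greenish pixel -> False
```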
Note that FIG. 5 shows the color space in correspondence with the HSV model. The HSV model is a color space model composed of three components: hue, saturation (chroma), and value (lightness/brightness).
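For reference, the RGB-to-HSV mapping underlying this model is available in Python's standard library `colorsys`; a small sketch (the 8-bit wrapper function is an assumption for illustration):

```python
import colorsys  # standard library RGB <-> HSV conversion

def rgb8_to_hsv(r, g, b):
    """Convert an 8-bit RGB triple to (hue, saturation, value), each in [0, 1]."""
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)

print(rgb8_to_hsv(255, 0, 0))  # pure red   -> hue 0.0
print(rgb8_to_hsv(0, 255, 0))  # pure green -> hue 1/3
```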
As described above, according to the present embodiment, the target image 20 of the specific color is extracted based on the image data after multiplication by the adjustment parameters, so that the accuracy of target extraction can be improved.
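The overall flow of this embodiment, multiplying the image data by adjustment parameters derived from the color ratios and then binarizing by comparing color components, can be sketched as follows (the specific gain formula, the dominance margin, and the function name are illustrative assumptions, not the disclosed circuit):

```python
import numpy as np

def adjust_and_binarize(frame, target="r", margin=1.2):
    """Balance the channel levels with per-channel gains derived from the
    frame's mean color ratios, then output a binary mask that is 1 where the
    target channel dominates both other channels.

    frame: float array of shape (H, W, 3) in RGB order.
    """
    means = frame.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means          # boost channels with a low mean level
    balanced = frame * gains              # multiply by the adjustment parameters
    idx = "rgb".index(target)
    others = [i for i in range(3) if i != idx]
    mask = ((balanced[..., idx] > margin * balanced[..., others[0]])
            & (balanced[..., idx] > margin * balanced[..., others[1]]))
    return mask.astype(np.uint8)          # 1 = specific color, 0 = otherwise

frame = np.array([[[200.0, 40.0, 30.0], [60.0, 60.0, 60.0]],
                  [[50.0, 50.0, 50.0], [55.0, 45.0, 50.0]]])
print(adjust_and_binarize(frame))  # the single reddish pixel is marked 1
```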
Next, an image processing system according to the second embodiment of the present disclosure will be described. In the following, parts that are substantially the same as the components of the image processing system according to the first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted as appropriate.
As shown in FIG. 9, by performing an AE operation for every frame at a high frame rate in accordance with the sampled brightness, the influence of the flicker shown in FIG. 7 can be substantially eliminated. This makes it possible to suppress the occurrence of flicker in the image data output to the external device 12.
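The per-frame AE feedback can be sketched as follows (the gain law and the target brightness are illustrative assumptions; a real implementation would act on exposure time or analog gain rather than a simple digital multiply):

```python
import math

def ae_gain(frame_brightness, target=100.0):
    """Per-frame AE: a gain that maps the sampled brightness back to the target."""
    return target / frame_brightness

# A light source flickering sinusoidally around a mean brightness of 100;
# applying the per-frame gain restores every frame to the target brightness.
frames = [100.0 + 30.0 * math.sin(2 * math.pi * k / 8) for k in range(8)]
corrected = [b * ae_gain(b) for b in frames]
print(corrected)
```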
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
The technology according to the present disclosure is not limited to the description of the above embodiments, and various modifications are possible.
(1)
An image processing apparatus including:
a multiplier that receives image data from a pixel unit including pixels of a plurality of colors, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel;
an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and
a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
(2)
The image processing apparatus according to (1), in which
the binarization processing unit includes
a specific color extraction unit that compares the color components of each pixel in the image data multiplied by the adjustment parameter, and extracts pixel data of the specific color.
(3)
The image processing apparatus according to (2), in which
the binarization processing unit further includes
a binarization output unit that outputs binarized image data obtained by performing, on the image data multiplied by the adjustment parameter, binarization processing in which the pixel data of the specific color extracted by the specific color extraction unit is set to 1 and pixel data other than the specific color is set to 0.
(4)
The image processing apparatus according to (3), further including
a drive information generation unit that generates, based on the binarized image data, information for driving an actuator that tracks the target of the specific color.
(5)
The image processing apparatus according to any one of (1) to (4), in which
the adjustment unit includes
an integrator that integrates the image data for each pixel over a predetermined period, and
a color adjustment ratio calculator that calculates the ratio of each color for each pixel in the integrated image data.
(6)
An imaging device including:
a pixel unit including pixels of a plurality of colors;
a multiplier that receives image data from the pixel unit, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel;
an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and
a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
(7)
An image processing system including:
an imaging device; and
an actuator that causes the imaging device to track and photograph a target of a specific color,
the imaging device including:
a pixel unit including pixels of a plurality of colors;
a multiplier that receives image data from the pixel unit, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel;
an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and
a binarization processing unit that extracts the target image of the specific color based on the image data multiplied by the adjustment parameter.
Claims (7)
- An image processing apparatus including:
a multiplier that receives image data from a pixel unit including pixels of a plurality of colors, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel;
an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and
a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- The image processing apparatus according to claim 1, wherein the binarization processing unit includes
a specific color extraction unit that compares the color components of each pixel in the image data multiplied by the adjustment parameter, and extracts pixel data of the specific color.
- The image processing apparatus according to claim 2, wherein the binarization processing unit further includes
a binarization output unit that outputs binarized image data obtained by performing, on the image data multiplied by the adjustment parameter, binarization processing in which the pixel data of the specific color extracted by the specific color extraction unit is set to 1 and pixel data other than the specific color is set to 0.
- The image processing apparatus according to claim 3, further including
a drive information generation unit that generates, based on the binarized image data, information for driving an actuator that tracks the target of the specific color.
- The image processing apparatus according to claim 1, wherein the adjustment unit includes
an integrator that integrates the image data for each pixel over a predetermined period, and
a color adjustment ratio calculator that calculates the ratio of each color for each pixel in the integrated image data.
- An imaging device including:
a pixel unit including pixels of a plurality of colors;
a multiplier that receives image data from the pixel unit, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel;
an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and
a binarization processing unit that extracts a target image of a specific color based on the image data multiplied by the adjustment parameter.
- An image processing system including:
an imaging device; and
an actuator that causes the imaging device to track and photograph a target of a specific color,
wherein the imaging device includes:
a pixel unit including pixels of a plurality of colors;
a multiplier that receives image data from the pixel unit, and multiplies the image data by an adjustment parameter for adjusting the color level of each pixel;
an adjustment unit that calculates the ratio of each color for each pixel in the image data, and adjusts the value of the adjustment parameter based on the ratio of each color; and
a binarization processing unit that extracts the target image of the specific color based on the image data multiplied by the adjustment parameter.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17813080.3A EP3474534B1 (en) | 2016-06-17 | 2017-05-17 | Image processing apparatus, imaging apparatus, and image processing system |
JP2018523602A JP6977722B2 (ja) | 2016-06-17 | 2017-05-17 | 撮像装置、および画像処理システム |
US16/308,660 US11202046B2 (en) | 2016-06-17 | 2017-05-17 | Image processor, imaging device, and image processing system |
KR1020187029716A KR102392221B1 (ko) | 2016-06-17 | 2017-05-17 | 화상 처리 장치, 및 촬상 장치, 및 화상 처리 시스템 |
CN201780028141.6A CN109076167A (zh) | 2016-06-17 | 2017-05-17 | 图像处理器、摄像装置和图像处理系统 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-120965 | 2016-06-17 | ||
JP2016120965 | 2016-06-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017217177A1 (ja) | 2017-12-21 |
Family
ID=60663106
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/018538 WO2017217177A1 (ja) | 2016-06-17 | 2017-05-17 | 画像処理装置、および撮像装置、ならびに画像処理システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US11202046B2 (ja) |
EP (1) | EP3474534B1 (ja) |
JP (1) | JP6977722B2 (ja) |
KR (1) | KR102392221B1 (ja) |
CN (1) | CN109076167A (ja) |
WO (1) | WO2017217177A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109299711A (zh) * | 2018-12-25 | 2019-02-01 | 常州纺织服装职业技术学院 | 颜色跟踪方法和装置 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018092495A (ja) * | 2016-12-07 | 2018-06-14 | ソニーセミコンダクタソリューションズ株式会社 | 画像センサ |
US10863155B2 (en) * | 2018-12-24 | 2020-12-08 | Gopro, Inc. | Reduction of banding artifacts in image processing |
CN109961450B (zh) * | 2019-02-19 | 2021-08-24 | 厦门码灵半导体技术有限公司 | 图像二值化处理方法、装置、存储介质和电子设备 |
CN112132861B (zh) * | 2020-08-17 | 2024-04-26 | 浙江大华技术股份有限公司 | 一种车辆事故监测方法、系统和计算机设备 |
CN113421183B (zh) * | 2021-05-31 | 2022-09-20 | 中汽数据(天津)有限公司 | 车辆环视全景图的生成方法、装置、设备和存储介质 |
CN113942458B (zh) * | 2021-10-29 | 2022-07-29 | 禾多科技(北京)有限公司 | 用于车载摄像头调整系统的控制方法、装置、设备和介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09298693A (ja) * | 1996-05-08 | 1997-11-18 | Fuji Photo Film Co Ltd | 電子スチルカメラの記録制御方法 |
JP2003259352A (ja) * | 2002-02-26 | 2003-09-12 | Minolta Co Ltd | 物体検出装置、物体検出方法、および物体検出プログラム |
JP2012178788A (ja) * | 2011-02-28 | 2012-09-13 | Sony Corp | 画像処理装置と画像処理方法およびプログラム |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01173269A (ja) | 1987-12-28 | 1989-07-07 | Fujitsu Ltd | 色物体位置自動測定装置 |
JP2748678B2 (ja) * | 1990-10-09 | 1998-05-13 | 松下電器産業株式会社 | 階調補正方法および階調補正装置 |
JPH06203157A (ja) * | 1992-10-14 | 1994-07-22 | Fujitsu Ltd | カラー画像処理方法および装置 |
JPH0786936A (ja) | 1993-09-14 | 1995-03-31 | Nippon Steel Corp | A/dコンバータ |
EP1489833A1 (en) * | 1997-06-17 | 2004-12-22 | Seiko Epson Corporation | Image processing apparatus, image processing method, color adjustment method, and color adjustment system |
JP4649734B2 (ja) * | 2000-12-08 | 2011-03-16 | 株式会社ニコン | 映像信号処理装置および映像信号処理プログラムを記録した記録媒体 |
US7196724B2 (en) * | 2000-12-08 | 2007-03-27 | Nikon Corporation | Image signal processing device, digital camera and computer program product that perform white balance adjustment using pixel outputs from extracted areas |
US7633523B2 (en) * | 2001-02-09 | 2009-12-15 | Olympus Corporation | Image capturing device using correction information for preventing at least a part of correction process from being performed when image data is corrected at an external device |
JP3823314B2 (ja) | 2001-12-18 | 2006-09-20 | ソニー株式会社 | 撮像信号処理装置及びフリッカ検出方法 |
JP2003348601A (ja) * | 2002-05-27 | 2003-12-05 | Fuji Photo Film Co Ltd | オートホワイトバランス制御方法及び電子カメラ |
KR100707269B1 (ko) * | 2005-08-18 | 2007-04-16 | 삼성전자주식회사 | 영상의 칼라를 픽셀에 적응적으로 조절할 수 있는칼라조절장치 및 방법 |
EP2023599A4 (en) * | 2006-05-12 | 2012-02-22 | Konica Minolta Business Corp Inc | IMAGE PROCESSING AND ARRANGEMENT |
JP5111789B2 (ja) | 2006-06-08 | 2013-01-09 | オリンパスイメージング株式会社 | ズームレンズ及びそれを備えた電子撮像装置 |
JP2008306377A (ja) * | 2007-06-06 | 2008-12-18 | Fuji Xerox Co Ltd | 色調整装置および色調整プログラム |
JP2009193429A (ja) * | 2008-02-15 | 2009-08-27 | Mitsubishi Electric Corp | 画像読取装置 |
JP5147994B2 (ja) * | 2009-12-17 | 2013-02-20 | キヤノン株式会社 | 画像処理装置およびそれを用いた撮像装置 |
JP2011203814A (ja) * | 2010-03-24 | 2011-10-13 | Sony Corp | 画像処理装置および方法、プログラム |
WO2011148799A1 (ja) * | 2010-05-28 | 2011-12-01 | 富士フイルム株式会社 | 撮像装置及びホワイトバランスゲイン算出方法 |
JP2012002541A (ja) * | 2010-06-14 | 2012-01-05 | Sony Corp | 画像処理装置、画像処理方法、プログラム、及び電子機器 |
CN102170640A (zh) * | 2011-06-01 | 2011-08-31 | 南通海韵信息技术服务有限公司 | 基于模式库的智能手机端不良内容网站鉴别方法 |
US9196056B2 (en) * | 2013-08-19 | 2015-11-24 | Manufacturing Techniques, Inc. | Electro-optical system and method for analyzing images of a scene to identify the presence of a target color |
JP2015115922A (ja) | 2013-12-16 | 2015-06-22 | オリンパス株式会社 | 撮像装置および撮像方法 |
JP2016161763A (ja) * | 2015-03-02 | 2016-09-05 | 株式会社ジャパンディスプレイ | 表示装置 |
CN105491120A (zh) * | 2015-12-01 | 2016-04-13 | 小米科技有限责任公司 | 图片传输的方法及装置 |
Worldwide applications (2017):
- 2017-05-17 US US16/308,660 → US11202046B2 (Active)
- 2017-05-17 JP JP2018523602A → JP6977722B2 (Active)
- 2017-05-17 WO PCT/JP2017/018538 → WO2017217177A1 (status unknown)
- 2017-05-17 CN CN201780028141.6A → CN109076167A (Pending)
- 2017-05-17 KR KR1020187029716A → KR102392221B1 (IP Right Grant)
- 2017-05-17 EP EP17813080.3A → EP3474534B1 (Active)
Non-Patent Citations (1)
Title |
---|
See also references of EP3474534A4 * |
Also Published As
Publication number | Publication date |
---|---|
US11202046B2 (en) | 2021-12-14 |
JP6977722B2 (ja) | 2021-12-08 |
EP3474534B1 (en) | 2023-12-20 |
US20190166345A1 (en) | 2019-05-30 |
KR102392221B1 (ko) | 2022-05-02 |
JPWO2017217177A1 (ja) | 2019-04-04 |
EP3474534A4 (en) | 2019-10-16 |
CN109076167A (zh) | 2018-12-21 |
KR20190019904A (ko) | 2019-02-27 |
EP3474534A1 (en) | 2019-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10957029B2 (en) | Image processing device and image processing method | |
JP6977722B2 (ja) | 撮像装置、および画像処理システム | |
US10880498B2 (en) | Image processing apparatus and image processing method to improve quality of a low-quality image | |
WO2018074252A1 (ja) | 画像処理装置および画像処理方法 | |
US10704957B2 (en) | Imaging device and imaging method | |
WO2018179671A1 (ja) | 画像処理装置と画像処理方法および撮像装置 | |
JP2018207453A (ja) | 制御装置、撮像装置、制御方法、プログラム及び撮像システム | |
WO2018016150A1 (ja) | 画像処理装置と画像処理方法 | |
WO2018016151A1 (ja) | 画像処理装置と画像処理方法 | |
JP2018064007A (ja) | 固体撮像素子、および電子装置 | |
US11375137B2 (en) | Image processor, image processing method, and imaging device | |
US11585898B2 (en) | Signal processing device, signal processing method, and program | |
WO2018008408A1 (ja) | 固体撮像装置、補正方法、および電子装置 | |
JP6981416B2 (ja) | 画像処理装置と画像処理方法 | |
JP7059185B2 (ja) | 画像処理装置、画像処理方法、および撮像装置 | |
WO2021229983A1 (ja) | 撮像装置及びプログラム | |
WO2024004644A1 (ja) | センサ装置 | |
WO2018135208A1 (ja) | 撮像装置と撮像システム | |
WO2019111651A1 (ja) | 撮像システム、画像処理装置、及び、画像処理方法 | |
US10791287B2 (en) | Imaging control apparatus and method, and vehicle | |
JP2018011246A (ja) | 固体撮像装置、補正方法、および電子装置 | |
JP2022131079A (ja) | 画像処理装置、画像処理方法及び画像処理システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 20187029716 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2018523602 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17813080 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017813080 Country of ref document: EP Effective date: 20190117 |