WO2022219874A1 - Signal processing device and method, and program - Google Patents


Publication number
WO2022219874A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2022/002779
Other languages
French (fr)
Japanese (ja)
Inventor
駿 阿久津
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Application filed by ソニーセミコンダクタソリューションズ株式会社
Publication of WO2022219874A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems

Definitions

  • the present technology relates to a signal processing device, method, and program, and more particularly to a signal processing device, method, and program capable of suppressing the occurrence of artifacts.
  • Conventionally, an image sensor having a sub-pixel structure in which large pixels and small pixels with different sizes (sensitivities) are provided in one unit pixel is known (see, for example, Patent Document 1).
  • By HDR (High Dynamic Range) synthesis of an image captured with the large pixels and an image captured with the small pixels, an image with a wider dynamic range can be obtained.
  • When the exposure time is set to differ between large pixels and small pixels, there will be differences in the imaging results for moving subjects and for subjects such as LEDs (Light Emitting Diodes) between the large pixels and the small pixels. Since such differences cause moving-object artifacts during HDR synthesis, it is common to perform motion determination and to apply correction according to the determination result.
  • Patent Literature 2 discloses detecting the difference between a plurality of images captured with different exposure times and combining the plurality of images with a composition coefficient according to the difference.
  • the presence or absence of motion is determined based on the difference between two images having different sensitivities, that is, the magnitude of the difference between the two image signals.
  • Such motion determination is performed on the premise that if the subject is stationary, the image signals (pixel values) will match if the sensitivity difference between the image signals with different sensitivities is corrected.
  • This technology has been developed in view of this situation, and is intended to suppress the occurrence of artifacts.
  • a signal processing device according to one aspect of the present technology includes: a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts a low-frequency component; a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which has a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts a low-frequency component and corrects the phase difference; and a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • a signal processing method or program according to one aspect of the present technology includes the steps of: generating a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; generating a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which has a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and performing motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • in one aspect of the present technology, a first low-frequency luminance signal is generated by filtering a first image signal with a first filter that extracts low-frequency components; a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal is generated by filtering a second image signal, which has a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and motion determination is performed based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • FIG. 5 is a diagram showing an example of a large-pixel luminance generation filter and a small-pixel luminance generation filter; FIG. 6 is a flowchart explaining image compositing processing; FIG. 7 is a diagram explaining the effect of the present technology.
  • FIG. 8 is a diagram showing another example of a small-pixel luminance generation filter; FIG. 9 is a diagram showing a configuration example of a computer.
  • FIG. 10 is a block diagram showing an example of a schematic configuration of a vehicle control system; FIG. 11 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection unit and an imaging unit.
  • the present technology is intended to suppress the occurrence of artifacts by using a low-frequency luminance signal for motion determination and performing phase difference correction only on the signal used for motion determination.
  • the image sensor is provided with unit pixels arranged, for example, as shown in FIG. 1.
  • a large pixel SP1 and a small pixel SP2 having different sensitivities are provided adjacent to each other.
  • the large pixel SP1 and the small pixel SP2 are formed so that the area (size) of the light receiving surface of the large pixel SP1 is larger than that of the small pixel SP2, and the large pixel SP1 also has a higher sensitivity than the small pixel SP2. Also, here, the small pixel SP2 is arranged on the upper right side of the large pixel SP1 in the figure.
  • multiple unit pixels with the same configuration as the unit pixel PX1 are arranged in a matrix.
  • each unit pixel is provided with a color filter of one of R (red), G (green), and B (blue).
  • unit pixels are arranged in a checkered pattern.
  • in some rows, unit pixels having a B color filter and unit pixels having a G color filter are arranged alternately in the horizontal direction in the drawing.
  • in the remaining rows, unit pixels having an R color filter and unit pixels having a G color filter are arranged alternately in the horizontal direction in the drawing.
  • the unit pixels adjacent to the unit pixel PX1 in the vertical direction in the figure have an R color filter, and the unit pixels adjacent to the unit pixel PX1 in the horizontal direction have a B color filter.
  • each unit pixel of the image sensor may have any color filter, and each unit pixel may have a configuration in which no color filter is provided.
  • hereinafter, an image signal of an image captured by the large pixels provided in the image sensor, that is, an image signal composed of the pixel signals read out from each large pixel when an exposure operation (imaging processing) is performed on the image sensor, is referred to as a large pixel image signal.
  • an image based on a large pixel image signal is also referred to as a large pixel image.
  • similarly, an image signal of an image captured by the small pixels provided in the image sensor, that is, an image signal composed of the pixel signals read out from each small pixel after an exposure operation is performed on the image sensor, is referred to as a small pixel image signal.
  • An image based on a small pixel image signal is also called a small pixel image.
  • in the present technology, motion determination is performed based on the large pixel image (large pixel image signal), the small pixel image (small pixel image signal), and an HDR synthesis gain, and HDR synthesis of the large pixel image and the small pixel image is performed using a predetermined synthesis coefficient obtained from the result of the motion determination.
  • the HDR synthesis gain is a coefficient for matching the brightness of the large pixel image and the small pixel image, and is calculated in advance based on, for example, the sensitivity difference and the exposure time difference between the large pixel image and the small pixel image.
  • the synthesis coefficient is a coefficient that indicates the synthesis ratio (mixing ratio) of the large pixel image and the small pixel image at the time of HDR synthesis.
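  • the role of the HDR synthesis gain can be sketched as follows (a minimal illustration, not taken from the patent; the function name and the example sensitivity and exposure ratios are assumptions):

```python
# Illustrative sketch (names and example ratios are assumptions, not from the
# patent): the HDR synthesis gain maps small-pixel signal levels onto the
# large-pixel scale so the two images can be compared and mixed.
def hdr_synthesis_gain(sensitivity_ratio, exposure_ratio):
    """Gain matching small-pixel brightness to the large-pixel scale.

    sensitivity_ratio: large-pixel sensitivity / small-pixel sensitivity
    exposure_ratio:    large-pixel exposure time / small-pixel exposure time
    """
    return sensitivity_ratio * exposure_ratio

gain = hdr_synthesis_gain(8.0, 2.0)   # e.g. 8x sensitivity, 2x exposure
matched = 100 * gain                  # a small-pixel value of 100 becomes
                                      # comparable to large-pixel levels
```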
  • the large pixel image and the small pixel image are basically the same image.
  • suppose that the pixel in the large pixel image corresponding to the large pixel SP1 in FIG. 1 is the pixel of interest. In that case, the pixel corresponding to the pixel of interest, that is, the pixel in the small pixel image at the same position as the pixel of interest in the large pixel image (hereinafter also referred to as the corresponding pixel), is the pixel corresponding to the small pixel SP2.
  • although the pixel of interest and the corresponding pixel have the same positional relationship in their images, the large pixel SP1 and the small pixel SP2 have different arrangement positions as shown in FIG. 1, that is, there is a deviation in their arrangement positions. Therefore, even if the subject is stationary, different subjects (different parts of the subject) are captured at the pixel of interest and the corresponding pixel.
  • the presence or absence of motion is determined based on the magnitude of the difference between the large-pixel image and the small-pixel image, which have different sensitivities.
  • the signal levels (pixel values) shown in FIG. 3 can be obtained with respect to the brightness of the subject with large pixels and small pixels.
  • the horizontal axis indicates the brightness of the subject
  • the vertical axis indicates the pixel signal value (pixel value), that is, the signal level when the subject is captured.
  • the polygonal line L11 indicates the signal level (pixel signal value) obtained at the large pixel SP1 for each brightness, and the straight line L12 indicates the signal level (pixel signal value) obtained at the small pixel SP2 for each brightness.
  • the center position of the large pixel SP1 and the center position of the small pixel SP2 are at the same position.
  • for example, for a certain brightness of a stationary subject, a signal level P11 is obtained as the pixel signal value of the small pixel SP2, and a signal level P12 is obtained as the pixel signal value of the large pixel SP1.
  • the value obtained by multiplying the signal level P11 by the HDR synthesis gain matches the signal level P12.
  • on the other hand, when the subject is moving, as indicated by an arrow Q12, a signal level P21 is obtained as the pixel signal value of the small pixel SP2, and a signal level P22 is obtained as the pixel signal value of the large pixel SP1.
  • in this case, the value obtained by multiplying the signal level P21 by the HDR synthesis gain matches the signal level P23 at the same brightness on the polygonal line L11, but does not match P22, the actual signal level of the large pixel SP1.
  • this is because, when the subject being imaged is a moving subject and the exposure time or the like differs between the large pixel image and the small pixel image, the part of the subject captured changes between the large pixel SP1 and the small pixel SP2, and the brightness of the subject therefore changes.
  • the image signal of the large pixel image and the image signal of the small pixel image, that is, the pixel signals of the large pixels and the pixel signals of the small pixels, are out of phase.
  • therefore, it is not guaranteed that the pixel signal obtained by multiplying the pixel signal of a small pixel by the HDR synthesis gain will match the pixel signal of the large pixel corresponding to that small pixel, and the premise of motion determination that the pixel signals of small pixels and large pixels should match does not hold.
  • as a result, even for a stationary subject, the phase difference between the pixel signals of the large pixels and the small pixels can make the signal difference large, resulting in an erroneous determination.
  • it is also conceivable to perform phase difference correction on the image signal of the small pixel image so that the pixel signals of the large pixels and the pixel signals of the small pixels become signals of the same phase, and then to perform motion determination and HDR synthesis.
  • however, when phase difference correction is performed on an image, artifacts such as false colors and jaggedness occur, causing adverse effects such as loss of image sharpness; suppressing these requires additional processing such as noise reduction, so the cost becomes high.
  • moreover, even if noise reduction or the like is performed after phase difference correction, the resulting synthesized image may have lower sharpness than when motion determination is performed without phase difference correction.
  • a low-frequency luminance signal is used to determine motion, and such a low-frequency signal can be phase-corrected (shifted) more easily than a high-frequency signal. Therefore, according to the present technology, occurrence of erroneous determination and erroneous correction can be suppressed without causing an increase in cost such as an increase in circuit scale.
  • FIG. 4 is a diagram illustrating a configuration example of an embodiment of a signal processing device to which the present technology is applied.
  • the signal processing device 11 shown in FIG. 4 is composed of, for example, an in-vehicle camera device.
  • the signal processing device 11 has an image sensor 21, a signal generation unit 22, a motion determination unit 23, and an HDR synthesis unit 24.
  • the image sensor 21 is an image sensor having a sub-pixel structure, and functions as an imaging unit that generates an image signal of a large-pixel image and an image signal of a small-pixel image by performing imaging.
  • the image sensor 21 is provided with a plurality of large pixels and small pixels having different sensitivities and phases (arrangement positions) from each other in the arrangement shown in FIG.
  • the image sensor 21 captures an image of a subject by photoelectrically converting incident light, and supplies the image signals of the resulting large pixel image and small pixel image to the signal generation unit 22 and the HDR synthesis unit 24. That is, the image signal of the large pixel image is generated by imaging with the plurality of large pixels forming the image sensor 21, and the image signal of the small pixel image is generated by imaging with the plurality of small pixels forming the image sensor 21.
  • the image signal of the large pixel image and the image signal of the small pixel image are signals with phases different from each other.
  • the image sensor 21 may be provided outside the signal processing device 11 .
  • the signal generation unit 22 generates low-frequency luminance signals for motion determination based on the image signal of the large pixel image and the image signal of the small pixel image supplied from the image sensor 21, and supplies them to the motion determination unit 23.
  • the signal generation unit 22 has a large pixel filter processing unit 31 and a small pixel filter processing unit 32.
  • the large pixel filter processing unit 31 performs filtering on the image signal of the large pixel image using a large-pixel luminance generation filter, thereby generating a large pixel low-frequency luminance signal, which is the low-frequency luminance signal of the large pixel image.
  • the large-pixel luminance generation filter is a low-pass filter that extracts low-frequency components, more specifically, luminance components below a predetermined frequency (low-frequency luminance components) from the image signal of the large-pixel image.
  • the small pixel filter processing unit 32 performs filtering on the image signal of the small pixel image using a small-pixel luminance generation filter, thereby generating a small pixel low-frequency luminance signal, which is the phase-difference-corrected low-frequency luminance signal of the small pixel image.
  • the small-pixel low-frequency luminance signal thus obtained is a signal having the same phase as the large-pixel low-frequency luminance signal.
  • the luminance generation filter for small pixels is a filter different from the luminance generation filter for large pixels.
  • the small-pixel luminance generation filter is designed not only to function as a low-pass filter that extracts low-frequency components (low-frequency luminance components) from the image signal of the small pixel image, but also to realize phase difference correction, such as a phase shift, at the same time.
  • the signal generation unit 22 supplies the large pixel low-frequency luminance signal obtained by the large pixel filter processing unit 31 and the small pixel low-frequency luminance signal obtained by the small pixel filter processing unit 32 to the motion determination unit 23 as the low-frequency luminance signals for motion determination.
  • the motion determination unit 23 performs motion determination based on the large-pixel low-frequency luminance signal and the small-pixel low-frequency luminance signal supplied from the signal generation unit 22, and supplies the determination result to the HDR synthesis unit 24.
  • the HDR synthesizing unit 24 HDR synthesizes the image signal of the large pixel image and the image signal of the small pixel image supplied from the image sensor 21 based on the determination result of the motion determination supplied from the motion determining unit 23 .
  • the synthesized image obtained as a result is output to the subsequent stage.
  • for example, the large-pixel luminance generation filter and the small-pixel luminance generation filter are the filters shown in FIG. 5.
  • the portion indicated by arrow Q21 shows an example of the luminance generation filter for large pixels
  • the portion indicated by arrow Q22 shows an example of the luminance generation filter for small pixels.
  • the large-pixel luminance generation filter is a filter of size 3×3, and the numerical value written in each square represents the filter coefficient by which the pixel value of the pixel corresponding to that square is multiplied.
  • the size of the filter here means the number of taps of the filter, that is, the number of pixels (size) of the pixel area to be filtered.
  • specifically, the pixel signal values (pixel values) of the large pixels adjacent to the upper left, directly above, and upper right of the large pixel SP1 in FIG. 1 are multiplied by the coefficient values 1, 2, and 1, respectively; the pixel values of the large pixel to the left of the large pixel SP1, the large pixel SP1 itself, and the large pixel to the right are multiplied by 2, 4, and 2; and the pixel signal values (pixel values) of the large pixels adjacent to the lower left, directly below, and lower right are multiplied by the coefficient values 1, 2, and 1.
  • the sum of the nine pixel values multiplied by the coefficients in this way (the sum of the pixel values after coefficient multiplication) is obtained, the sum is divided by 16, which is the sum of the nine coefficients, and the result of the division is taken as the value of the pixel signal of one pixel obtained by the filtering, that is, the value of the large pixel low-frequency luminance signal for one pixel.
  • the phase of the newly generated large pixel low-frequency luminance signal, that is, the position of the pixel, is the center position of the large pixel SP1.
  • the coefficient indicated by the arrow Q21 is used regardless of the color of the large pixel color filter in the center of the target 3 ⁇ 3 pixel region.
  • pixel signals of large pixels having R, G, and B color filters are used to generate one (one pixel) large pixel low-frequency luminance signal.
  • the small pixel luminance generation filter is also a 3 ⁇ 3 size filter like the large pixel luminance generation filter.
  • the numerical values written in each rectangle represent the coefficients of the filters by which the pixel values of the pixels corresponding to those rectangles are multiplied.
  • the small-pixel luminance generation filter is a 3×3 filter, but the coefficients of the pixels forming the uppermost pixel row and the rightmost pixel column in the target pixel region are all 0. Therefore, the small-pixel luminance generation filter is substantially a filter of size 2×2.
  • specifically, the pixel signal values (pixel values) of the small pixel SP2 and of the small pixels positioned to its left, directly below it, and to its lower left are each multiplied by the coefficient value 1.
  • the sum of the four pixel values multiplied by the coefficients in this way is obtained, the sum is divided by 4, which is the sum of the four coefficients, and the result of the division is taken as the value of the pixel signal of one pixel obtained by the filtering, that is, the value of the small pixel low-frequency luminance signal for one pixel.
  • the phase of the newly generated small pixel low-frequency luminance signal, that is, the position of the pixel, is the center position of the four small pixels used to generate the small pixel low-frequency luminance signal, which coincides with the center position of the large pixel SP1. Therefore, the generated small pixel low-frequency luminance signal has the same phase as the corresponding large pixel low-frequency luminance signal, and phase difference correction is realized by the small-pixel luminance generation filter.
  • the large-pixel luminance generation filter and the small-pixel luminance generation filter are filters having at least different coefficients.
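  • the two filters described above amount to small weighted-average kernels, which can be sketched as follows (an illustration, not the patent's implementation; the helper name and toy pixel values are assumptions, and the middle row of the large-pixel kernel is inferred from the stated coefficient sum of 16):

```python
# LARGE_KERNEL follows the coefficients described for the large-pixel
# luminance generation filter (middle row 2, 4, 2 inferred from sum = 16).
# SMALL_KERNEL zeroes the top row and right column, leaving a 2x2 average
# whose centroid is shifted toward the lower left, realising the phase
# difference correction.
LARGE_KERNEL = [[1, 2, 1],
                [2, 4, 2],
                [1, 2, 1]]   # sum 16, centred on the large pixel SP1

SMALL_KERNEL = [[0, 0, 0],
                [1, 1, 0],
                [1, 1, 0]]   # sum 4; substantially a 2x2 filter

def apply_filter(neigh, kernel):
    """Weighted average of a 3x3 neighbourhood: sum(pixel * coeff) / sum(coeff)."""
    total = sum(neigh[r][c] * kernel[r][c] for r in range(3) for c in range(3))
    return total / sum(sum(row) for row in kernel)

neigh = [[10, 20, 10],
         [20, 40, 20],
         [10, 20, 10]]
large_lf = apply_filter(neigh, LARGE_KERNEL)  # one large-pixel low-frequency value
small_lf = apply_filter(neigh, SMALL_KERNEL)  # one small-pixel low-frequency value
```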
  • in step S11, the large pixels and small pixels of the image sensor 21 receive and photoelectrically convert incident light, thereby capturing a large pixel image and a small pixel image.
  • at this time, the large pixels and the small pixels perform imaging so that the length of the exposure time and the timing of the start or end of exposure differ between them.
  • the image sensor 21 supplies the image signal of the large pixel image obtained by imaging with each large pixel and the image signal of the small pixel image obtained by imaging with each small pixel to the signal generation unit 22 and the HDR synthesis unit 24.
  • in step S12, the large pixel filter processing unit 31 performs filtering on the image signal of the large pixel image supplied from the image sensor 21 using the large-pixel luminance generation filter, thereby generating a large pixel low-frequency luminance signal for each of a plurality of pixel positions.
  • in step S13, the small pixel filter processing unit 32 performs filtering on the image signal of the small pixel image supplied from the image sensor 21 using the small-pixel luminance generation filter, thereby generating a small pixel low-frequency luminance signal for each of a plurality of pixel positions.
  • in step S13, a low-frequency luminance signal is extracted from the image signal of the small pixel image and phase difference correction (phase correction) is also performed, so that a small pixel low-frequency luminance signal having the same phase as the large pixel low-frequency luminance signal is obtained.
  • the signal generation unit 22 supplies the large-pixel low-frequency luminance signal and the small-pixel low-frequency luminance signal obtained by the above processing to the motion determination unit 23 .
  • although steps S12 and S13 are executed here before demosaic processing is performed on the large pixel image and the small pixel image, they may be executed after the demosaic processing.
  • in step S14, the motion determination unit 23 performs motion determination based on the large pixel low-frequency luminance signal and the small pixel low-frequency luminance signal supplied from the signal generation unit 22, and supplies the determination result to the HDR synthesis unit 24.
  • for example, the motion determination unit 23 calculates, for each pixel position, the difference between the value obtained by multiplying the small pixel low-frequency luminance signal by the HDR synthesis gain and the large pixel low-frequency luminance signal, and compares the obtained difference with a noise level and a predetermined motion determination threshold to obtain the motion amount of the subject. As a result of the motion determination, the amount of motion of the subject is obtained for each pixel position.
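  • a per-pixel sketch of this determination follows (illustrative only; the linear scaling between the noise level and the threshold is an assumption, since the text only says the difference is compared with them):

```python
# Hedged sketch of the per-pixel motion determination: the difference between
# the gain-matched small-pixel low-frequency luminance and the large-pixel
# low-frequency luminance is compared with a noise level and a threshold.
# The in-between linear ramp is an assumed mapping, not the patent's rule.
def motion_amount(small_lf, large_lf, hdr_gain, noise_level, threshold):
    """Estimate a motion amount (0..1) for one pixel position."""
    diff = abs(small_lf * hdr_gain - large_lf)
    if diff <= noise_level:   # difference explained by noise: no motion
        return 0.0
    if diff >= threshold:     # difference clearly beyond noise: motion
        return 1.0
    return (diff - noise_level) / (threshold - noise_level)
```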
  • in step S15, the HDR synthesis unit 24 generates the image signal of a synthesized image by HDR-synthesizing the image signal of the large pixel image and the image signal of the small pixel image supplied from the image sensor 21, based on the determination result of the motion determination supplied from the motion determination unit 23.
  • for example, the HDR synthesis unit 24 generates a synthesis coefficient for each pixel position based on the image signal of the large pixel image, adjusts the synthesis coefficient by performing motion compensation according to the determination result of the motion determination, and obtains the final synthesis coefficient. Then, the HDR synthesis unit 24 synthesizes (mixes) the value obtained by multiplying the image signal of the small pixel image by the HDR synthesis gain and the image signal of the large pixel image using the synthesis coefficient, and obtains the image signal of the synthesized image.
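  • for one pixel position, this synthesis step might look like the following sketch (the names and the specific motion-compensation rule of falling back to the gain-matched small-pixel value under motion are assumptions, not the patent's definition):

```python
# Sketch for one pixel position: base_coeff is the fraction taken from the
# large-pixel image, and detected motion pulls the mix toward the gain-matched
# small-pixel value (typically the shorter exposure) to avoid ghosting.
# This fallback rule is an assumed form of the "motion compensation" step.
def hdr_blend(large_value, small_value, hdr_gain, base_coeff, motion):
    coeff = base_coeff * (1.0 - motion)  # motion-compensated synthesis coefficient
    return coeff * large_value + (1.0 - coeff) * small_value * hdr_gain
```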
  • the signal processing device 11 generates a large-pixel low-frequency luminance signal and a small-pixel low-frequency luminance signal of the same phase, performs motion determination, synthesizes the large-pixel image and the small-pixel image, and combines them. Generate an image.
  • by performing motion determination using low-frequency luminance signals of the same phase in this way, it is possible to suppress the occurrence of erroneous determination and erroneous correction. As a result, it is possible to suppress the occurrence of artifacts and obtain a synthesized image of higher quality.
  • for example, the motion determination unit 23 can improve the determination accuracy of motion determination as shown in FIG. 7. Note that in FIG. 7, the brighter a portion of the subject (image), the greater the motion amount obtained for that portion in the motion determination.
  • in the region R11, a window of a building exists as the subject, and in the determination result indicated by the arrow Q31, it can be seen that an erroneous determination has occurred at the window of the building, which has a particularly fine pattern, that is, a portion containing high-frequency components.
  • in contrast, the portion indicated by the arrow Q32 shows the determination result obtained when the signal generation unit 22 extracts low-frequency luminance signals from the large pixel image and the small pixel image obtained by imaging the same subject and corrects the phase difference, and the motion determination unit 23 then performs motion determination. That is, the portion indicated by the arrow Q32 shows the result of motion determination obtained by the processing performed by the signal generation unit 22 and the motion determination unit 23 of the signal processing device 11.
  • in the determination result indicated by the arrow Q32, since phase difference correction is performed and motion determination is performed using the low-frequency luminance signals, it can be seen that the erroneous determination is suppressed.
  • moreover, for the HDR synthesis itself, the large pixel image and the small pixel image obtained by the image sensor 21, which are not low-frequency luminance signals, are used as they are. Artifacts such as jerkiness and adverse effects such as loss of image sharpness can therefore also be made less likely to occur.
  • a composite image of higher quality and wider dynamic range can be obtained.
  • although FIG. 5 illustrates an example in which the large-pixel luminance generation filter and the small-pixel luminance generation filter are filters of size 3×3, these filters may be filters of any size, such as 5×5.
  • the large-pixel luminance generation filter and the small-pixel luminance generation filter may be filters of the same size, or may be filters of different sizes.
  • furthermore, the small-pixel luminance generation filter may be, for example, the filter shown in FIG. 8.
  • in this example, each rectangle represents a pixel in the pixel region to which the small-pixel luminance generation filter is applied, and the numerical value in each rectangle represents the filter coefficient by which the pixel value of the pixel corresponding to that rectangle is multiplied.
  • this small-pixel luminance generation filter is a filter of size 3×3; the pixel values of the pixel at the center position of the target pixel region and of the pixels adjacent to it on the left, directly below, and at the lower left are multiplied by 63 as coefficients, the pixel values of the pixels adjacent to the center pixel at the upper left, directly above, on the right, and at the lower right are multiplied by 1 as coefficients, and the pixel value of the pixel adjacent at the upper right is multiplied by 0 as a coefficient.
  • the sum of the pixel values multiplied by these coefficients is divided by 256, which is the sum of the coefficients, and the result of the division is taken as the value of the pixel signal of one pixel obtained by the filtering, that is, the value of the small pixel low-frequency luminance signal for one pixel. In this way as well, extraction of the low-frequency luminance signal and correction of the phase difference can be realized.
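  • read this way, the coefficients can be laid out as below (a reconstruction: the four 63-weighted positions are assumed to be the centre, left, directly-below, and lower-left pixels, chosen so that the coefficients sum to the divisor 256):

```python
# Reconstructed coefficient layout for the alternative small-pixel luminance
# generation filter. The placement of the 63s is an assumption consistent
# with the 2x2 phase shift toward the lower left and with sum(coeffs) == 256.
ALT_SMALL_KERNEL = [[ 1,  1, 0],   # upper left, above, upper right
                    [63, 63, 1],   # left, centre, right
                    [63, 63, 1]]   # lower left, below, lower right

def alt_small_filter(neigh):
    """Weighted average of a 3x3 small-pixel neighbourhood, divided by 256."""
    total = sum(neigh[r][c] * ALT_SMALL_KERNEL[r][c]
                for r in range(3) for c in range(3))
    return total / 256
```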
  • two images with different phases may be captured using two cameras, motion determination and HDR synthesis may be performed based on the two images, and a synthesized image may be generated.
  • the series of processes described above can be executed by hardware or by software.
  • a program that constitutes the software is installed in the computer.
  • the computer includes, for example, a computer built into dedicated hardware and a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 9 is a block diagram showing a hardware configuration example of a computer that executes the series of processes described above by a program.
  • in the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
  • An input/output interface 505 is further connected to the bus 504 .
  • An input unit 506 , an output unit 507 , a recording unit 508 , a communication unit 509 and a drive 510 are connected to the input/output interface 505 .
  • the input unit 506 consists of a keyboard, mouse, microphone, imaging device, and the like.
  • the output unit 507 includes a display, a speaker, and the like.
  • a recording unit 508 is composed of a hard disk, a nonvolatile memory, or the like.
  • a communication unit 509 includes a network interface and the like.
  • a drive 510 drives a removable recording medium 511 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
  • In the computer configured as described above, the CPU 501 loads the program recorded in the recording unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes it, whereby the series of processes described above is performed.
  • the program executed by the computer (CPU 501) can be provided by being recorded on a removable recording medium 511 such as a package medium, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the recording unit 508 via the input/output interface 505 by loading the removable recording medium 511 into the drive 510 . Also, the program can be received by the communication unit 509 and installed in the recording unit 508 via a wired or wireless transmission medium. In addition, the program can be installed in the ROM 502 or the recording unit 508 in advance.
  • the program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or may be a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
  • the signal processing device 11 described above can be used in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays, for example, as follows.
  • Devices that capture images for viewing purposes, such as digital cameras and mobile devices with camera functions
  • Devices used for transportation, such as in-vehicle sensors that capture images behind, around, and inside the vehicle, surveillance cameras that monitor running vehicles and roads, and ranging sensors that measure the distance between vehicles
  • Devices used in home appliances such as TVs, refrigerators, and air conditioners to capture images of gestures and operate the devices according to those gestures
  • Devices used for medical and healthcare purposes, such as endoscopes and devices that perform angiography by receiving infrared light
  • Devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication
  • Devices used for beauty care, such as skin measuring instruments that photograph the skin and microscopes that photograph the scalp
  • Devices used for sports, such as action cameras and wearable cameras for sports applications
  • Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
  • the technology (the present technology) according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
  • FIG. 10 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • a vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an exterior information detection unit 12030, an interior information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, winkers or fog lamps.
  • the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
  • the body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
  • the vehicle exterior information detection unit 12030 is connected with an imaging section 12031.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface, or distance detection processing, based on the received image.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
  • the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • the microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control intended to implement the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or shock mitigation for the vehicle, follow-up driving based on inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane deviation warning, and the like.
  • the microcomputer 12051 controls the driving force generator, the steering mechanism, the braking device, etc. based on the information about the vehicle surroundings acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, so that the driver's Cooperative control can be performed for the purpose of autonomous driving, etc., in which vehicles autonomously travel without depending on operation.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control aimed at preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 11 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose of the vehicle 12100, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior, for example.
  • An imaging unit 12101 provided on the front nose and an imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images of the area in front of the vehicle 12100.
  • Imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the areas to the sides of the vehicle 12100.
  • An imaging unit 12104 provided on the rear bumper or back door mainly acquires images of the area behind the vehicle 12100.
  • Forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 11 shows an example of the imaging range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change of this distance over time (the relative velocity with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the course of the vehicle 12100 and that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be secured from the preceding vehicle, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, cooperative control can be performed for the purpose of automatic driving in which the vehicle travels autonomously without relying on the operation of the driver.
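The preceding-vehicle extraction rule above can be sketched as a simple selection over detected objects. This is an illustrative sketch only; the object fields (`distance_m`, `on_course`, `speed_kmh`) are hypothetical names, not part of the patent text.

```python
def extract_preceding_vehicle(objects, min_speed_kmh=0.0):
    """Pick the nearest on-course object moving in roughly the own
    vehicle's direction at `min_speed_kmh` or faster.

    Each object is a dict with hypothetical keys:
      distance_m - distance obtained from the imaging units
      on_course  - True if the object lies on the own vehicle's course
      speed_kmh  - speed component along the own vehicle's heading
    Returns None when no candidate qualifies.
    """
    candidates = [o for o in objects
                  if o["on_course"] and o["speed_kmh"] >= min_speed_kmh]
    return min(candidates, key=lambda o: o["distance_m"], default=None)

objs = [
    {"distance_m": 40.0, "on_course": True,  "speed_kmh": 55.0},
    {"distance_m": 25.0, "on_course": False, "speed_kmh": 60.0},  # adjacent lane
    {"distance_m": 60.0, "on_course": True,  "speed_kmh": -5.0},  # oncoming
]
print(extract_preceding_vehicle(objs)["distance_m"])  # 40.0
```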
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into motorcycles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then judges the collision risk indicating the degree of danger of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output an alarm to the driver via the audio speaker 12061 and the display unit 12062, and perform forced deceleration and avoidance steering via the drive system control unit 12010, thereby providing driving support for collision avoidance.
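The threshold rule in the collision-risk description above reduces to a small decision function. The action names and risk scale here are illustrative assumptions, not taken from the patent text.

```python
def collision_response(collision_risk, threshold):
    """Driving-support actions for a given collision risk.

    Follows the rule above: at or above the set value, warn the driver
    (audio speaker / display) and request forced deceleration and
    avoidance steering via the drive-system control unit. Action names
    are hypothetical labels for illustration.
    """
    if collision_risk >= threshold:
        return ["warn_driver", "forced_deceleration", "avoidance_steering"]
    return []  # below the set value: no intervention

print(collision_response(0.9, 0.7))
print(collision_response(0.3, 0.7))
```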
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian exists in the images captured by the imaging units 12101 to 12104.
  • recognition of a pedestrian is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
  • when a pedestrian is recognized, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating the pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to the imaging unit 12031, the vehicle exterior information detection unit 12030, and the like among the configurations described above.
  • Specifically, by using the signal processing device 11 shown in FIG. 4 as the imaging unit 12031 or the vehicle exterior information detection unit 12030, the occurrence of erroneous determination during motion determination can be suppressed, and a higher-quality composite image with a wide dynamic range can be obtained.
  • this technology can take the configuration of cloud computing in which one function is shared by multiple devices via a network and processed jointly.
  • each step described in the flowchart above can be executed by a single device, or can be shared by a plurality of devices.
  • Furthermore, when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or shared and executed by multiple devices.
  • this technology can also be configured as follows.
  • (1) A signal processing device comprising: a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, having a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • (2) The signal processing device according to (1), further comprising a synthesizing unit that synthesizes the first image signal and the second image signal based on a result of the motion determination.
  • (3) The signal processing device according to (1) or (2), further comprising an imaging unit that generates the first image signal and the second image signal by performing imaging.
  • (4) The signal processing device according to (3), wherein the imaging unit includes a plurality of first pixels for obtaining the first image signal, and a plurality of second pixels, whose sensitivity differs from that of the first pixels, for obtaining the second image signal.
  • (5) (4) The signal processing device according to (4), wherein the first pixel is larger than the second pixel.
  • (6) The signal processing device according to any one of (1) to (5), wherein the second filter has the same size as the first filter.
  • (7) A signal processing method in which a signal processing device: generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, having a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
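The claimed processing flow can be illustrated with a minimal 1-D numerical sketch, under assumed filter shapes: the first filter is a symmetric (zero-phase) low-pass, and the second is a low-pass with an even number of taps whose centroid sits half a sample earlier, standing in for the phase difference correction between the two pixel arrays. The taps and the threshold are illustrative assumptions, not the patent's coefficients.

```python
def filt(x, taps, start):
    """Apply FIR `taps` to list x; tap j reads x[i + start + j], with edges clamped."""
    n, s = len(x), sum(taps)
    return [sum(t * x[min(max(i + start + j, 0), n - 1)]
                for j, t in enumerate(taps)) / s
            for i in range(n)]

def lp1(x):
    # first filter: odd-length symmetric low-pass, zero phase shift
    return filt(x, [1, 2, 1], -1)

def lp2(x):
    # second filter: even-length low-pass whose centroid is shifted by half a
    # sample, lining the second signal up with the first (phase correction)
    return filt(x, [1, 3, 3, 1], -2)

def motion_map(sig1, sig2, thresh=8.0):
    """Per-sample motion flags from the two low-frequency luminance signals."""
    return [abs(a - b) > thresh for a, b in zip(lp1(sig1), lp2(sig2))]

print(motion_map([10.0] * 8, [10.0] * 8))                 # static: all False
print(motion_map([10.0] * 8, [10.0] * 4 + [200.0] * 4))   # step: motion flagged
```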

Abstract

The present technology relates to a signal processing device and method and a program that make it possible to suppress the generation of artifacts. The signal processing device comprises: a first filter processing unit that generates a first low frequency luminance signal by filtering a first image signal using a first filter that extracts low frequency components; a second filter processing unit that generates a second low frequency luminance signal having the same phase as the first low frequency luminance signal by filtering a second image signal having a different phase from the first image signal using a second filter that is different from the first filter, extracts low frequency components, and corrects phase differences; and a movement determination unit that determines movement on the basis of the first low frequency luminance signal and the second low frequency luminance signal. The present technology can be applied to a signal processing device.

Description

SIGNAL PROCESSING APPARATUS AND METHOD, AND PROGRAM
 The present technology relates to a signal processing device, method, and program, and more particularly to a signal processing device, method, and program capable of suppressing the occurrence of artifacts.
 Conventionally, an image sensor having a sub-pixel structure, in which a large pixel and a small pixel with different sizes (sensitivities) are provided in one unit pixel, is known (see, for example, Patent Document 1).
 By using an image sensor with such a sub-pixel structure, an image with a wider dynamic range can be obtained by HDR (High Dynamic Range) synthesis of an image captured with the large pixels and an image captured with the small pixels.
 However, if, for example, the exposure time differs between the large pixels and the small pixels, differences arise between the large pixels and the small pixels in the imaging results for moving subjects and for subjects such as LEDs (Light Emitting Diodes). Since such differences cause moving-object artifacts during HDR synthesis, it is common to perform motion determination and apply correction according to the determination result.
 For example, Patent Document 2 discloses detecting the difference between a plurality of images captured with different exposure times and combining the plurality of images with a composition coefficient according to the difference.
Patent Document 1: WO 2016/147885
Patent Document 2: JP 2018-19387 A
 Incidentally, in the motion determination described above, the presence or absence of motion is determined based on the difference between two images with mutually different sensitivities, that is, the magnitude of the difference between the two image signals.
 Such motion determination is performed on the premise that, if the subject is stationary, the image signals (pixel values) will match once the sensitivity difference between the image signals is corrected.
 However, in an image sensor with a sub-pixel structure, the large pixels and the small pixels are arranged at different positions to begin with; that is, the signal obtained from a large pixel and the signal obtained from a small pixel differ in phase, so the above premise does not hold even for a stationary subject.
 Therefore, even if motion determination is performed based on two image signals obtained by an image sensor with a sub-pixel structure and HDR synthesis is performed according to the determination result, erroneous determination and erroneous correction occur in the motion determination, and artifacts arise in the image generated by the HDR synthesis.
 The present technology has been developed in view of such a situation, and is intended to suppress the occurrence of artifacts.
 A signal processing device according to one aspect of the present technology includes: a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, having a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
 A signal processing method or program according to one aspect of the present technology includes the steps of: generating a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; generating a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, having a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and performing motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
 In one aspect of the present technology, a first low-frequency luminance signal is generated by filtering a first image signal with a first filter that extracts low-frequency components; a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal is generated by filtering a second image signal, having a phase different from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and motion determination is performed based on the first low-frequency luminance signal and the second low-frequency luminance signal.
FIG. 1 is a diagram explaining an image sensor with a sub-pixel structure. FIG. 2 is a diagram explaining the phase shift between a large pixel image and a small pixel image. FIG. 3 is a diagram explaining motion determination. FIG. 4 is a diagram showing a configuration example of a signal processing device. FIG. 5 is a diagram showing examples of a large-pixel luminance generation filter and a small-pixel luminance generation filter. FIG. 6 is a flowchart explaining image compositing processing. FIG. 7 is a diagram explaining the effects of the present technology. FIG. 8 is a diagram showing another example of the small-pixel luminance generation filter. FIG. 9 is a diagram showing a configuration example of a computer. FIG. 10 is a block diagram showing an example of a schematic configuration of a vehicle control system. FIG. 11 is an explanatory diagram showing an example of installation positions of a vehicle exterior information detection unit and an imaging unit.
 Embodiments to which the present technology is applied will be described below with reference to the drawings.
<First Embodiment>
<About this technology>
 The present technology suppresses the occurrence of artifacts by using low-frequency luminance signals for motion determination and performing phase difference correction only on the signals used for motion determination.
 In the following, a case where HDR synthesis is performed based on two image signals obtained by an image sensor with a sub-pixel structure will be described as an example. In such a case, the unit pixels of the image sensor are arranged, for example, as shown in FIG. 1.
 For example, in one unit pixel PX1, a large pixel SP1 and a small pixel SP2 with mutually different sensitivities are provided adjacent to each other.
 That is, the large pixel SP1 and the small pixel SP2 are formed so that the area (size) of the light-receiving surface of the large pixel SP1 is larger than that of the small pixel SP2, and the large pixel SP1 has a higher sensitivity than the small pixel SP2. Here, the small pixel SP2 is arranged on the upper right side of the large pixel SP1 in the figure.
 In an image sensor with a sub-pixel structure, a plurality of unit pixels with the same configuration as the unit pixel PX1 are arranged in a matrix.
 Furthermore, in this example, each unit pixel is provided with a color filter of one of R (red), G (green), and B (blue), and unit pixels with R, G, and B color filters are arranged in a checkered pattern.
 That is, in a pixel row including the unit pixel PX1 and composed of a plurality of unit pixels arranged in the horizontal direction in the figure, unit pixels with B color filters and unit pixels with G color filters are arranged alternately in the horizontal direction.
 In the pixel rows vertically adjacent in the figure to the pixel row including the unit pixel PX1, unit pixels with R color filters and unit pixels with G color filters are arranged alternately in the horizontal direction.
 For example, focusing on the unit pixel PX1 with a G color filter, the unit pixels vertically adjacent to the unit pixel PX1 in the figure have R color filters, and the unit pixels horizontally adjacent to the unit pixel PX1 in the figure have B color filters.
 Note that each unit pixel of the image sensor may have any color filter, or a configuration in which no color filter is provided in each unit pixel may be adopted.
 Hereinafter, the image signal of an image captured by the large pixels provided in the image sensor, that is, the image signal composed of the pixel signals read out from the large pixels after the image sensor performs an exposure operation (imaging processing), is also referred to as a large-pixel image signal, and an image based on the large-pixel image signal is also referred to as a large-pixel image.
 Similarly, hereinafter, the image signal of an image captured by the small pixels provided in the image sensor, that is, the image signal composed of the pixel signals read out from the small pixels after the image sensor performs an exposure operation, is also referred to as a small-pixel image signal, and an image based on the small-pixel image signal is also referred to as a small-pixel image.
 Now, consider capturing a large-pixel image and a small-pixel image with an image sensor having unit pixels arranged as shown in FIG. 1, and performing HDR synthesis.
 In this case, assume that motion determination is performed based on the large-pixel image (large-pixel image signal), the small-pixel image (small-pixel image signal), and an HDR synthesis gain, and that HDR synthesis of the large-pixel image and the small-pixel image is performed using a predetermined composition coefficient obtained from the result of the motion determination and the like.
 なお、HDR合成ゲインは、大画素画像と小画素画像の明るさを合わせるための係数であり、例えば大画素画像と小画素画像の感度差や露光時間の差などに基づいて予め算出される。また、合成係数は、HDR合成時における大画素画像と小画素画像の合成比(混合比)を示す係数である。 Note that the HDR synthesis gain is a coefficient for matching the brightness of the large pixel image and the small pixel image, and is calculated in advance based on, for example, the sensitivity difference and the exposure time difference between the large pixel image and the small pixel image. Also, the synthesis coefficient is a coefficient that indicates the synthesis ratio (mixing ratio) of the large pixel image and the small pixel image at the time of HDR synthesis.
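As a rough illustration of the brightness matching just described, the HDR synthesis gain can be sketched as follows. The function name and the sensitivity and exposure values are hypothetical and not taken from the embodiment; the sketch only shows how a gain derived from the sensitivity and exposure-time differences equalizes the two signal levels.

```python
# Hypothetical sketch: the HDR synthesis gain matches small-pixel brightness
# to large-pixel brightness.  The names and values below are illustrative only.
def hdr_synthesis_gain(sensitivity_ratio, exposure_ratio):
    # Ratio of large-pixel response to small-pixel response for the same scene.
    return sensitivity_ratio * exposure_ratio

# Example: large pixels 8x more sensitive and exposed 2x longer than small pixels.
gain = hdr_synthesis_gain(8.0, 2.0)   # 16.0
small_level = 100.0                   # small-pixel signal level
matched_level = small_level * gain    # now comparable to a large-pixel level
print(matched_level)                  # 1600.0
```

With such a gain, a stationary subject should yield (approximately) equal gain-matched small-pixel and large-pixel levels, which is the premise of the motion determination described below.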
For example, when the subject shown in Fig. 2 is captured by the image sensor, the large-pixel image and the small-pixel image are basically the same image.
That is, both the large-pixel image and the small-pixel image show subjects such as a person indoors, the window of the room, and buildings outside the window. In this example, in the region R11, the window portion of an outdoor building appears as a subject.
However, because the large-pixel image and the small-pixel image differ in phase, that is, because the positions of corresponding pixels differ, a shift arises between the two imaging results.
Specifically, for example, let the pixel in the large-pixel image corresponding to the large pixel SP1 in Fig. 1 be the pixel of interest. In that case, the pixel in the small-pixel image corresponding to the pixel of interest, that is, the pixel at the same position in the small-pixel image as the pixel of interest occupies in the large-pixel image (hereinafter also referred to as the corresponding pixel), is the pixel corresponding to the small pixel SP2.
Thus, although the pixel of interest and the corresponding pixel are in the same positional relationship within their respective images, the large pixel SP1 and the small pixel SP2 are placed at different positions on the sensor, as shown in Fig. 1; that is, their placements are offset. Therefore, even if the subject is stationary, the pixel of interest and the corresponding pixel capture different parts of the subject.
In motion determination, the presence or absence of motion is determined based on the magnitude of the difference between the large-pixel image and the small-pixel image, which have different sensitivities.
For example, suppose that a large pixel and a small pixel yield the signal levels (pixel values) shown in Fig. 3 for a given subject brightness.
In Fig. 3, the horizontal axis indicates the brightness of the subject, and the vertical axis indicates the pixel signal value (pixel value), that is, the signal level obtained when the subject is imaged.
The polygonal line L11 indicates the signal level (pixel signal value) obtained by the large pixel SP1 at each brightness, and the straight line L12 indicates the signal level (pixel signal value) obtained by the small pixel SP2 at each brightness.
Now, suppose for the moment that the pixel signal of the large pixel SP1 and the pixel signal of the small pixel SP2 have the same phase, that is, that the center position of the large pixel SP1 coincides with the center position of the small pixel SP2.
Suppose also that, with the subject stationary, imaging yields the signal level P11 as the pixel signal value of the small pixel SP2 and the signal level P12 as the pixel signal value of the large pixel SP1, as indicated by the arrow Q11.
In such a case, the value obtained by multiplying the signal level P11 by the HDR synthesis gain matches the signal level P12.
Therefore, if motion determination is performed by, for example, comparing the difference between the signal level P12 and the result of multiplying the signal level P11 by the HDR synthesis gain against a predetermined threshold, noise level, or the like, a determination result indicating no motion is obtained.
On the other hand, suppose that imaging a moving subject yields the signal level P21 as the pixel signal value of the small pixel SP2 and the signal level P22 as the pixel signal value of the large pixel SP1, as indicated by the arrow Q12.
In such a case, the value obtained by multiplying the signal level P21 by the HDR synthesis gain matches the signal level P23 at the same brightness on the polygonal line L11, but does not match the signal level P22 actually obtained from the large pixel SP1.
That is, even when the signal level P21 is multiplied by the HDR synthesis gain, the resulting value does not match the signal level P22; a difference arises between the multiplication result and the signal level P22.
This is because the subject being imaged is a moving subject: when the exposure time and so on differ between the large-pixel image and the small-pixel image, the brightness of the subject seen by the large pixel SP1 and the small pixel SP2 changes, for example because the part of the subject captured by each pixel changes.
Therefore, if motion determination is performed by, for example, comparing the difference between the signal level P22 and the result of multiplying the signal level P21 by the HDR synthesis gain against a predetermined threshold, noise level, or the like, a determination result indicating motion is obtained.
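The two cases above can be summarized in a small sketch. The function name, threshold, and signal levels are hypothetical; the sketch only illustrates the comparison of the gain-matched difference against a threshold as described above.

```python
# Hypothetical sketch of the motion check described above: compare the
# gain-matched small-pixel level with the large-pixel level.
def has_motion(large_level, small_level, hdr_gain, threshold):
    # A large difference suggests the subject moved between the two exposures.
    diff = abs(small_level * hdr_gain - large_level)
    return diff > threshold

# Stationary subject (arrow Q11): the gain-matched levels agree, so no motion.
print(has_motion(large_level=1600.0, small_level=100.0,
                 hdr_gain=16.0, threshold=50.0))   # False
# Moving subject (arrow Q12): the levels disagree, so motion is detected.
print(has_motion(large_level=900.0, small_level=100.0,
                 hdr_gain=16.0, threshold=50.0))   # True
```

Note that, as explained next, this check only works as intended when the two signals actually have the same phase.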
However, as described above, the image signal of the large-pixel image and the image signal of the small-pixel image, that is, the pixel signals of the large pixels and the pixel signals of the small pixels, differ in phase.
Consequently, the premise of the motion determination, namely that if the subject is stationary, the value obtained by multiplying the pixel signal of a small pixel by the HDR synthesis gain matches the pixel signal of the corresponding large pixel, in other words that correcting for the sensitivity difference makes the small-pixel and large-pixel signals match, does not hold.
Therefore, when motion determination is performed based on the actually obtained large-pixel and small-pixel signals, the phase difference between those pixel signals can make their difference large even for a stationary subject, which may cause erroneous determinations.
Furthermore, if HDR synthesis is performed using the result of such motion determination, erroneous motion correction may occur. Such erroneous determinations and corrections cause artifacts in the image generated by the HDR synthesis (hereinafter also referred to as the synthesized image).
To suppress erroneous determinations and corrections caused by this phase difference, one could, for example, apply phase difference correction to the image signal of the small-pixel image and then perform motion determination and HDR synthesis with the large-pixel and small-pixel signals brought to the same phase.
However, when performing phase difference correction, it is difficult to correct (shift) the phase of a high-frequency signal without destroying the detail of the image, that is, without introducing distortion or the like.
Moreover, applying phase difference correction to an image can cause adverse effects such as artifacts like false colors and jaggedness that reduce the sharpness of the image, and realizing the correction can raise costs, for example by increasing the circuit scale. In particular, when phase difference correction is performed, even with noise reduction and the like, the resulting synthesized image may be less sharp than one obtained by performing motion determination without phase difference correction.
Therefore, in the present technology, a low-frequency luminance signal is used for motion determination, and phase difference correction is applied only to the signals used for motion determination, thereby suppressing the occurrence of erroneous determinations and corrections. As a result, the occurrence of artifacts can be suppressed and a higher-quality synthesized image can be obtained.
In particular, in the present technology, motion determination is performed using low-frequency luminance signals, and the phase of such low-frequency signals can be corrected (shifted) more easily than that of high-frequency signals. Therefore, according to the present technology, erroneous determinations and corrections can be suppressed without incurring cost increases such as a larger circuit scale.
Furthermore, because the present technology uses signals that have not undergone phase difference correction to generate the synthesized image, adverse effects such as loss of image sharpness due to artifacts are unlikely to occur.
<Configuration Example of the Signal Processing Device>
Fig. 4 is a diagram illustrating a configuration example of an embodiment of a signal processing device to which the present technology is applied.
The signal processing device 11 shown in Fig. 4 is, for example, an in-vehicle camera device.
The signal processing device 11 includes an image sensor 21, a signal generation unit 22, a motion determination unit 23, and an HDR synthesis unit 24.
The image sensor 21 is an image sensor having a sub-pixel structure, and functions as an imaging unit that generates the image signal of a large-pixel image and the image signal of a small-pixel image by performing imaging.
For example, the image sensor 21 is provided with a plurality of large pixels and small pixels that differ from each other in sensitivity and phase (placement position), in the arrangement shown in Fig. 1.
The image sensor 21 captures an image of a subject by photoelectrically converting incident light, and supplies the resulting image signals of the large-pixel image and the small-pixel image to the signal generation unit 22 and the HDR synthesis unit 24. That is, the image signal of the large-pixel image is generated by imaging with the plurality of large pixels of the image sensor 21, and the image signal of the small-pixel image is generated by imaging with the plurality of small pixels of the image sensor 21. These two image signals differ from each other in phase.
Although an example in which the image sensor 21 is provided within the signal processing device 11 is described here, the image sensor 21 may instead be provided outside the signal processing device 11.
The signal generation unit 22 generates low-frequency luminance signals for motion determination based on the image signals of the large-pixel image and the small-pixel image supplied from the image sensor 21, and supplies them to the motion determination unit 23.
The signal generation unit 22 includes a large-pixel filter processing unit 31 and a small-pixel filter processing unit 32.
The large-pixel filter processing unit 31 applies filtering with a large-pixel luminance generation filter to the image signal of the large-pixel image, thereby generating a large-pixel low-frequency luminance signal, which is the low-frequency luminance signal of the large-pixel image.
For example, the large-pixel luminance generation filter is a low-pass filter that extracts low-frequency components, more specifically luminance components at or below a predetermined frequency (low-frequency luminance components), from the image signal of the large-pixel image.
The small-pixel filter processing unit 32 applies filtering with a small-pixel luminance generation filter to the image signal of the small-pixel image, thereby generating a small-pixel low-frequency luminance signal, which is the low-frequency luminance signal of the small-pixel image with phase difference correction applied. The small-pixel low-frequency luminance signal obtained in this way has the same phase as the large-pixel low-frequency luminance signal.
Here, the small-pixel luminance generation filter is a different filter from the large-pixel luminance generation filter. For example, the small-pixel luminance generation filter functions as a low-pass filter that extracts low-frequency components (low-frequency luminance components) from the image signal of the small-pixel image, and at the same time realizes a phase difference correction that brings the phase of the small-pixel image to the same phase as the large-pixel image.
The signal generation unit 22 supplies the large-pixel low-frequency luminance signal obtained by the large-pixel filter processing unit 31 and the small-pixel low-frequency luminance signal obtained by the small-pixel filter processing unit 32 to the motion determination unit 23 as the low-frequency luminance signals for motion determination.
The motion determination unit 23 performs motion determination based on the large-pixel low-frequency luminance signal and the small-pixel low-frequency luminance signal supplied from the signal generation unit 22, and supplies the determination result to the HDR synthesis unit 24.
The HDR synthesis unit 24 performs HDR synthesis of the image signal of the large-pixel image and the image signal of the small-pixel image supplied from the image sensor 21, based on the motion determination result supplied from the motion determination unit 23, and outputs the resulting synthesized image to the subsequent stage.
<Generation of the Low-Frequency Luminance Signals>
Here, examples of the large-pixel luminance generation filter and the small-pixel luminance generation filter will be described.
For example, when the pixel arrangement of the image sensor 21 is the arrangement shown in Fig. 1, the large-pixel luminance generation filter and the small-pixel luminance generation filter are the filters shown in Fig. 5.
In Fig. 5, the portion indicated by the arrow Q21 shows an example of the large-pixel luminance generation filter, and the portion indicated by the arrow Q22 shows an example of the small-pixel luminance generation filter.
Here, the large-pixel luminance generation filter is a filter of size 3×3, and the numerical value written in each square represents the filter coefficient by which the pixel value of the pixel corresponding to that square is multiplied. Note that the size of a filter here means the number of taps of the filter, that is, the number of pixels (size) of the pixel region subject to filtering.
As a specific example, consider applying the large-pixel luminance generation filter shown in Fig. 5 to the 3×3 pixel region centered on the large pixel SP1 shown in Fig. 1.
In this case, the pixel signal values (pixel values) of the large pixels adjacent to the upper left, directly above, and upper right of the large pixel SP1 in Fig. 1 are multiplied by the coefficient values 1, 2, and 1, respectively.
The pixel signal values (pixel values) of the large pixel adjacent to the left of the large pixel SP1 in Fig. 1, the large pixel SP1 itself, and the large pixel adjacent to its right are multiplied by the coefficient values 2, 4, and 2, respectively.
Similarly, the pixel signal values (pixel values) of the large pixels adjacent to the lower left, directly below, and lower right of the large pixel SP1 in Fig. 1 are multiplied by the coefficient values 1, 2, and 1, respectively.
Then, the sum of the nine coefficient-multiplied pixel values is computed and divided by 16, which is the sum of the nine coefficients; the division result is the pixel signal value of one pixel obtained by the filtering, that is, one pixel's worth of the large-pixel low-frequency luminance signal.
In this case, the phase of the newly generated large-pixel low-frequency luminance signal, that is, its pixel position, is the center position of the large pixel SP1.
Note that when the large-pixel luminance generation filter is applied, the coefficients indicated by the arrow Q21 are used regardless of the color of the color filter of the large pixel at the center of the target 3×3 pixel region. In other words, pixel signals of large pixels having R, G, and B color filters are all used to generate one (one pixel's worth of the) large-pixel low-frequency luminance signal.
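The filtering steps just described can be sketched as a plain weighted average. The coefficient values follow the description of the filter indicated by the arrow Q21; the function name and the sample data are illustrative assumptions.

```python
# Sketch of the large-pixel luminance generation filter (coefficients as
# described for arrow Q21 in Fig. 5); helper name and data are illustrative.
LARGE_PIXEL_FILTER = [[1, 2, 1],
                      [2, 4, 2],
                      [1, 2, 1]]   # coefficient sum: 16

def large_pixel_low_freq_luma(region):
    """Apply the 3x3 filter to a 3x3 region of large-pixel values,
    regardless of the color filters of those pixels."""
    total = sum(LARGE_PIXEL_FILTER[i][j] * region[i][j]
                for i in range(3) for j in range(3))
    return total / 16.0  # divide by the sum of the coefficients

# A uniform region passes through unchanged (unity DC gain).
print(large_pixel_low_freq_luma([[10, 10, 10]] * 3))  # 10.0
```

The output sample sits at the phase (position) of the center pixel of the region, which is why the large-pixel signal needs no phase shift.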
Meanwhile, as indicated by the arrow Q22, the small-pixel luminance generation filter is also a filter of size 3×3, like the large-pixel luminance generation filter. Again, the numerical value written in each square represents the filter coefficient by which the pixel value of the pixel corresponding to that square is multiplied.
Although the small-pixel luminance generation filter is a 3×3 filter, the coefficients for the pixels in the uppermost pixel row and the rightmost pixel column of the target pixel region in the figure are all 0. The small-pixel luminance generation filter is therefore effectively a filter of size 2×2.
As a specific example, consider applying the small-pixel luminance generation filter shown in Fig. 5 to the 3×3 pixel region centered on the small pixel SP2 shown in Fig. 1.
In this case, the pixel signal values (pixel values) of the small pixel adjacent to the left of the small pixel SP2 in Fig. 1 and of the small pixel SP2 itself are each multiplied by the coefficient value 1.
Likewise, the pixel signal values (pixel values) of the small pixels located at the lower left of and directly below the small pixel SP2 in Fig. 1 are each multiplied by the coefficient value 1.
Then, the sum of the four coefficient-multiplied pixel values is computed and divided by 4, which is the sum of the four coefficients; the division result is the pixel signal value of one pixel obtained by the filtering, that is, one pixel's worth of the small-pixel low-frequency luminance signal.
In this case, the phase of the newly generated small-pixel low-frequency luminance signal, that is, its pixel position, is the position of the center of the four small pixels used to generate it, and that position is the center position of the large pixel SP1. The generated small-pixel low-frequency luminance signal therefore has the same phase as the corresponding large-pixel low-frequency luminance signal, which shows that the phase difference correction is realized by the small-pixel luminance generation filter.
Note that when the small-pixel luminance generation filter is applied, as with the large-pixel luminance generation filter, the coefficients indicated by the arrow Q22 are used regardless of the color of the color filter of the small pixel at the center of the target 3×3 pixel region.
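In the same style, the small-pixel filter can be sketched as follows; averaging the effective 2×2 block is what shifts the output phase onto the large-pixel grid. The coefficients follow the description of the filter indicated by the arrow Q22, while the function name and sample data are illustrative assumptions.

```python
# Sketch of the small-pixel luminance generation filter (arrow Q22 in Fig. 5):
# the top row and right column are zero, leaving an effective 2x2 average.
SMALL_PIXEL_FILTER = [[0, 0, 0],
                      [1, 1, 0],
                      [1, 1, 0]]   # coefficient sum: 4

def small_pixel_low_freq_luma(region):
    """Apply the filter to a 3x3 region centered on a small pixel.  Averaging
    the 2x2 block moves the output to the centroid of those four small pixels,
    which coincides with the center of the corresponding large pixel."""
    total = sum(SMALL_PIXEL_FILTER[i][j] * region[i][j]
                for i in range(3) for j in range(3))
    return total / 4.0  # divide by the sum of the coefficients

region = [[9, 9, 9],
          [4, 8, 9],   # center value 8 stands in for the small pixel SP2
          [2, 6, 9]]
print(small_pixel_low_freq_luma(region))  # 5.0, i.e. (4 + 8 + 2 + 6) / 4
```

Only the center-left, center, lower-left, and lower values contribute, matching the four small pixels named in the description above.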
As described above, the large-pixel luminance generation filter and the small-pixel luminance generation filter are filters that differ at least in their coefficients.
<Description of the Image Synthesis Processing>
Next, the operation of the signal processing device 11 will be described. That is, the image synthesis processing performed by the signal processing device 11 will be described below with reference to the flowchart of Fig. 6.
In step S11, the large pixels and small pixels of the image sensor 21 receive incident light and photoelectrically convert it, thereby capturing a large-pixel image and a small-pixel image.
For example, imaging is performed such that the large pixels and small pixels differ in the length of the exposure time or in the timing at which exposure starts or ends.
The image sensor 21 supplies the image signal of the large-pixel image obtained by imaging with the large pixels and the image signal of the small-pixel image obtained by imaging with the small pixels to the signal generation unit 22 and the HDR synthesis unit 24.
In step S12, the large-pixel filter processing unit 31 applies filtering with the large-pixel luminance generation filter to the image signal of the large-pixel image supplied from the image sensor 21, thereby generating the large-pixel low-frequency luminance signal at each of a plurality of pixel positions.
In step S13, the small-pixel filter processing unit 32 applies filtering with the small-pixel luminance generation filter to the image signal of the small-pixel image supplied from the image sensor 21, thereby generating the small-pixel low-frequency luminance signal at each of a plurality of pixel positions.
Because the small-pixel filter processing unit 32 extracts a low-frequency luminance signal from the image signal and simultaneously performs phase difference correction (phase correction), the resulting small-pixel low-frequency luminance signal has the same phase as the large-pixel low-frequency luminance signal.
The signal generation unit 22 supplies the large-pixel low-frequency luminance signal and the small-pixel low-frequency luminance signal obtained by the above processing to the motion determination unit 23.
Note that although the processing of steps S12 and S13 is performed here before demosaic processing is applied to the large-pixel image and the small-pixel image, it may instead be performed after the demosaic processing.
In step S14, the motion determination unit 23 performs motion determination based on the large-pixel low-frequency luminance signal and the small-pixel low-frequency luminance signal supplied from the signal generation unit 22, and supplies the determination result to the HDR synthesis unit 24.
For example, the motion determination unit 23 calculates, for each pixel position, the difference between the large-pixel low-frequency luminance signal and the value obtained by multiplying the small-pixel low-frequency luminance signal by the HDR synthesis gain, and obtains the motion amount of the subject by comparing the obtained difference with a noise level and a predetermined motion determination threshold. As a result of the motion determination, the motion amount of the subject is obtained for each pixel position.
In step S15, the HDR synthesis unit 24 generates the image signal of the synthesized image by performing HDR synthesis of the image signal of the large-pixel image and the image signal of the small-pixel image supplied from the image sensor 21, based on the motion determination result supplied from the motion determination unit 23.
For example, the HDR synthesis unit 24 generates a synthesis coefficient for each pixel position based on the image signal of the large-pixel image, adjusts the synthesis coefficient by performing motion compensation according to the motion determination result, and thereby obtains the final synthesis coefficient. The HDR synthesis unit 24 then obtains the image signal of the synthesized image by synthesizing (mixing), according to the synthesis coefficient, the value obtained by multiplying the image signal of the small-pixel image by the HDR synthesis gain with the image signal of the large-pixel image.
When the image signal of the synthesized image obtained in this way is output to the subsequent stage by the HDR synthesis unit 24, the image synthesis processing ends.
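The mixing in step S15 can be sketched per pixel as follows. Using a single scalar coefficient and the particular values shown is a hypothetical simplification of the per-pixel-position synthesis coefficient described above.

```python
# Hypothetical per-pixel sketch of the HDR synthesis in step S15: mix the
# large-pixel value with the gain-matched small-pixel value by a coefficient.
def hdr_blend(large_level, small_level, hdr_gain, coeff):
    # coeff is the synthesis (mixing) coefficient: 1.0 keeps only the
    # large-pixel value, 0.0 keeps only the gain-matched small-pixel value.
    return coeff * large_level + (1.0 - coeff) * (small_level * hdr_gain)

# In a bright region the large pixel nears saturation, so the coefficient
# leans toward the gain-matched small-pixel value.
print(hdr_blend(large_level=4095.0, small_level=300.0,
                hdr_gain=16.0, coeff=0.25))   # 4623.75
```

Motion compensation in the embodiment adjusts this coefficient per pixel position based on the motion determination result, for example to avoid mixing signals from a moving subject.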
As described above, the signal processing device 11 generates the large-pixel and small-pixel low-frequency luminance signals with the same phase, performs motion determination, and synthesizes the large-pixel image and the small-pixel image to generate a synthesized image. By performing motion determination using low-frequency luminance signals of the same phase in this way, the occurrence of erroneous determinations and corrections can be suppressed. As a result, the occurrence of artifacts can be suppressed and a higher-quality synthesized image can be obtained.
For example, by performing phase difference correction, the motion determination unit 23 can improve the accuracy of motion determination, as shown in Fig. 7. In Fig. 7, brighter parts of the subject (image) indicate larger motion amounts obtained in the motion determination.
In FIG. 7, the portion indicated by arrow Q31 shows the determination result obtained when low-frequency luminance signals are extracted from the large-pixel image and the small-pixel image of the subject shown in FIG. 2 without performing phase difference correction, and motion determination is performed on them.
For example, the windows of a building appear as a subject in region R11. In the determination result indicated by arrow Q31, erroneous determinations occur particularly in finely detailed areas such as those windows, that is, in areas containing high-frequency components.
In contrast, the portion indicated by arrow Q32 in FIG. 7 shows the determination result obtained when the signal generation unit 22 extracts the low-frequency luminance signals from the large-pixel image and the small-pixel image of the subject shown in FIG. 2 and performs phase difference correction, after which the motion determination unit 23 performs motion determination. That is, the portion indicated by arrow Q32 shows the motion determination result obtained through the processing of the signal generation unit 22 and the motion determination unit 23 of the signal processing device 11.
In this example, even in areas containing high-frequency components, such as region R11, erroneous determinations are suppressed compared with the example indicated by arrow Q31.
As described above, the present technology performs phase difference correction and performs motion determination using low-frequency luminance signals, and can thereby suppress erroneous determinations in motion determination, as shown in FIG. 7, without incurring cost increases such as a larger circuit scale.
Moreover, by using the determination result obtained in this way to perform HDR synthesis directly on the large-pixel image and the small-pixel image obtained by the image sensor 21, rather than on the low-frequency luminance signals, adverse effects such as loss of image sharpness due to artifacts like false colors and jaggedness are also made less likely to occur. As a result, a composite image of higher quality and wider dynamic range can be obtained.
<Modification>
In FIG. 5, the large-pixel luminance generation filter and the small-pixel luminance generation filter are described as 3×3 filters, but these filters may be of any size, such as 5×5.
Also, the large-pixel luminance generation filter and the small-pixel luminance generation filter may be filters of the same size or filters of mutually different sizes.
Furthermore, the small-pixel luminance generation filter may be, for example, the filter shown in FIG. 8.
In FIG. 8, each square represents a pixel in the pixel region to which the small-pixel luminance generation filter is applied, and the number in each square represents the filter coefficient by which the pixel value of the corresponding pixel is multiplied.
In this example, the small-pixel luminance generation filter is a 3×3 filter. The pixel value of the pixel at the center of the target pixel region, and the pixel values of the pixels adjacent to it on the left, lower left, and below in the figure, are multiplied by a coefficient of 63.
The pixel values of the pixels adjacent to the center pixel on the upper left, above, right, and lower right in the figure are multiplied by a coefficient of 1, and the pixel value of the pixel adjacent to the center pixel on the upper right is multiplied by a coefficient of 0.
The sum of the pixel values multiplied by these coefficients is then divided by 256, and the result of the division is taken as the value of one pixel of the filtered signal, that is, one pixel's worth of the small-pixel low-frequency luminance signal. This approach also realizes extraction of the low-frequency luminance signal and phase difference correction.
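With the layout just described, the nine coefficients sum to 63×4 + 1×4 + 0 = 256, so dividing by 256 keeps unit gain, while concentrating the weight on the lower-left 2×2 block shifts the effective sampling position toward the lower left. A minimal sketch of one filtering step (pure Python for clarity):

```python
# 3x3 small-pixel luminance generation filter of FIG. 8,
# rows top to bottom, columns left to right.
KERNEL = [[ 1,  1, 0],   # upper-left, upper,  upper-right
          [63, 63, 1],   # left,       center, right
          [63, 63, 1]]   # lower-left, lower,  lower-right

def filter_small_pixel(patch):
    """One output sample of the small-pixel low-frequency luminance
    signal from a 3x3 patch of pixel values (patch[row][col])."""
    acc = 0
    for ky in range(3):
        for kx in range(3):
            acc += KERNEL[ky][kx] * patch[ky][kx]
    return acc / 256.0
```

A flat patch passes through unchanged, since the weights sum to 256; this is what makes the filter a unity-gain low-pass kernel despite its asymmetric shape.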
The above description covers an example of performing motion determination and HDR synthesis on the image signals of a large-pixel image and a small-pixel image captured by a single image sensor 21 having a sub-pixel structure. However, the two images subjected to motion determination and HDR synthesis are not limited to this, and may be any two images of the same subject that differ in phase from each other.
For example, two images with mutually different phases may be captured by two cameras, and motion determination and HDR synthesis may be performed on those two images to generate a composite image.
<Computer configuration example>
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the programs constituting the software are installed in a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed in it.
FIG. 9 is a block diagram showing a hardware configuration example of a computer that executes the series of processes described above with a program.
In the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
An input/output interface 505 is further connected to the bus 504. An input unit 506, an output unit 507, a recording unit 508, a communication unit 509, and a drive 510 are connected to the input/output interface 505.
The input unit 506 includes a keyboard, a mouse, a microphone, an image sensor, and the like. The output unit 507 includes a display, a speaker, and the like. The recording unit 508 includes a hard disk, a nonvolatile memory, or the like. The communication unit 509 includes a network interface or the like. The drive 510 drives a removable recording medium 511 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.
In the computer configured as described above, the CPU 501 loads, for example, a program recorded in the recording unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes it, whereby the series of processes described above is performed.
The program executed by the computer (CPU 501) can be provided by being recorded on a removable recording medium 511 such as packaged media. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the computer, the program can be installed in the recording unit 508 via the input/output interface 505 by mounting the removable recording medium 511 in the drive 510. The program can also be received by the communication unit 509 via a wired or wireless transmission medium and installed in the recording unit 508. Alternatively, the program can be installed in advance in the ROM 502 or the recording unit 508.
The program executed by the computer may be a program whose processes are performed in time series in the order described in this specification, or a program whose processes are performed in parallel or at necessary timings, such as when a call is made.
The signal processing device 11 described above can be used in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays, for example, as follows.
・Devices that capture images for viewing, such as digital cameras and mobile devices with camera functions
・Devices used for transportation, such as in-vehicle sensors that image the front, rear, surroundings, and interior of an automobile for safe driving including automatic stopping and for recognition of the driver's state, surveillance cameras that monitor traveling vehicles and roads, and ranging sensors that measure distances between vehicles
・Devices used in home appliances such as TVs, refrigerators, and air conditioners, in order to image a user's gesture and operate the appliance according to the gesture
・Devices used for medical care and healthcare, such as endoscopes and devices that perform angiography by receiving infrared light
・Devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication
・Devices used for beauty care, such as skin measuring instruments that photograph the skin and microscopes that image the scalp
・Devices used for sports, such as action cameras and wearable cameras for sports applications
・Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
<Example of application to a moving object>
The technology according to the present disclosure (the present technology) can thus be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving object, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
FIG. 10 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a moving object control system to which the technology according to the present disclosure can be applied.
The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 10, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generation device that generates the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism that transmits the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key, or signals from various switches. The body system control unit 12020 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, and the like.
The vehicle exterior information detection unit 12030 detects information on the outside of the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light. The imaging unit 12031 can output the electrical signal as an image or as ranging information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
The vehicle interior information detection unit 12040 detects information on the inside of the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing off.
The microcomputer 12051 can calculate a control target value for the driving force generation device, the steering mechanism, or the braking device based on the information on the inside and outside of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, vehicle speed maintenance, vehicle collision warning, lane departure warning, and the like.
The microcomputer 12051 can also perform cooperative control aimed at automated driving and the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the information on the vehicle's surroundings acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
The microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information on the outside of the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
The audio/image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying information to the occupants of the vehicle or to the outside of the vehicle. In the example of FIG. 10, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
FIG. 11 is a diagram showing an example of the installation positions of the imaging unit 12031.
In FIG. 11, the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 12100. The imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100. The imaging unit 12104 provided at the rear bumper or the back door mainly acquires images behind the vehicle 12100. The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
FIG. 11 also shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided at the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided at the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided at the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above is obtained.
At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the closest three-dimensional object that is on the traveling path of the vehicle 12100 and that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle, and can perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, cooperative control aimed at automated driving and the like, in which the vehicle travels autonomously without depending on the driver's operation, can be performed.
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can perform driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular outline for emphasis on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031, the vehicle exterior information detection unit 12030, and the like. Specifically, for example, the signal processing device 11 shown in FIG. 4 can be used as the imaging unit 12031 and the vehicle exterior information detection unit 12030, which suppresses erroneous determinations during motion determination and makes it possible to obtain a composite image of higher quality and wider dynamic range.
The embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
For example, the present technology can take a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
Each step described in the above flowcharts can be executed by one device or shared and executed by a plurality of devices.
Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared and executed by a plurality of devices.
Furthermore, the present technology can also be configured as follows.
(1)
A signal processing device comprising:
a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components;
a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which differs in phase from the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also performs phase difference correction; and
a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
(2)
The signal processing device according to (1), further comprising a synthesizing unit that synthesizes the first image signal and the second image signal based on a result of the motion determination.
(3)
The signal processing device according to (1) or (2), further comprising an imaging unit that generates the first image signal and the second image signal by performing imaging.
(4)
The signal processing device according to (3), wherein the imaging unit has a plurality of first pixels for obtaining the first image signal and a plurality of second pixels for obtaining the second image signal, the second pixels differing in sensitivity from the first pixels.
(5)
The signal processing device according to (4), wherein the first pixels are larger than the second pixels.
(6)
The signal processing device according to any one of (1) to (5), wherein the second filter is a filter of the same size as the first filter.
(7)
The signal processing device according to any one of (1) to (5), wherein the second filter is a filter of a size different from that of the first filter.
(8)
A signal processing method in which a signal processing device:
generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components;
generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which differs in phase from the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also performs phase difference correction; and
performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
(9)
A program that causes a computer to execute processing comprising the steps of:
generating a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components;
generating a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which differs in phase from the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also performs phase difference correction; and
performing motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
A program that causes a computer to execute processing including a step of motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
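Read as image processing, aspects (1) and (2) describe the following idea: low-pass filter two spatially offset (phase-shifted) image signals, using an asymmetric tap set in the second filter so that its output is realigned with the first, then compare the two phase-aligned low-frequency luminance signals to detect motion. A minimal NumPy sketch of that idea follows; the tap values, exposure ratio, and threshold are illustrative assumptions, not values taken from the publication:

```python
import numpy as np

def low_freq_luma(img, taps):
    """Horizontal low-pass filtering with the given taps (edge-padded).

    A symmetric tap set such as [1, 2, 1] preserves the sampling phase;
    an even-length, asymmetric set such as [1, 3, 3, 1] shifts the output
    by half a pixel, which is how a filter can both extract low-frequency
    components and correct a phase (sampling-position) difference.
    """
    taps = np.asarray(taps, dtype=float)
    taps = taps / taps.sum()                      # normalize to unit gain
    pad = len(taps) // 2
    padded = np.pad(img, ((0, 0), (pad, pad)), mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for k, t in enumerate(taps):                  # weighted sum of shifted copies
        out += t * padded[:, k:k + img.shape[1]]
    return out

def motion_map(y1, y2, exposure_ratio=1.0, thresh=8.0):
    # Flag pixels where the two phase-aligned low-frequency luminance
    # signals still disagree after gain matching: a static scene should
    # give near-identical low-frequency signals, so a large residual
    # difference is attributed to subject motion.
    return np.abs(y1 - y2 * exposure_ratio) > thresh
```

With a linear ramp image and a copy offset by half a pixel, the symmetric [1, 2, 1] filter on the first and the asymmetric [1, 3, 3, 1] filter on the second produce matching interior outputs, so no motion is flagged.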
 11 signal processing device, 21 image sensor, 22 signal generation unit, 23 motion determination unit, 24 HDR combining unit, 31 large-pixel filter processing unit, 32 small-pixel filter processing unit
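For illustration only, the combining step performed by the HDR combining unit (24) could, per claim 2, blend the large-pixel and small-pixel signals on the basis of the motion determination result. The exposure ratio, saturation level, and blend rule below are assumptions for the sketch, not details from the publication:

```python
import numpy as np

def hdr_combine(large_px, small_px, motion, exposure_ratio=16.0, sat_level=1000.0):
    # Gain-match the low-sensitivity (small) pixel signal to the
    # high-sensitivity (large) pixel signal.
    small_scaled = small_px * exposure_ratio
    # Where the large pixel saturates, or where motion was flagged
    # (to avoid ghosting from mismatched exposures), fall back to the
    # small-pixel signal alone; elsewhere average the matched signals.
    use_small = motion | (large_px >= sat_level)
    return np.where(use_small, small_scaled, 0.5 * (large_px + small_scaled))
```

The per-pixel fallback is the point of the motion determination: blending two exposures of a moving subject produces ghost edges, so flagged pixels take a single source instead.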

Claims (9)

  1.  A signal processing device comprising:
      a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components;
      a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which differs in phase from the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also corrects the phase difference; and
      a motion determination unit that performs motion determination on the basis of the first low-frequency luminance signal and the second low-frequency luminance signal.
  2.  The signal processing device according to claim 1, further comprising a combining unit that combines the first image signal and the second image signal on the basis of a result of the motion determination.
  3.  The signal processing device according to claim 1, further comprising an imaging unit that generates the first image signal and the second image signal by performing imaging.
  4.  The signal processing device according to claim 3, wherein the imaging unit has a plurality of first pixels for obtaining the first image signal and a plurality of second pixels, differing in sensitivity from the first pixels, for obtaining the second image signal.
  5.  The signal processing device according to claim 4, wherein the first pixels are larger than the second pixels.
  6.  The signal processing device according to claim 1, wherein the second filter is a filter of the same size as the first filter.
  7.  The signal processing device according to claim 1, wherein the second filter is a filter of a size different from that of the first filter.
  8.  A signal processing method in which a signal processing device:
      generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components;
      generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which differs in phase from the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also corrects the phase difference; and
      performs motion determination on the basis of the first low-frequency luminance signal and the second low-frequency luminance signal.
  9.  A program that causes a computer to execute processing including the steps of:
      generating a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components;
      generating a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, which differs in phase from the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also corrects the phase difference; and
      performing motion determination on the basis of the first low-frequency luminance signal and the second low-frequency luminance signal.
PCT/JP2022/002779 2021-04-15 2022-01-26 Signal processing device and method, and program WO2022219874A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021068990A JP2022163882A (en) 2021-04-15 2021-04-15 Signal processing device and method, and program
JP2021-068990 2021-04-15

Publications (1)

Publication Number Publication Date
WO2022219874A1

Family

ID=83639531

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/002779 WO2022219874A1 (en) 2021-04-15 2022-01-26 Signal processing device and method, and program

Country Status (2)

Country Link
JP (1) JP2022163882A (en)
WO (1) WO2022219874A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07107368A (en) * 1993-09-29 1995-04-21 Canon Inc Image processor
JP2010183127A (en) * 2009-02-03 2010-08-19 Sony Corp Apparatus and method of processing image, and imaging apparatus
JP2014039170A (en) * 2012-08-16 2014-02-27 Sony Corp Image processing device, image processing method, and program


Also Published As

Publication number Publication date
JP2022163882A (en) 2022-10-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22787802
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 22787802
    Country of ref document: EP
    Kind code of ref document: A1