WO2022219874A1 - Dispositif et procédé de traitement de signaux, et programme - Google Patents

Dispositif et procédé de traitement de signaux, et programme

Info

Publication number
WO2022219874A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
signal
image
low
filter
Prior art date
Application number
PCT/JP2022/002779
Other languages
English (en)
Japanese (ja)
Inventor
駿 阿久津
Original Assignee
ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Publication of WO2022219874A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems

Definitions

  • The present technology relates to a signal processing device, a signal processing method, and a program, and more particularly to a signal processing device, method, and program capable of suppressing the occurrence of artifacts.
  • An image sensor having a sub-pixel structure, in which large pixels and small pixels of different sizes (sensitivities) are provided in one unit pixel, is known (see, for example, Patent Document 1).
  • With such a sensor, an image with a wider dynamic range can be obtained by HDR (High Dynamic Range) synthesis of an image captured with the large pixels and an image captured with the small pixels.
  • When the exposure time is set to differ between the large pixels and the small pixels, the imaging results for moving subjects and for subject areas such as LEDs (Light Emitting Diodes) differ between the large pixels and the small pixels. Since such differences cause moving-object artifacts during HDR synthesis, it is common to perform motion determination and apply correction according to the determination result.
  • For example, Patent Document 2 discloses detecting the difference between a plurality of images captured with different exposure times and combining the plurality of images with a composition coefficient according to the difference.
  • In such motion determination, the presence or absence of motion is determined based on the difference between two images having different sensitivities, that is, the magnitude of the difference between the two image signals.
  • Such motion determination is performed on the premise that, if the subject is stationary, the image signals (pixel values) will match once the sensitivity difference between them is corrected.
  • The present technology has been developed in view of this situation, and is intended to suppress the occurrence of artifacts.
  • A signal processing device according to one aspect of the present technology includes: a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, whose phase differs from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also corrects the phase difference; and a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • A signal processing method or program according to one aspect of the present technology includes the steps of: generating a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; generating a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, whose phase differs from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also corrects the phase difference; and performing motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • In one aspect of the present technology, a first low-frequency luminance signal is generated by filtering a first image signal with a first filter that extracts low-frequency components; a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal is generated by filtering a second image signal, whose phase differs from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and also corrects the phase difference; and motion determination is performed based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • FIG. 5 is a diagram showing an example of a large-pixel luminance generation filter and a small-pixel luminance generation filter.
  • FIG. 6 is a flowchart explaining image compositing processing.
  • FIG. 7 is a diagram explaining the effect of the present technology.
  • FIG. 8 is a diagram showing another example of a small-pixel luminance generation filter.
  • FIG. 9 is a diagram showing a configuration example of a computer.
  • FIG. 10 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 11 is an explanatory diagram showing an example of installation positions of a vehicle-exterior information detection unit and an imaging unit.
  • The present technology suppresses the occurrence of artifacts by using low-frequency luminance signals for motion determination and performing phase difference correction only on the signals used for the motion determination.
  • For example, the image sensor is provided with unit pixels arranged as shown in FIG. 1.
  • In a unit pixel PX1, a large pixel SP1 and a small pixel SP2 having different sensitivities are provided adjacent to each other.
  • The large pixel SP1 and the small pixel SP2 are formed so that the area (size) of the light-receiving surface of the large pixel SP1 is larger than that of the small pixel SP2, and accordingly the large pixel SP1 has a higher sensitivity than the small pixel SP2. Here, the small pixel SP2 is arranged at the upper right of the large pixel SP1 in the figure.
  • In the image sensor, multiple unit pixels having the same configuration as the unit pixel PX1 are arranged in a matrix.
  • Each unit pixel is provided with a color filter of one of R (red), G (green), and B (blue), and the unit pixels are arranged in a checkered pattern (rendered concretely in the sketch below).
  • That is, in one pixel row, unit pixels having a B color filter and unit pixels having a G color filter are arranged alternately in the horizontal direction in the drawing, while in the adjacent pixel row, unit pixels having an R color filter and unit pixels having a G color filter are arranged alternately in the horizontal direction.
  • For example, the unit pixels adjacent to the unit pixel PX1 in the vertical direction in the figure have an R color filter, and the unit pixels adjacent to the unit pixel PX1 in the horizontal direction have a B color filter.
  • Note that each unit pixel of the image sensor may have a color filter of any color, and a configuration in which no color filter is provided is also possible.
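  • The sketch below builds the color-filter layout just described as a small array. The orientation follows the text (R above and below PX1, B to its left and right); the inference that PX1 itself carries a G filter is our assumption, made so that the checkered pattern closes consistently.

```python
import numpy as np

def unit_pixel_cfa(rows: int, cols: int) -> np.ndarray:
    """Color-filter letters for a block of unit pixels.

    Even rows alternate G/B, odd rows alternate R/G, giving the checkered
    pattern described in the text (PX1 assumed at an even row and column).
    """
    cfa = np.empty((rows, cols), dtype="<U1")
    for r in range(rows):
        for c in range(cols):
            if r % 2 == 0:
                cfa[r, c] = "G" if c % 2 == 0 else "B"
            else:
                cfa[r, c] = "R" if c % 2 == 0 else "G"
    return cfa

print(unit_pixel_cfa(4, 4))
# [['G' 'B' 'G' 'B']
#  ['R' 'G' 'R' 'G']
#  ['G' 'B' 'G' 'B']
#  ['R' 'G' 'R' 'G']]
```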
  • Hereinafter, an image signal of an image captured by the large pixels of the image sensor, that is, an image signal composed of the pixel signals read out from each large pixel after an exposure operation (imaging processing) is performed on the image sensor, is referred to as a large pixel image signal.
  • An image based on a large pixel image signal is also referred to as a large pixel image.
  • Similarly, an image signal of an image captured by the small pixels of the image sensor, that is, an image signal composed of the pixel signals read out from each small pixel after an exposure operation is performed, is referred to as a small pixel image signal.
  • An image based on a small pixel image signal is also called a small pixel image.
  • Assume that motion determination is performed based on the large pixel image (large pixel image signal), the small pixel image (small pixel image signal), and an HDR synthesis gain, and that the large pixel image and the small pixel image are HDR-synthesized using a synthesis coefficient obtained from the result of the motion determination, as sketched below.
  • Here, the HDR synthesis gain is a coefficient for matching the brightness of the large pixel image and the small pixel image, and is calculated in advance based on, for example, the sensitivity difference and the exposure time difference between the large pixel image and the small pixel image.
  • The synthesis coefficient is a coefficient that indicates the synthesis ratio (mixing ratio) of the large pixel image and the small pixel image at the time of HDR synthesis.
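  • As a rough illustration of these two coefficients, the following sketch applies the HDR synthesis gain to the small pixel image and mixes the two images with a per-pixel synthesis coefficient. The function and variable names are ours, and the linear blend is only the generic form such synthesis commonly takes, not the exact formula of this publication.

```python
import numpy as np

def hdr_blend(large: np.ndarray, small: np.ndarray,
              hdr_gain: float, alpha: np.ndarray) -> np.ndarray:
    """Toy HDR synthesis.

    alpha is the per-pixel synthesis coefficient: 1.0 takes the large
    pixel image, 0.0 takes the gain-matched small pixel image.
    """
    small_matched = small * hdr_gain  # brightness matching via HDR gain
    return alpha * large + (1.0 - alpha) * small_matched
```

  • In such schemes, alpha is typically close to 1 where the large pixels are well exposed and falls toward 0 where they approach saturation, so the wider range of the small pixels takes over.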
  • If the subject is stationary, the large pixel image and the small pixel image are basically the same image.
  • Suppose now that the pixel corresponding to the large pixel SP1 in FIG. 1 in the large pixel image is the pixel of interest.
  • Then the pixel corresponding to the pixel of interest, that is, the pixel on the small pixel image that has the same positional relationship as the pixel of interest in the large pixel image (hereinafter also referred to as the corresponding pixel), is the pixel corresponding to the small pixel SP2.
  • Although the pixel of interest and the corresponding pixel have the same positional relationship within their respective images, the large pixel SP1 and the small pixel SP2 have different arrangement positions, as shown in FIG. 1; that is, there is a deviation in arrangement position. Therefore, even if the subject is stationary, different parts of the subject are captured at the pixel of interest and the corresponding pixel.
  • In general motion determination, the presence or absence of motion is determined based on the magnitude of the difference between the large pixel image and the small pixel image, which have different sensitivities.
  • For example, when a subject is imaged with the large pixels and the small pixels, the signal levels (pixel values) shown in FIG. 3 are obtained with respect to the brightness of the subject.
  • In FIG. 3, the horizontal axis indicates the brightness of the subject, and the vertical axis indicates the pixel signal value (pixel value), that is, the signal level obtained when the subject is imaged.
  • The polygonal line L11 indicates the signal level (pixel signal value) obtained at the large pixel SP1 for each brightness, and the straight line L12 indicates the signal level (pixel signal value) obtained at the small pixel SP2 for each brightness.
  • Suppose first that the center position of the large pixel SP1 and the center position of the small pixel SP2 are at the same position and the subject is stationary.
  • In that case, a signal level P11 is obtained as the pixel signal value of the small pixel SP2, and a signal level P12 is obtained as the pixel signal value of the large pixel SP1.
  • The value obtained by multiplying the signal level P11 by the HDR synthesis gain then matches the signal level P12.
  • In contrast, when the subject moves, a signal level P21 is obtained as the pixel signal value of the small pixel SP2 and a signal level P22 is obtained as the pixel signal value of the large pixel SP1, as indicated by an arrow Q12.
  • In this case, the value obtained by multiplying the signal level P21 by the HDR synthesis gain matches the signal level P23 at the same brightness on the polygonal line L11, but does not match the actual signal level P22 of the large pixel SP1.
  • This is because, when the subject is a moving subject and the exposure time or the like differs between the large pixel image and the small pixel image, the part of the subject captured changes between the large pixel SP1 and the small pixel SP2, and hence the brightness of the subject seen by each pixel changes.
  • Moreover, because the large pixels and the small pixels are arranged at different positions, the image signal of the large pixel image and the image signal of the small pixel image, that is, the pixel signals of the large pixels and the pixel signals of the small pixels, are out of phase.
  • Consequently, even for a stationary subject, it is not guaranteed that the pixel signal obtained by multiplying the pixel signal of a small pixel by the HDR synthesis gain will match the pixel signal of the large pixel corresponding to that small pixel. The premise of the motion determination, namely that the gain-corrected pixel signals of small pixels and large pixels match, therefore does not hold.
  • As a result, the phase difference between the pixel signals of the large pixels and the small pixels can make the signal difference large even for a stationary subject, resulting in an erroneous determination.
  • It is conceivable to perform phase difference correction on the image signal of the small pixel image itself, so that the pixel signals of the large pixels and the small pixels become signals of the same phase, and then to perform the motion determination and the HDR synthesis.
  • However, when phase difference correction is performed on the image itself, artifacts such as false colors and jaggedness occur and adverse effects such as loss of image sharpness arise; suppressing them requires additional processing, so the cost becomes high.
  • Furthermore, even if noise reduction or the like is performed after such phase difference correction, the resulting synthesized image may only have a lower sharpness than when the motion determination is performed without phase difference correction.
  • In the present technology, therefore, low-frequency luminance signals are used for the motion determination, and such low-frequency signals can be phase-corrected (shifted) more easily than high-frequency signals; a brief illustration follows below. Accordingly, the present technology can suppress erroneous determination and erroneous correction without incurring costs such as an increase in circuit scale.
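  • The claim that low-frequency content tolerates a simple fractional-pixel shift better than high-frequency content can be checked with a toy one-dimensional experiment. Everything below (the signal shapes and the 2-tap averaging shift) is our illustration, not taken from the publication.

```python
import numpy as np

x = np.arange(64, dtype=float)

def half_pixel_shift(s: np.ndarray) -> np.ndarray:
    """Shift by half a sample with a 2-tap average (linear interpolation)."""
    return 0.5 * (s[:-1] + s[1:])

def shift_error(period: float) -> float:
    """Max error of the 2-tap shift against the exact half-sample shift
    for a sinusoid of the given period (in samples)."""
    s = np.sin(2 * np.pi * x / period)
    ideal = np.sin(2 * np.pi * (x[:-1] + 0.5) / period)
    return float(np.max(np.abs(half_pixel_shift(s) - ideal)))

print(shift_error(32.0))  # ~0.005: nearly exact for low frequencies
print(shift_error(3.0))   # ~0.5:   badly attenuated near Nyquist
```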
  • FIG. 4 is a diagram illustrating a configuration example of an embodiment of a signal processing device to which the present technology is applied.
  • The signal processing device 11 shown in FIG. 4 is configured as, for example, an in-vehicle camera device.
  • The signal processing device 11 has an image sensor 21, a signal generation unit 22, a motion determination unit 23, and an HDR synthesis unit 24.
  • The image sensor 21 is an image sensor having a sub-pixel structure, and functions as an imaging unit that generates the image signal of a large pixel image and the image signal of a small pixel image by performing imaging.
  • That is, the image sensor 21 is provided with a plurality of large pixels and small pixels, having mutually different sensitivities and phases (arrangement positions), in the arrangement shown in FIG. 1.
  • The image sensor 21 captures an image of a subject by photoelectrically converting incident light, and supplies the image signals of the resulting large pixel image and small pixel image to the signal generation unit 22 and the HDR synthesis unit 24. That is, the image signal of the large pixel image is generated by imaging with the plurality of large pixels forming the image sensor 21, and the image signal of the small pixel image is generated by imaging with the plurality of small pixels forming the image sensor 21.
  • These image signals of the large pixel image and the small pixel image are signals whose phases differ from each other.
  • Note that the image sensor 21 may be provided outside the signal processing device 11.
  • The signal generation unit 22 generates low-frequency luminance signals for motion determination based on the image signal of the large pixel image and the image signal of the small pixel image supplied from the image sensor 21, and supplies them to the motion determination unit 23.
  • The signal generation unit 22 has a large pixel filter processing unit 31 and a small pixel filter processing unit 32.
  • The large pixel filter processing unit 31 performs filtering on the image signal of the large pixel image using a large-pixel luminance generation filter, thereby generating a large pixel low-frequency luminance signal, which is the low-frequency luminance signal of the large pixel image.
  • The large-pixel luminance generation filter is a low-pass filter that extracts low-frequency components, more specifically luminance components below a predetermined frequency (low-frequency luminance components), from the image signal of the large pixel image.
  • The small pixel filter processing unit 32 performs filtering on the image signal of the small pixel image using a small-pixel luminance generation filter, thereby generating a small pixel low-frequency luminance signal, which is the low-frequency luminance signal of the phase-difference-corrected small pixel image.
  • The small pixel low-frequency luminance signal thus obtained is a signal having the same phase as the large pixel low-frequency luminance signal.
  • The small-pixel luminance generation filter is a filter different from the large-pixel luminance generation filter. That is, the small-pixel luminance generation filter functions as a low-pass filter that extracts low-frequency components (low-frequency luminance components) from the image signal of the small pixel image, and is designed as a filter that at the same time realizes phase difference correction, that is, a phase shift.
  • The signal generation unit 22 supplies the large pixel low-frequency luminance signal obtained by the large pixel filter processing unit 31 and the small pixel low-frequency luminance signal obtained by the small pixel filter processing unit 32 to the motion determination unit 23 as the low-frequency luminance signals for motion determination.
  • The motion determination unit 23 performs motion determination based on the large pixel low-frequency luminance signal and the small pixel low-frequency luminance signal supplied from the signal generation unit 22, and supplies the determination result to the HDR synthesis unit 24.
  • The HDR synthesis unit 24 HDR-synthesizes the image signal of the large pixel image and the image signal of the small pixel image supplied from the image sensor 21 based on the motion determination result supplied from the motion determination unit 23.
  • The resulting synthesized image is output to the subsequent stage.
  • Specifically, suppose that the large-pixel luminance generation filter and the small-pixel luminance generation filter are the filters shown in FIG. 5.
  • In FIG. 5, the portion indicated by arrow Q21 shows an example of the large-pixel luminance generation filter, and the portion indicated by arrow Q22 shows an example of the small-pixel luminance generation filter.
  • The large-pixel luminance generation filter is a filter of size 3 × 3, and the numerical value written in each square represents the filter coefficient by which the pixel value of the pixel corresponding to that square is multiplied.
  • The size of a filter here means the number of taps of the filter, that is, the number of pixels (size) of the pixel region to be filtered.
  • When this filter is applied, the pixel values of the large pixels adjacent to the upper left, directly above, and upper right of the large pixel SP1 in FIG. 1 are multiplied by the coefficient values 1, 2, and 1, respectively.
  • Likewise, the pixel values of the large pixels adjacent to the lower left, directly below, and lower right of the large pixel SP1 are multiplied by the coefficient values 1, 2, and 1, respectively.
  • The sum of the nine coefficient-multiplied pixel values is obtained, and that sum is divided by 16, the sum of the nine coefficients.
  • The result of the division is the value of the pixel signal of one pixel obtained by the filtering, that is, the value of the large pixel low-frequency luminance signal for one pixel.
  • The phase of the newly generated large pixel low-frequency luminance signal, that is, the position of the generated pixel, is the center position of the large pixel SP1.
  • Note that the coefficients indicated by arrow Q21 are used regardless of the color of the color filter of the large pixel at the center of the target 3 × 3 pixel region. In other words, pixel signals of large pixels having R, G, and B color filters are all used to generate one (one pixel's worth of) large pixel low-frequency luminance signal.
  • Like the large-pixel luminance generation filter, the small-pixel luminance generation filter is also a filter of size 3 × 3.
  • Again, the numerical value written in each square represents the filter coefficient by which the pixel value of the pixel corresponding to that square is multiplied.
  • However, in the small-pixel luminance generation filter, the coefficients for the pixels forming the uppermost pixel row and the rightmost pixel column of the target pixel region are all 0. Therefore, the small-pixel luminance generation filter is substantially a filter of size 2 × 2.
  • When this filter is applied, the pixel value of the small pixel SP2 and the pixel values of the small pixels positioned to its left, directly below it, and at its lower left are each multiplied by the coefficient value 1.
  • The sum of the four coefficient-multiplied pixel values is obtained, that sum is divided by 4, the sum of the four coefficients, and the result of the division is the value of the pixel signal of one pixel obtained by the filtering, that is, the value of the small pixel low-frequency luminance signal for one pixel.
  • The phase of the newly generated small pixel low-frequency luminance signal, that is, the position of the generated pixel, is the center position of the four small pixels used for its generation, which coincides with the center position of the large pixel SP1. It can thus be seen that the generated small pixel low-frequency luminance signal has the same phase as the corresponding large pixel low-frequency luminance signal, and that the phase difference correction is realized by the small-pixel luminance generation filter itself.
  • In this way, the large-pixel luminance generation filter and the small-pixel luminance generation filter are filters having at least mutually different coefficients; both are rendered concretely in the sketch below.
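  • The sketch below writes out both kernels and checks where each output sample lands. The outer rows (1, 2, 1) and the divisor 16 of the large-pixel filter are stated in the text; its middle row (2, 4, 2) is our inference from the requirement that the nine coefficients sum to 16. The placement of the four 1s in the small-pixel filter follows the description above.

```python
import numpy as np
from scipy.ndimage import correlate  # correlate keeps kernel orientation

# Large-pixel luminance generation filter (3x3, coefficients sum to 16;
# middle row inferred).
K_LARGE = np.array([[1, 2, 1],
                    [2, 4, 2],
                    [1, 2, 1]]) / 16.0

# Small-pixel luminance generation filter (3x3, effectively 2x2): top row
# and right column are 0; the four 1s cover the center (SP2), left, below,
# and lower-left positions; coefficients sum to 4.
K_SMALL = np.array([[0, 0, 0],
                    [1, 1, 0],
                    [1, 1, 0]]) / 4.0

def low_freq_luma(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Low-frequency luminance signal: coefficient-weighted average."""
    return correlate(img.astype(float), kernel, mode="nearest")

def centroid(kernel: np.ndarray) -> tuple:
    """Weighted centroid, i.e. the phase of the output sample,
    as (row, col) offsets from the kernel center."""
    rows, cols = np.mgrid[-1:2, -1:2]
    return float((rows * kernel).sum()), float((cols * kernel).sum())

print(centroid(K_LARGE))  # (0.0, 0.0): output stays at the large pixel SP1
print(centroid(K_SMALL))  # (0.5, -0.5): output shifts half a pixel down-left,
                          # from SP2 toward the center of SP1
```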
  • Next, the image compositing processing by the signal processing device 11 will be described with reference to the flowchart. In step S11, the large pixels and small pixels of the image sensor 21 receive and photoelectrically convert incident light, thereby capturing a large pixel image and a small pixel image.
  • At this time, imaging is performed so that the length of the exposure time and the timing of the start or end of exposure differ between the large pixels and the small pixels.
  • The image sensor 21 supplies the image signal of the large pixel image obtained by imaging with each large pixel and the image signal of the small pixel image obtained by imaging with each small pixel to the signal generation unit 22 and the HDR synthesis unit 24.
  • In step S12, the large pixel filter processing unit 31 performs filtering on the image signal of the large pixel image supplied from the image sensor 21 using the large-pixel luminance generation filter, thereby generating the large pixel low-frequency luminance signal for each of a plurality of pixel positions.
  • In step S13, the small pixel filter processing unit 32 performs filtering on the image signal of the small pixel image supplied from the image sensor 21 using the small-pixel luminance generation filter, thereby generating the small pixel low-frequency luminance signal for each of a plurality of pixel positions.
  • In step S13, a low-frequency luminance signal is extracted from the image signal of the small pixel image and, at the same time, phase difference correction (phase correction) is performed, so that a small pixel low-frequency luminance signal having the same phase as the large pixel low-frequency luminance signal is obtained.
  • The signal generation unit 22 supplies the large pixel low-frequency luminance signal and the small pixel low-frequency luminance signal obtained by the above processing to the motion determination unit 23.
  • Although steps S12 and S13 are executed here before demosaic processing is performed on the large pixel image and the small pixel image, they may instead be executed after the demosaic processing.
  • In step S14, the motion determination unit 23 performs motion determination based on the large pixel low-frequency luminance signal and the small pixel low-frequency luminance signal supplied from the signal generation unit 22, and supplies the determination result to the HDR synthesis unit 24.
  • For example, the motion determination unit 23 calculates, for each pixel position, the difference between the value obtained by multiplying the small pixel low-frequency luminance signal by the HDR synthesis gain and the large pixel low-frequency luminance signal, and compares the obtained difference with a noise level and a predetermined motion determination threshold; a sketch of this step follows below.
  • As a result of the motion determination, the amount of motion of the subject is obtained for each pixel position.
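  • A minimal per-pixel version of this determination might look as follows. The exact comparison against the noise level and the threshold is not spelled out above, so the clamped, normalized mapping below is our assumption.

```python
import numpy as np

def motion_amount(luma_large: np.ndarray, luma_small: np.ndarray,
                  hdr_gain: float, noise_level: float,
                  motion_threshold: float) -> np.ndarray:
    """Per-pixel motion amount in [0, 1] (assumed mapping).

    Differences at or below the noise level count as no motion;
    differences at or above the threshold count as full motion.
    """
    diff = np.abs(luma_small * hdr_gain - luma_large)
    amount = (diff - noise_level) / (motion_threshold - noise_level)
    return np.clip(amount, 0.0, 1.0)
```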
  • In step S15, the HDR synthesis unit 24 HDR-synthesizes the image signal of the large pixel image and the image signal of the small pixel image supplied from the image sensor 21 based on the motion determination result supplied from the motion determination unit 23, thereby generating the image signal of a synthesized image.
  • Specifically, the HDR synthesis unit 24 generates a synthesis coefficient for each pixel position based on the image signal of the large pixel image, adjusts the synthesis coefficient by performing motion compensation according to the motion determination result, and obtains the final synthesis coefficient. The HDR synthesis unit 24 then synthesizes (mixes) the value obtained by multiplying the image signal of the small pixel image by the HDR synthesis gain with the image signal of the large pixel image using the synthesis coefficient, and thereby obtains the image signal of the synthesized image (see the sketch below).
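  • Putting steps S12 to S15 together, an end-to-end sketch could look like the following, reusing the hypothetical helpers from the earlier snippets. How the motion amount adjusts the synthesis coefficient is our assumption (pushing the blend toward the large pixel image where motion is detected); the publication leaves that mapping to the implementation.

```python
def composite(large_img, small_img, hdr_gain,
              noise_level, motion_threshold, base_alpha):
    # Steps S12/S13: same-phase low-frequency luminance signals.
    luma_l = low_freq_luma(large_img, K_LARGE)
    luma_s = low_freq_luma(small_img, K_SMALL)
    # Step S14: per-pixel motion determination.
    motion = motion_amount(luma_l, luma_s, hdr_gain,
                           noise_level, motion_threshold)
    # Step S15: motion compensation of the synthesis coefficient
    # (assumption: fall back to a single source where motion is found).
    alpha = base_alpha * (1.0 - motion) + motion
    # Blend the full-resolution images, not the low-frequency signals.
    return hdr_blend(large_img, small_img, hdr_gain, alpha)
```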
  • In the above manner, the signal processing device 11 generates a large pixel low-frequency luminance signal and a small pixel low-frequency luminance signal of the same phase, performs motion determination, and synthesizes the large pixel image and the small pixel image to generate a synthesized image.
  • By performing motion determination using low-frequency luminance signals of the same phase in this way, the occurrence of erroneous determination and erroneous correction can be suppressed. As a result, the occurrence of artifacts can be suppressed and a synthesized image of higher quality can be obtained.
  • With the present technology, the determination accuracy of the motion determination by the motion determination unit 23 can be improved as shown in FIG. 7. Note that in FIG. 7, the brighter a part of the image, the greater the motion amount obtained for that part in the motion determination.
  • In the determination result indicated by arrow Q31, a window of a building exists as a subject in the region R11; the window has a particularly fine pattern, that is, it contains many high-frequency components, and it can be seen that an erroneous determination has occurred there.
  • In contrast, the portion indicated by arrow Q32 shows the determination result when the signal generation unit 22 extracts low-frequency luminance signals from the large pixel image and the small pixel image obtained by imaging the subject shown in FIG. 7 while correcting the phase difference, and the motion determination unit 23 then performs the motion determination. That is, the portion indicated by arrow Q32 shows the result of the motion determination obtained by the processing performed by the signal generation unit 22 and the motion determination unit 23 of the signal processing device 11.
  • In this result, the phase difference correction is performed and the motion determination is carried out using low-frequency luminance signals, so the erroneous determination seen in the region R11 is suppressed.
  • Moreover, in the present technology, the large pixel image and the small pixel image obtained by the image sensor 21, which are not low-frequency luminance signals, are used as they are for the HDR synthesis.
  • Artifacts such as jerkiness, and adverse effects such as loss of image sharpness, can therefore be made less likely to occur.
  • As a result, a synthesized image of higher quality and wider dynamic range can be obtained.
  • Although FIG. 5 illustrates an example in which the large-pixel luminance generation filter and the small-pixel luminance generation filter are filters of size 3 × 3, these filters may be of any size, such as 5 × 5.
  • Furthermore, the large-pixel luminance generation filter and the small-pixel luminance generation filter may be filters of the same size or filters of different sizes.
  • The small-pixel luminance generation filter may also be, for example, the filter shown in FIG. 8.
  • In FIG. 8, each square represents a pixel in the pixel region to which the small-pixel luminance generation filter is applied, and the numerical value in each square represents the filter coefficient by which the pixel value of the pixel corresponding to that square is multiplied.
  • In this example, the small-pixel luminance generation filter is a filter of size 3 × 3, and the pixel value of the pixel at the center position of the target pixel region and the pixel values of the pixels adjacent to it on the left, directly below, and at the lower left are each multiplied by the coefficient 63.
  • The pixel values of the pixels adjacent to the center pixel at the upper left, directly above, on the right, and at the lower right in the drawing are each multiplied by the coefficient 1, and the pixel value of the pixel adjacent at the upper right is multiplied by the coefficient 0.
  • The sum of the coefficient-multiplied pixel values of these pixels is divided by 256, the sum of the coefficients, and the result of the division is taken as the value of the pixel signal of one pixel obtained by the filtering, that is, the value of the small pixel low-frequency luminance signal for one pixel. In this way as well, extraction of the low-frequency luminance signal and correction of the phase difference can both be realized, as the sketch below illustrates.
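  • The exact placement of the four 63-weighted taps is slightly garbled in the text above; the layout below (center, left, below, lower left) is our reconstruction, chosen because it is the only assignment consistent with the stated 1s, the 0, and the divisor of 256, and because its centroid again shifts the output toward the lower left, approximately matching the half-pixel phase correction of the 2 × 2 filter.

```python
import numpy as np

# Reconstructed alternative small-pixel luminance generation filter (FIG. 8).
K_ALT = np.array([[ 1,  1, 0],           # upper-left, above, upper-right
                  [63, 63, 1],           # left, center, right
                  [63, 63, 1]]) / 256.0  # lower-left, below, lower-right

assert K_ALT.sum() == 1.0  # the raw coefficients sum to 256

rows, cols = np.mgrid[-1:2, -1:2]
print((rows * K_ALT).sum(), (cols * K_ALT).sum())
# ~(0.488, -0.488): close to the (0.5, -0.5) shift of the 2x2 filter,
# i.e. this kernel also moves the output phase toward the center of SP1.
```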
  • Furthermore, two images with different phases may be captured using two cameras, and motion determination and HDR synthesis may be performed based on those two images to generate a synthesized image.
  • The series of processes described above can be executed by hardware or by software.
  • When the series of processes is executed by software, a program constituting the software is installed in a computer.
  • Here, the computer includes, for example, a computer built into dedicated hardware, and a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 9 is a block diagram showing a hardware configuration example of a computer that executes the above-described series of processes by means of a program.
  • In the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
  • An input/output interface 505 is further connected to the bus 504.
  • An input unit 506, an output unit 507, a recording unit 508, a communication unit 509, and a drive 510 are connected to the input/output interface 505.
  • The input unit 506 consists of a keyboard, a mouse, a microphone, an imaging device, and the like.
  • The output unit 507 consists of a display, a speaker, and the like.
  • The recording unit 508 consists of a hard disk, a nonvolatile memory, or the like.
  • The communication unit 509 consists of a network interface and the like.
  • The drive 510 drives a removable recording medium 511 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.
  • In the computer configured as described above, the CPU 501 loads the program recorded in the recording unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes it, whereby the above-described series of processes is performed.
  • The program executed by the computer (CPU 501) can be provided by being recorded on the removable recording medium 511 as a package medium, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • In the computer, the program can be installed in the recording unit 508 via the input/output interface 505 by loading the removable recording medium 511 into the drive 510. The program can also be received by the communication unit 509 via a wired or wireless transmission medium and installed in the recording unit 508. Alternatively, the program can be installed in the ROM 502 or the recording unit 508 in advance.
  • The program executed by the computer may be a program whose processing is performed in chronological order following the order described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
  • The signal processing device 11 described above can be used in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays, for example as follows.
  • Devices that capture images for viewing, such as digital cameras and mobile devices with camera functions
  • Devices used for traffic, such as in-vehicle sensors that capture images of the front, rear, surroundings, and interior of a vehicle, surveillance cameras that monitor running vehicles and roads, and ranging sensors that measure the distance between vehicles
  • Devices used in home appliances such as TVs, refrigerators, and air conditioners to capture images of a user's gestures and operate the appliances according to those gestures
  • Devices used for medical care and healthcare, such as endoscopes and devices that perform angiography by receiving infrared light
  • Devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication
  • Devices used for beauty care, such as skin measuring instruments that photograph the skin and microscopes that photograph the scalp
  • Devices used for sports, such as action cameras and wearable cameras for sports applications
  • Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
  • The technology according to the present disclosure (the present technology) can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 10 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a moving body control system to which the technology according to the present disclosure can be applied.
  • The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • In the example shown in FIG. 10, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • As functional components of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps.
  • In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • The body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
  • The vehicle exterior information detection unit 12030 detects information about the exterior of the vehicle in which the vehicle control system 12000 is installed.
  • For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface, or distance detection processing.
  • The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • The imaging unit 12031 can output the electrical signal as an image, or output it as ranging information.
  • The light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • The vehicle interior information detection unit 12040 detects information about the interior of the vehicle.
  • The vehicle interior information detection unit 12040 is connected to, for example, a driver state detection unit 12041 that detects the state of the driver.
  • The driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation for the vehicle, following driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • In addition, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • The microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the occupants of the vehicle or to the outside of the vehicle.
  • In the example of FIG. 10, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 11 is a diagram showing an example of the installation positions of the imaging unit 12031.
  • In FIG. 11, the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • The imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100.
  • The imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 12100.
  • The imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100.
  • The imaging unit 12104 provided at the rear bumper or the back door mainly acquires images behind the vehicle 12100.
  • The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • Note that FIG. 11 also shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided at the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided at the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided at the rear bumper or the back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is on the traveling path of the vehicle 12100 and travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
  • Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like.
  • In this way, cooperative control can be performed for the purpose of automated driving or the like in which the vehicle travels autonomously without depending on the driver's operation.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then judges the collision risk, which indicates the degree of danger of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can perform driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian exists in the images captured by the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian; a loose illustration follows below.
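  • As a loose illustration of this two-step procedure (outline extraction, then shape matching), the sketch below uses OpenCV. The template contour, the thresholds, and the use of matchShapes are our choices, not anything specified by the publication.

```python
import cv2
import numpy as np

def find_pedestrians(ir_image: np.ndarray, template_contour: np.ndarray,
                     max_dissimilarity: float = 0.3) -> list:
    """Return bounding boxes of contours in an infrared image whose
    outline resembles a pedestrian template (hypothetical threshold)."""
    edges = cv2.Canny(ir_image, 50, 150)  # feature / edge extraction
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    hits = []
    for c in contours:
        # Pattern matching on the series of points forming each outline.
        score = cv2.matchShapes(c, template_contour,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < max_dissimilarity:
            hits.append(cv2.boundingRect(c))  # rectangle for emphasis
    return hits
```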
  • When the microcomputer 12051 determines that a pedestrian exists in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
  • The technology according to the present disclosure can be applied to the imaging unit 12031, the vehicle exterior information detection unit 12030, and the like among the configurations described above.
  • Specifically, the signal processing device 11 shown in FIG. 4 can be used as the imaging unit 12031 and the vehicle exterior information detection unit 12030. This makes it possible to suppress the occurrence of erroneous determination during motion determination and to obtain a synthesized image of higher quality and wider dynamic range.
  • For example, the present technology can take the configuration of cloud computing in which one function is shared and jointly processed by a plurality of devices via a network.
  • Each step described in the above flowchart can be executed by one device or shared among a plurality of devices.
  • Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared among a plurality of devices.
  • Furthermore, the present technology can also be configured as follows.
  • (1) A signal processing device including: a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, whose phase differs from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.
  • (2) The signal processing device according to (1), further including a synthesis unit that synthesizes the first image signal and the second image signal based on a result of the motion determination.
  • (3) The signal processing device according to (1) or (2), further including an imaging unit that generates the first image signal and the second image signal by performing imaging.
  • (4) The signal processing device according to (3), in which the imaging unit includes a plurality of first pixels for obtaining the first image signal and a plurality of second pixels, having a sensitivity different from that of the first pixels, for obtaining the second image signal.
  • (5) The signal processing device according to (4), in which the first pixel is larger than the second pixel.
  • (6) The signal processing device according to any one of (1) to (5), in which the second filter has the same size as the first filter.
  • (7) A signal processing method performed by a signal processing device, including: generating a first low-frequency luminance signal by filtering a first image signal with a first filter that extracts low-frequency components; generating a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, whose phase differs from that of the first image signal, with a second filter that, unlike the first filter, extracts low-frequency components and corrects the phase difference; and performing motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present technology relates to a signal processing device and method, and a program, which make it possible to suppress the generation of artifacts. The signal processing device includes: a first filter processing unit that generates a first low-frequency luminance signal by filtering a first image signal using a first filter that extracts low-frequency components; a second filter processing unit that generates a second low-frequency luminance signal having the same phase as the first low-frequency luminance signal by filtering a second image signal, having a phase different from that of the first image signal, using a second filter that differs from the first filter, extracts low-frequency components, and corrects the phase difference; and a motion determination unit that performs motion determination based on the first low-frequency luminance signal and the second low-frequency luminance signal. The present technology can be applied to a signal processing device.
PCT/JP2022/002779 2021-04-15 2022-01-26 Dispositif et procédé de traitement de signaux, et programme WO2022219874A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021068990A JP2022163882A (ja) 2021-04-15 2021-04-15 信号処理装置および方法、並びにプログラム
JP2021-068990 2021-04-15

Publications (1)

Publication Number Publication Date
WO2022219874A1 true WO2022219874A1 (fr) 2022-10-20

Family

ID=83639531

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/002779 WO2022219874A1 (fr) 2021-04-15 2022-01-26 Dispositif et procédé de traitement de signaux, et programme

Country Status (2)

Country Link
JP (1) JP2022163882A (fr)
WO (1) WO2022219874A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07107368A (ja) * 1993-09-29 1995-04-21 Canon Inc 画像処理装置
JP2010183127A (ja) * 2009-02-03 2010-08-19 Sony Corp 画像処理装置、画像処理方法および撮像装置
JP2014039170A (ja) * 2012-08-16 2014-02-27 Sony Corp 画像処理装置、および画像処理方法、並びにプログラム

Also Published As

Publication number Publication date
JP2022163882A (ja) 2022-10-27

Similar Documents

Publication Publication Date Title
JP7105754B2 (ja) 撮像装置、及び、撮像装置の制御方法
US10432847B2 (en) Signal processing apparatus and imaging apparatus
US11082626B2 (en) Image processing device, imaging device, and image processing method
WO2021060118A1 (fr) Dispositif d'imagerie
WO2017175492A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, programme informatique et dispositif électronique
JP6803989B2 (ja) 固体撮像装置及びその駆動方法
WO2018008426A1 (fr) Dispositif et procédé de traitement de signaux, et dispositif d'imagerie
WO2017195459A1 (fr) Dispositif d'imagerie et procédé d'imagerie
WO2018207666A1 (fr) Élément d'imagerie, procédé de commande associé et dispositif électronique
US20200036881A1 (en) Capturing apparatus, capturing module, capturing system, and capturing apparatus control method
US20200402206A1 (en) Image processing device, image processing method, and program
WO2017169274A1 (fr) Dispositif de commande d'imagerie, procédé de commande d'imagerie, programme informatique et équipement électronique
US11924568B2 (en) Signal processing device, signal processing method, and imaging apparatus
WO2020153272A1 (fr) Dispositif de mesure, dispositif de télémétrie et procédé de mesure
WO2020209079A1 (fr) Capteur de mesure de distance, procédé de traitement de signal et module de mesure de distance
US10873732B2 (en) Imaging device, imaging system, and method of controlling imaging device
WO2022219874A1 (fr) Dispositif et procédé de traitement de signaux, et programme
JP2004040523A (ja) 車両周囲監視装置
JP2018201158A (ja) 信号処理装置、信号処理方法及びコンピュータプログラム
US20210217146A1 (en) Image processing apparatus and image processing method
WO2020149094A1 (fr) Dispositif d'imagerie, système d'imagerie, et procédé de détection de défaillance
WO2022249562A1 (fr) Dispositif de traitement de signal, procédé et programme
EP3905656A1 (fr) Dispositif de traitement d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22787802

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22787802

Country of ref document: EP

Kind code of ref document: A1