US20190072697A1 - Lens device and image capturing device - Google Patents

Lens device and image capturing device

Info

Publication number
US20190072697A1
US20190072697A1
Authority
US
United States
Prior art keywords
filter
image
light
filter region
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/179,904
Inventor
Yusuke MORIUCHI
Nao Mishima
Nobuyuki Matsumoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Priority to US16/179,904
Publication of US20190072697A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/201Filters in the form of arrays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N5/2254

Definitions

  • Embodiments described herein relate generally to a lens device and an image capturing device.
  • a conventional technology is known for acquiring distance information based on a captured image.
  • in an image capturing device including a lens, a red (R) filter, a green (G) filter, and a blue (B) filter are disposed at the position of an aperture of the optical system of the lens, and light that has passed through the filters is received by an RGB sensor, thereby capturing images displaced depending on the depth to a subject.
  • the depth of a scene can be calculated based on one of the captured images.
  • FIG. 1 is a block diagram illustrating the hardware configuration of an image capturing device according to a first embodiment
  • FIG. 2 is a block diagram illustrating the functional configuration of the image capturing device according to the first embodiment
  • FIG. 3 is a diagram for explaining filter regions according to the first embodiment
  • FIG. 4 is a diagram for explaining changes of the captured images according to the first embodiment.
  • FIG. 5 is a flowchart illustrating a process executed by the image capturing device according to the first embodiment
  • FIG. 6 is a block diagram illustrating the functional configuration of an image capturing device according to a second embodiment
  • FIG. 7A is a diagram for explaining filter regions according to the second embodiment
  • FIG. 7B is a diagram for explaining filter regions according to the second embodiment.
  • FIG. 8 is a flowchart illustrating a process executed by the image capturing device according to the second embodiment
  • FIG. 9 is a block diagram illustrating the functional configuration of an image capturing device according to a third embodiment.
  • FIG. 10A is a diagram for explaining filter regions according to the third embodiment.
  • FIG. 10B is a diagram for explaining filter regions according to the third embodiment.
  • FIG. 10C is a diagram for explaining filter regions according to the third embodiment.
  • FIG. 10D is a diagram for explaining filter regions according to the third embodiment.
  • FIG. 11 is a flowchart illustrating a process executed by the image capturing device according to the third embodiment.
  • a lens includes a filter, an entire area of which transmits light of a common color.
  • the filter includes a first filter region and a second filter region.
  • the first filter region transmits light that is a first combination of colors out of colors of light to be received by an image sensor.
  • the second filter region transmits light that is a second combination of colors of the colors of the light to be received by the image sensor.
  • the first and second combinations of colors each include the common color.
  • FIG. 1 is a block diagram illustrating an example of the hardware configuration of an image capturing device according to a first embodiment.
  • an image capturing device 100 includes a lens 10 , which may be referred to as a lens device, a sensor 20 , a central processing unit (CPU) 30 , a hard disk drive (HDD) 40 , a memory card slot 50 , and a display 60 . These hardware components are coupled to each other through a bus.
  • the image capturing device 100 captures images of an arbitrary subject.
  • the image capturing device 100 then generates a display image through image processing based on the information on the depth to the subject and displays the generated display image.
  • the lens 10 includes at its opening a filter (a color filter) on which the light reflected off the subject in image capturing is incident and through which the incident light passes. For example, while condensing rays of incident light, the lens 10 transmits light in a certain wavelength band according to the color of the filter. The light that has passed through the filter reaches the sensor 20 .
  • the sensor 20 is an image sensor and receives the light that has passed through the opening of the lens 10 . Examples of the sensor 20 include a charge coupled device (CCD) image sensor and a complementary metal oxide semiconductor (CMOS) image sensor.
  • the sensor 20 includes a sensor that receives red light (an R sensor), a sensor that receives green light (a G sensor), and a sensor that receives blue light (a B sensor).
  • the sensors receive respective light components corresponding to particular wavelength bands, and generate captured images (an R image, a G image, and a B image).
  • the CPU 30 controls the overall operation of the image capturing device 100. Specifically, the CPU 30 executes a computer program stored in the HDD 40 , for example, to control the overall operation of the image capturing device 100. For example, the CPU 30 calculates the depth to the subject based on the captured images generated by the sensor 20 . The CPU 30 then generates an image with distance information in accordance with the calculated depth. After that, the CPU 30 generates a display image to be presented to a user based on the captured images and the image with distance information.
  • the HDD 40 is a nonvolatile and rewritable storage device.
  • the HDD 40 stores therein a computer program and various types of data relating to control of the image capturing device 100 .
  • the memory card slot 50 is an interface of a portable storage medium.
  • a portable storage medium such as a secure digital (SD) memory card or a secure digital high capacity (SDHC) memory card can be inserted, for example.
  • the display 60 displays various types of information or images (e.g., the display image).
  • the display 60 is a liquid crystal display or a touch panel, for example.
  • FIG. 2 is a block diagram illustrating an example of the functional configuration of the image capturing device 100 according to the first embodiment.
  • the image capturing device 100 includes the lens 10 including a filter 110 at its opening, a sensor 120 , and an image processor 130 .
  • the filter 110 includes a first filter region 111 and a second filter region 112 .
  • the sensor 120 includes a first sensor 121 , a second sensor 122 , and a third sensor 123 .
  • the filter 110 is disposed on either surface of the lens 10 : the surface closer to the sensor 120 or the surface farther from it. In other words, the lens 10 may be disposed between the sensor 120 and the filter 110 .
  • the filter 110 may be disposed between the sensor 120 and the lens 10 .
  • the image processor 130 includes an input unit 131 , a depth calculator 132 , and an image generator 133 .
  • the whole or a part of the image processor 130 may be implemented with software (a computer program) or hardware circuitry.
  • the sensor may include sensors that receive light in other wavelength bands.
  • the filter 110 is a color filter that transmits the light in a particular wavelength band.
  • the entire area of the filter 110 is constituted of a plurality of regions not overlapping with each other.
  • the regions are the first filter region 111 and the second filter region 112 . That is, the entire area of the filter 110 is constituted of the first filter region 111 and the second filter region 112 .
  • the entire area of the opening constituted of such filter regions transmits a common color representing a color that is common to the filter regions. It is noted that the filter regions have different optical transparencies.
  • the plane of the filter 110 is preferably arranged in parallel with the image pickup plane of the sensor 120 so that a larger amount of light passes through the filter.
  • the first filter region 111 transmits a plurality of colors of a first combination out of the colors of the light received by the sensor 120 .
  • the first combination can be defined arbitrarily.
  • the first filter region 111 is a filter region called a “yellow (Y) filter” that transmits red light and green light.
  • the second filter region 112 transmits a plurality of colors of a second combination different from the first combination out of the colors of the light received by the sensor 120 .
  • the second filter region 112 is a filter region called a “cyan (C) filter” that transmits green light and blue light. That is, in this example, the common color is green.
  • RGB sensors include therein a larger number of sensors that receive green light than the number of sensors that receive red light or the number of sensors that receive blue light, rather than including the respective sensors uniformly.
  • Such an RGB sensor can receive a larger amount of light by defining the common color as green, in this example.
  • the first filter region 111 may be a filter region called a “Y filter” that transmits red light and green light
  • the second filter region 112 may be a filter region called a “magenta (M) filter” that transmits red light and blue light
  • the common color is red
  • the first filter region 111 may be the M filter that transmits red light and blue light
  • the second filter region 112 may be the C filter that transmits green light and blue light.
  • the common color is blue.
  • the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example.
  • complementary-color filters of Y, C, and M are known to transmit a larger amount of light in comparison with primary-color filters of R, G, and B having the same transmittance in the same wavelength band.
  • FIG. 3 is a diagram for explaining exemplary filter regions according to the first embodiment.
  • the filter 110 of the lens 10 includes the first filter region 111 (the area represented with vertical lines) that is the Y filter; and the second filter region 112 (the area represented with horizontal lines) that is the C filter.
  • the entire area of the filter 110 is constituted of a plurality of regions not overlapping with each other.
  • the first filter region 111 is preferably arranged such that it matches the second filter region if it is rotated around the optical center of the lens 10 . If the first filter region 111 and the second filter region 112 do not constitute the entire opening of the lens 10 , they are preferably arranged at positions evenly away from the optical center of the lens 10 , in addition to being arranged in the above-described manner.
  • the sensor 120 is an image sensor and receives the light that has passed through the opening of the lens 10 .
  • the first sensor 121 receives red light and generates a captured image (an R image).
  • the second sensor 122 receives green light and generates a captured image (a G image).
  • the third sensor 123 receives blue light and generates a captured image (a B image).
  • the generated captured images (the R image, the G image, and the B image) are output to the image processor 130 .
  • the first sensor 121 receives red light out of the light that has passed through the first filter region 111 ; the second sensor 122 receives green light out of the light that has passed through the first filter region 111 and the second filter region 112 ; and the third sensor 123 receives blue light out of the light that has passed through the second filter region 112 .
  • the second sensor 122 receives green light that has passed through both the first filter region 111 and the second filter region 112 .
  • the green light is less influenced by the absorption of light in the entire filter 110 .
  • This configuration can provide a brighter and less noisy G image in comparison with other captured images.
  • This configuration allows the green light to pass through a plurality of filter regions, whereby the G image is less influenced by installation of the filter 110 .
  • the generated G image is therefore almost an ideal image obtained when no filter 110 is installed (referred to as a “reference image”).
  • the R image and the B image are generated from the received light that has passed through one of the first filter region 111 and the second filter region 112 .
  • the R image and the B image are more likely to change than the reference image or the G image.
  • FIG. 4 is a diagram for explaining exemplary changes of the captured images according to the first embodiment.
  • the arrangement of the filter regions in FIG. 4 is the same as the example illustrated in FIG. 3 .
  • the upper diagram in FIG. 4 illustrates a subject being at a position nearer than the position of a focal distance.
  • the middle diagram in FIG. 4 illustrates a subject being at a position farther than the position of the focal distance.
  • the lower diagram in FIG. 4 illustrates a subject being at the focal distance.
  • the R image (a cylinder represented with positive slopes) is displaced in the first direction (e.g., to the right) relative to the reference image or the G image (that is, a phase shift occurs).
  • the R image is displaced in the second direction (e.g., to the left) opposite to the first direction relative to the reference image or the G image.
  • the B image (a cylinder represented with negative slopes) is displaced in the second direction (e.g., to the left) relative to the reference image or the G image.
  • the B image is displaced in the first direction (e.g., to the right) relative to the reference image or the G image. That is, in scenes captured in RGB images that are captured images, an image can be observed at displaced positions to the right or left in accordance with the depth relative to the subject at the focal distance. This phase shift is inverted between an image nearer than the position of the focal distance and an image farther than the position of the focal distance. In the area where the phase has shifted, the image blurs. The following describes calculation of the depth of scenes by utilizing such characteristics of captured images.
  • the image processor 130 calculates the depth to the subject based on RGB images that are captured images.
  • the image processor 130 then generates an image with distance information in accordance with the calculated depth.
  • the image with distance information is a two-dimensional image associated with the depth information in the space.
  • the image processor 130 generates a display image to be presented to a user based on the R image, the G image, and the B image, which are the captured images, and the generated image with distance information.
  • the input unit 131 inputs the captured images. For example, the input unit 131 receives the captured images (the R image, the G image, and the B image) from the sensor 120 and inputs them to the depth calculator 132 and the image generator 133 .
  • the depth calculator 132 calculates the depth to the subject based on the captured images.
  • the depth calculator 132 then generates an image with distance information in accordance with the calculated depth. More specifically, the depth calculator 132 calculates color displacement amounts of the respective images relative to the G image out of the captured images (the R image, the G image, and the B image) input by the input unit 131 .
  • the depth calculator 132 then calculates the distance to the object captured at the position of the pixels, thereby generating the image with distance information.
  • the color displacement amounts of the respective images are calculated on the basis of the image in the common color.
  • the pixels in the R image, the G image, and the B image are shifted to the positions of the respective images in the first direction or the second direction (e.g., to the right or the left) in accordance with the depth to the subject whose image is captured.
  • the G image has little color displacement in comparison with the reference image.
  • the depth calculator 132 calculates the color displacement amounts of the respective images on the basis of the G image.
  • the pixel of the G image that is the target for calculating the distance is defined as YG(x, y)
  • the pixel of the R image corresponding to YG(x, y) is defined as YR(x, y)
  • the pixel of the B image corresponding to YG(x, y) is defined as YB(x, y).
  • the displacement amount in accordance with the depth is represented with d.
  • the pixel YR(x, y) in the R image can therefore be represented by YR(x−d, y).
  • the pixel YB(x, y) in the B image can be represented by YB(x+d, y).
  • the value of the displacement amount d is 0 if the subject is at the focal distance.
  • the value of d is positive if the subject is at a distance larger than the focal distance, and the value of d is negative if the subject is at a distance smaller than the focal distance.
  • the magnitude |d| of the displacement amount d increases monotonically with the difference between the distance to the subject and the focal distance.
  • the distance to the object captured in the pixels can be uniquely calculated based on the displacement amount d.
  • the distance to the subject is determined by the focal distance, the displacement amount, the size of the opening of the lens, and the f-number of the lens, for example. These pieces of information can be obtained by measuring in advance.
  • the matching methods include one based on the luminance difference of images and one based on the color distribution, which utilizes the characteristic that color components of natural images have a locally linear relation.
  • the displacement amount d is calculated by utilizing the linear characteristic of the color distribution of the pixel value in the local area around the pixel (x, y).
  • the image is virtually defined in the local area around YG(x, y), YR(x−d, y), and YB(x+d, y).
  • an index L is calculated (refer to Equation (1)).
  • λ0, λ1, and λ2 represent the dispersion along the main component axes of the color distribution in the image (the eigenvalues of the covariance matrix of the image); and σR², σG², and σB² represent the dispersion along the R, G, and B axes of the color distribution.
  • L increases with an increase in the amount of the color displacement in the image. Therefore, calculating the value of d corresponding to the minimum of L(x, y; d) gives the depth of the coordinates (x, y).
  • the image generator 133 generates a display image based on the captured images and the image with distance information.
  • the display image may be the image with distance information or the RGB images, for example.
  • the display image may also be a corrected image in terms of the color displacement in which the displacement of the R image and the B image out of the RGB images is corrected in accordance with the displacement amount d of the pixels corresponding to the distance.
  • the display image may also be an image obtained by removing the focal blurring from the corrected image in terms of the color displacement in accordance with the distance.
  • the display image may also be an image obtained by adding the focal blurring to the corrected image in terms of the color displacement in accordance with the distance. If the lens 10 includes a focusing mechanism, the depth information can be output to the focusing mechanism of the lens 10 and used for improving a focus speed.
  • FIG. 5 is a flowchart illustrating an exemplary process executed by the image capturing device 100 according to the first embodiment.
  • the first filter region 111 (the Y filter) transmits red light and green light (Step S 101 ).
  • the second filter region 112 (the C filter) transmits green light and blue light (Step S 102 ).
  • the first sensor 121 receives red light and generates an R image that is one of captured images (Step S 103 ).
  • the second sensor 122 receives green light and generates a G image that is one of the captured images (Step S 104 ).
  • the third sensor 123 receives blue light and generates a B image that is one of the captured images (Step S 105 ).
  • the R image generated by the first sensor 121 , the G image generated by the second sensor 122 , and the B image generated by the third sensor 123 are output to the input unit 131 .
  • the input unit 131 in turn inputs the R image, the G image, and the B image to the depth calculator 132 and the image generator 133 .
  • the depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the R image, the G image, and the B image input by the input unit 131 .
  • the depth calculator 132 then calculates the distance to an object captured at the position of the pixels, thereby generating an image with distance information (Step S 106 ).
  • the image generator 133 generates a display image to be presented to a user based on the R image, the G image, and the B image input by the input unit 131 and the image with distance information generated by the depth calculator 132 (Step S 107 ).
  • the image of the subject is captured through the lens 10 in which two kinds of filter regions having different optical transparencies and transmitting the common color light are disposed.
  • a brighter and less noisy image can be acquired and an image with distance information is generated on the basis of the image corresponding to the common color.
  • the distance information can be acquired with high accuracy.
  • FIG. 6 is a block diagram illustrating an example of the functional configuration of an image capturing device according to a second embodiment.
  • common numerals are assigned to similar components to those in the first embodiment, and detailed explanation thereof may be omitted.
  • the function, the configuration, and the processing in the second embodiment are similar to those in the first embodiment except for a filter 210 (a third filter region 213 ) described below.
  • an image capturing device 200 includes the lens 10 including the filter 210 at its opening, the sensor 120 , and the image processor 130 .
  • the filter 210 includes the first filter region 111 , the second filter region 112 , and the third filter region 213 .
  • the sensor 120 includes the first sensor 121 , the second sensor 122 , and the third sensor 123 .
  • the image processor 130 includes the input unit 131 , the depth calculator 132 , and the image generator 133 . The whole or a part of the image processor 130 may be implemented with software (a computer program) or hardware circuitry.
  • the filter 210 is a color filter that transmits the light in a particular wavelength band.
  • the entire area of the filter 210 is constituted of a plurality of regions not overlapping with each other.
  • the regions are the first filter region 111 , the second filter region 112 , and the third filter region 213 . That is, the entire area of the filter 210 is constituted of the first filter region 111 , the second filter region 112 , and the third filter region 213 .
  • the entire area of the opening constituted of such filter regions transmits a common color representing a color that is common to the filter regions. It is noted that the filter regions have different optical transparencies.
  • the plane of the filter 210 is preferably arranged in parallel with the image pickup plane of the sensor 120 so that a larger amount of light passes through the filter.
  • the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example. That is, the common color is green in the second embodiment.
  • the third filter region 213 transmits the common color out of the colors of the light received by the sensor 120 .
  • the third filter region 213 is a G filter that transmits green light, which is a common color of the light passing through the first filter region 111 and the second filter region 112 .
  • the third filter region 213 blocks the light other than the light of the common color (e.g., the red light and the blue light if the common color is green).
  • the common color is green and the third filter region 213 is the G filter. This is provided merely for exemplary purpose and not limiting. If the common color is another color (e.g., red or blue), the third filter region 213 is a filter corresponding to the color.
  • FIGS. 7A and 7B are diagrams for explaining exemplary filter regions according to the second embodiment.
  • the filter 210 of the lens 10 includes the first filter region 111 (the area represented with vertical lines) that is the Y filter; the second filter region 112 (the area represented with horizontal lines) that is the C filter; and the third filter region 213 (the area represented with shading) that is the G filter.
  • the entire area of the filter 210 is constituted of a plurality of regions not overlapping with each other.
  • the first filter region 111 is preferably arranged symmetrically to the second filter region 112 with respect to the straight line that passes through the optical center of the lens 10 and is perpendicular to the line connecting the respective centers of gravity of the filter regions.
  • the shape of the first filter region 111 and the second filter region 112 is not limited to a circle or an ellipse as illustrated in FIG. 7A or 7B .
  • the position, shape, and size of the filter regions may be defined arbitrarily.
  • the shape of the first filter region 111 and the second filter region 112 can control the blurring state of the R image and the B image generated by the sensor 120 .
  • changing the shape of the filter regions can arbitrarily control the point-spread function (PSF), that is, the blur function of the sensor 120 , thereby controlling the blurring state of the R image and the B image generated by the sensor 120 (a minimal sketch of this relationship appears at the end of this list).
  • in FIGS. 7A and 7B , the blur functions of the R image and the B image are identical to each other. Controlling the blurring of the images to be symmetrical with each other can prevent a display image from being recognized as unnatural. Identical blurring of the R image and the B image facilitates matching of the R image and the B image, thereby improving the accuracy of depth estimation.
  • the displacement of the R image and the B image is the same as that in the first embodiment.
  • the processing executed by the sensor 120 is similar to that in the first embodiment.
  • the second sensor 122 receives green light that has passed through the filter regions and generates a captured image (the G image). That is, the second sensor 122 generates the G image serving as the basis of generating the image with distance information by receiving the green light that has passed through the three filter regions, namely, the first filter region 111 , the second filter region 112 , and the third filter region 213 .
  • This processing can generate the G image that is more similar to an ideal reference image.
  • FIG. 8 is a flowchart illustrating an exemplary process executed by the image capturing device 200 according to the second embodiment.
  • the first filter region 111 (the Y filter) transmits red light and green light (Step S 201 ).
  • the second filter region 112 (the C filter) transmits green light and blue light (Step S 202 ).
  • the third filter region 213 (the G filter) transmits green light (Step S 203 ).
  • the first sensor 121 receives red light and generates an R image that is one of captured images (Step S 204 ).
  • the second sensor 122 receives green light and generates a G image that is one of the captured images (Step S 205 ).
  • the third sensor 123 receives blue light and generates a B image that is one of the captured images (Step S 206 ).
  • the R image generated by the first sensor 121 , the G image generated by the second sensor 122 , and the B image generated by the third sensor 123 are output to the input unit 131 .
  • the input unit 131 in turn inputs the R image, the G image, and the B image to the depth calculator 132 and the image generator 133 .
  • the depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the R image, the G image, and the B image input by the input unit 131 .
  • the depth calculator 132 then calculates the distance to the object captured at the position of the pixels, thereby generating an image with distance information (Step S 207 ).
  • the image generator 133 generates a display image to be presented to a user based on the R image, the G image, and the B image input by the input unit 131 and the image with distance information generated by the depth calculator 132 (Step S 208 ).
  • the image of the subject is captured through the lens 10 including one kind of filter region that transmits light of a common color and is capable of controlling the blur function of the image to be acquired in addition to two kinds of filter regions that have different optical transparencies and transmit the common color light.
  • the distance information can be obtained with high accuracy and a display image to be presented to a user is prevented from being recognized as unnatural.
  • FIG. 9 is a block diagram illustrating an example of the functional configuration of an image capturing device according to a third embodiment.
  • common numerals are assigned to similar components to those in the first embodiment, and detailed explanation thereof may be omitted.
  • the function, configuration, and processing in the third embodiment are similar to those in the first embodiment except for a filter 310 (a fourth filter region 314 ) described below.
  • an image capturing device 300 includes the lens 10 including the filter 310 at its opening, the sensor 120 , and the image processor 130 .
  • the filter 310 includes the first filter region 111 , the second filter region 112 , and the fourth filter region 314 .
  • the sensor 120 includes the first sensor 121 , the second sensor 122 , and the third sensor 123 .
  • the image processor 130 includes the input unit 131 , the depth calculator 132 , and the image generator 133 . The whole or a part of the image processor 130 may be implemented with software (a computer program) or hardware circuitry.
  • the filter 310 is a color filter that transmits the light in a particular wavelength band.
  • the entire area of the filter 310 is constituted of a plurality of regions not overlapping with each other.
  • the regions are the first filter region 111 , the second filter region 112 , and the fourth filter region 314 . That is, the entire area of the filter 310 is constituted of the first filter region 111 , the second filter region 112 , and the fourth filter region 314 .
  • the entire area of the opening constituted of such filter regions transmits a common color representing a color that is common to the filter regions. It is noted that the filter regions have different optical transparencies.
  • the plane of the filter 310 is preferably arranged in parallel with the image pickup plane of the sensor 120 so that a larger amount of light passes through the filter.
  • the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example. That is, the common color is green in the third embodiment.
  • the fourth filter region 314 transmits light of all colors of the light received by the sensor 120 .
  • the fourth filter region 314 is a filter region called a “transparent filter” that transmits light of all colors of the light coming to the filter 310 .
  • the fourth filter region 314 that is a transparent filter transmits green light, which is a common color of the light passing through the first filter region 111 and the second filter region 112 .
  • the light that has passed through the fourth filter region 314 is received by all of the image sensors included in the sensor 120 . No filter may be provided at the position of the fourth filter region 314 .
  • the first sensor 121 receives red light out of the light that has passed through the first filter region 111 and the fourth filter region 314 and generates a captured image (the R image).
  • the second sensor 122 receives green light out of the light that has passed through the first filter region 111 , the second filter region 112 , and the fourth filter region 314 and generates a captured image (the G image).
  • the third sensor 123 receives blue light out of the light that has passed through the second filter region 112 and the fourth filter region 314 and generates a captured image (the B image). That is, the image sensors generating the R image, the G image, and the B image, respectively, receive the light that has passed through the fourth filter region 314 that is a transparent filter. This configuration allows a larger amount of light to be received. As a result, the image capturing device 300 can generate a brighter and less noisy display image.
  • FIGS. 10A, 10B, 10C, and 10D are diagrams for explaining exemplary filter regions according to the third embodiment.
  • the filter 310 of the lens 10 includes the first filter region 111 (the area represented with vertical lines) that is the Y filter; the second filter region 112 (the area represented with horizontal lines) that is the C filter; and the fourth filter region 314 (the area represented in white) that is the transparent filter.
  • the entire area of the filter 310 is constituted of a plurality of regions not overlapping with each other.
  • the position, shape, and size of the filter regions may be defined arbitrarily.
  • the fourth filter region 314 is preferably arranged symmetrically with respect to the perpendicular bisector of the straight line connecting the respective centers of gravity of the filter regions.
  • the fourth filter region 314 may be in the shape illustrated in FIG. 10C rather than the ellipse illustrated in FIGS. 10A and 10B .
  • the fourth filter region 314 may be divided into a plurality of regions as illustrated in FIG. 10D .
  • the displacement of the R image and the B image is the same as that in the first embodiment.
  • FIG. 11 is a flowchart illustrating an exemplary process executed by the image capturing device 300 according to the third embodiment.
  • the first filter region 111 (the Y filter) transmits red light and green light (Step S 301 ).
  • the second filter region 112 (the C filter) transmits green light and blue light (Step S 302 ).
  • the fourth filter region 314 (the transparent filter) transmits light of all colors (Step S 303 ).
  • the first sensor 121 receives red light and generates an R image that is one of captured images (Step S 304 ).
  • the second sensor 122 receives green light and generates a G image that is one of the captured images (Step S 305 ).
  • the third sensor 123 receives blue light and generates a B image that is one of the captured images (Step S 306 ).
  • the R image generated by the first sensor 121 , the G image generated by the second sensor 122 , and the B image generated by the third sensor 123 are output to the input unit 131 .
  • the input unit 131 in turn inputs the R image, the G image, and the B image to the depth calculator 132 and the image generator 133 .
  • the depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the R image, the G image, and the B image input by the input unit 131 .
  • the depth calculator 132 then calculates the distance to the object captured at the position of the pixels, thereby generating an image with distance information (Step S 307 ).
  • the image generator 133 generates a display image to be presented to a user based on the R image, the G image, and the B image input by the input unit 131 and the image with distance information generated by the depth calculator 132 (Step S 308 ).
  • the image of the subject is captured through the lens 10 including one kind of filter region that transmits light of all colors in addition to two kinds of filter regions that have different optical transparencies and transmit the common color light.
  • the first embodiment describes the lens 10 including the filter 110 constituted of two kinds of filter regions (the first filter region 111 and the second filter region 112 ) that transmit a common color light and have different optical transparencies.
  • the second embodiment describes the lens 10 including the filter 210 constituted of two kinds of filter regions (the first filter region 111 and the second filter region 112 ) that transmit a common color light and have different optical transparencies and one kind of filter region (the third filter region 213 ) that transmits the common color light.
  • the third embodiment describes the lens 10 including the filter 310 constituted of two kinds of filter regions (the first filter region 111 and the second filter region 112 ) that transmit a common color light and have different optical transparencies and one kind of filter region (the fourth filter region 314 ) that transmits light of all colors.
  • the number and types of the filter regions are not limited to the examples described above.
  • any number and types of filter regions may be disposed in addition to the first filter region 111 and the second filter region 112 .
  • any number and types of filter regions may be disposed as an additional filter region out of the R filter, the G filter, the B filter, the Y filter, the C filter, the M filter, and the transparent filter in addition to the first filter region 111 and the second filter region 112 .
  • the filter regions transmit the light of a common color.
  • a plurality of filter regions are disposed in the entire area of the opening of the lens 10 .
  • the description is provided merely for exemplary purpose and not limiting.
  • the filter regions need not be disposed across the entire area of the opening of the lens 10 .
  • the entire area of the opening of the lens 10 may be constituted of a filter region and a region without a filter. In this example, the entire area of the opening of the lens 10 still transmits a common color of the light.
  • the image processor 130 can be implemented by using a general-purpose computer as basic hardware, for example.
  • a computer program to be executed has a module configuration including the above-described functions.
  • the computer program to be executed may be recorded and provided in a computer-readable recording medium such as a compact disc read only memory (CD-ROM), a compact disc recordable (CD-R), and a digital versatile disc (DVD) as an installable or executable file.
  • the computer program to be executed may be embedded and provided in a read only memory (ROM), for example.
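  • The second- and third-embodiment items above note that the shape of the filter regions determines the blur function (PSF) of each color image. The following is a minimal sketch of that relationship under the simplifying geometric-optics assumption that the defocus PSF of each channel is simply the part of the aperture that transmits that channel, normalized; the region layout (left half Y, right half C) and the sizes are invented for illustration and are not taken from the patent.

```python
import numpy as np

def disc_mask(size: int) -> np.ndarray:
    """Circular aperture mask of the given pixel size (1 inside, 0 outside)."""
    y, x = np.mgrid[:size, :size]
    c = (size - 1) / 2.0
    return ((x - c) ** 2 + (y - c) ** 2 <= c ** 2).astype(float)

def channel_psfs(size: int = 31) -> dict:
    """Geometric-optics sketch: the defocus PSF of each channel is taken to be
    the aperture area that transmits that channel, normalized to sum to 1.
    Assumed layout: left half = Y filter (passes R and G), right half = C
    filter (passes G and B), so the G channel sees the whole aperture."""
    aperture = disc_mask(size)
    left = aperture.copy()
    left[:, size // 2:] = 0.0    # Y filter region -> shapes the R-channel PSF
    right = aperture.copy()
    right[:, : size // 2] = 0.0  # C filter region -> shapes the B-channel PSF
    psfs = {"R": left, "G": aperture, "B": right}
    return {name: psf / psf.sum() for name, psf in psfs.items()}

for name, psf in channel_psfs().items():
    print(name, psf.shape, round(float(psf.sum()), 3))
# Changing the shape of the filter regions (for example, the circular
# sub-regions of FIGS. 7A and 7B) changes these kernels, which is how the
# blur of the R image and the B image can be made symmetrical to each other.
```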

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)
  • Optical Filters (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

According to an embodiment, a lens device includes a filter, an entire area of which transmits light of a common color. The filter includes a first filter region and a second filter region. The first filter region transmits light that is a first combination of colors out of colors of light to be received by an image sensor. The second filter region transmits light that is a second combination of colors of the colors of the light to be received by the image sensor. The first and second combinations of colors each include the common color.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2014-241744, filed on Nov. 28, 2014, the entire contents of which are incorporated herein by reference.
  • FIELD
  • Embodiments described herein relate generally to a lens device and an image capturing device.
  • BACKGROUND
  • A conventional technology is known for acquiring distance information based on a captured image. With the technology, for example, in an image capturing device including a lens, a red (R) filter, a green (G) filter, and a blue (B) filter are disposed at the position of an aperture of the optical system of the lens, and light that has passed through the filters is received by an RGB sensor, thereby capturing images displaced depending on the depth to a subject. In this configuration, the depth of a scene can be calculated based on one of the captured images.
  • If the light that has passed through the filters is received by sensors in the respective identical colors corresponding to the filters, the light of a color different from the color of the filter is absorbed rather than passing through. This configuration may reduce the amount of light received by the sensor, which makes a captured image darker and noisier, leading to difficulty in calculating the depth to the subject uniquely. As a result, the distance information with high accuracy cannot be acquired based on a captured image with the conventional technology.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating the hardware configuration of an image capturing device according to a first embodiment;
  • FIG. 2 is a block diagram illustrating the functional configuration of the image capturing device according to the first embodiment;
  • FIG. 3 is a diagram for explaining filter regions according to the first embodiment;
  • FIG. 4 is a diagram for explaining changes of the captured images according to the first embodiment;
  • FIG. 5 is a flowchart illustrating a process executed by the image capturing device according to the first embodiment;
  • FIG. 6 is a block diagram illustrating the functional configuration of an image capturing device according to a second embodiment;
  • FIG. 7A is a diagram for explaining filter regions according to the second embodiment;
  • FIG. 7B is a diagram for explaining filter regions according to the second embodiment;
  • FIG. 8 is a flowchart illustrating a process executed by the image capturing device according to the second embodiment;
  • FIG. 9 is a block diagram illustrating the functional configuration of an image capturing device according to a third embodiment;
  • FIG. 10A is a diagram for explaining filter regions according to the third embodiment;
  • FIG. 10B is a diagram for explaining filter regions according to the third embodiment;
  • FIG. 10C is a diagram for explaining filter regions according to the third embodiment;
  • FIG. 10D is a diagram for explaining filter regions according to the third embodiment; and
  • FIG. 11 is a flowchart illustrating a process executed by the image capturing device according to the third embodiment.
  • DETAILED DESCRIPTION
  • According to an embodiment, a lens includes a filter, an entire area of which transmits light of a common color. The filter includes a first filter region and a second filter region. The first filter region transmits light that is a first combination of colors out of colors of light to be received by an image sensor. The second filter region transmits light that is a second combination of colors of the colors of the light to be received by the image sensor. The first and second combinations of colors each include the common color.
  • First Embodiment
  • FIG. 1 is a block diagram illustrating an example of the hardware configuration of an image capturing device according to a first embodiment. As illustrated in FIG. 1, an image capturing device 100 includes a lens 10, which may be referred to as a lens device, a sensor 20, a central processing unit (CPU) 30, a hard disk drive (HDD) 40, a memory card slot 50, and a display 60. These hardware components are coupled to each other through a bus. The image capturing device 100 captures images including any subject to be an image subject. The image capturing device 100 then generates a display image through image processing based on the information on the depth to the subject and displays the generated display image.
  • The lens 10 includes at its opening a filter (a color filter) on which the light reflected off the subject in image capturing is incident and through which the incident light passes. For example, while condensing rays of incident light, the lens 10 transmits light in a certain wavelength band according to the color of the filter. The light that has passed through the filter reaches the sensor 20. The sensor 20 is an image sensor and receives the light that has passed through the opening of the lens 10. Examples of the sensor 20 include a charge coupled device (CCD) image sensor and a complementary metal oxide semiconductor (CMOS) image sensor. According to one aspect of the present embodiment, the sensor 20 includes a sensor that receives red light (an R sensor), a sensor that receives green light (a G sensor), and a sensor that receives blue light (a B sensor). The sensors receive respective light components corresponding to particular wavelength bands, and generate captured images (an R image, a G image, and a B image).
  • The CPU 30 controls the overall operation of the image capturing device 100. Specifically, the CPU 30 executes a computer program stored in the HDD 40, for example, to control the overall operation of the image capturing device 100. For example, the CPU 30 calculates the depth to the subject based on the captured images generated by the sensor 20. The CPU 30 then generates an image with distance information in accordance with the calculated depth. After that, the CPU 30 generates a display image to be presented to a user based on the captured images and the image with distance information.
  • The HDD 40 is a nonvolatile and rewritable storage device. For example, the HDD 40 stores therein a computer program and various types of data relating to control of the image capturing device 100. The memory card slot 50 is an interface of a portable storage medium. Into the memory card slot 50, a portable storage medium such as a secure digital (SD) memory card or a secure digital high capacity (SDHC) memory card can be inserted, for example. The display 60 displays various types of information or images (e.g., the display image). The display 60 is a liquid crystal display or a touch panel, for example.
  • FIG. 2 is a block diagram illustrating an example of the functional configuration of the image capturing device 100 according to the first embodiment. As illustrated in FIG. 2, the image capturing device 100 includes the lens 10 including a filter 110 at its opening, a sensor 120, and an image processor 130. The filter 110 includes a first filter region 111 and a second filter region 112. The sensor 120 includes a first sensor 121, a second sensor 122, and a third sensor 123. The filter 110 is disposed on either surface of the lens 10: the surface closer to the sensor 120 or the surface farther from it. In other words, the lens 10 may be disposed between the sensor 120 and the filter 110. Alternatively, the filter 110 may be disposed between the sensor 120 and the lens 10. The image processor 130 includes an input unit 131, a depth calculator 132, and an image generator 133. The whole or a part of the image processor 130 may be implemented with software (a computer program) or hardware circuitry. The following description uses the sensor 120 including an RGB sensor as an example; this is not limiting, and the sensor may receive light in other wavelength bands.
  • The filter 110 is a color filter that transmits the light in a particular wavelength band. For example, the entire area of the filter 110 is constituted of a plurality of regions not overlapping with each other. In the first embodiment, the regions are the first filter region 111 and the second filter region 112. That is, the entire area of the filter 110 is constituted of the first filter region 111 and the second filter region 112. The entire area of the opening constituted of such filter regions transmits a common color representing a color that is common to the filter regions. It is noted that the filter regions have different optical transparencies. The plane of the filter 110 is preferably arranged in parallel with the image pickup plane of the sensor 120 so that a larger amount of light passes through the filter.
  • The first filter region 111 transmits a plurality of colors of a first combination out of the colors of the light received by the sensor 120. The first combination can be defined arbitrarily. For example, the first filter region 111 is a filter region called a “yellow (Y) filter” that transmits red light and green light. The second filter region 112 transmits a plurality of colors of a second combination different from the first combination out of the colors of the light received by the sensor 120. For example, the second filter region 112 is a filter region called a “cyan (C) filter” that transmits green light and blue light. That is, in this example, the common color is green. Some RGB sensors include therein a larger number of sensors that receive green light than the number of sensors that receive red light or the number of sensors that receive blue light, rather than including the respective sensors uniformly. Such an RGB sensor can receive a larger amount of light by defining the common color as green, in this example.
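  • As a rough illustration of the light budget implied by this filter layout, the following sketch (my own, not from the patent) computes how much light each RGB sensor collects through an aperture split into a Y region and a C region; the equal region areas and the ideal 0/1 transmittances are assumptions made only for this example.

```python
# Relative light reaching each sensor through a Y/C split aperture.
region_area = {"Y": 0.5, "C": 0.5}          # assumed fraction of the aperture per region
transmittance = {                            # Y passes red + green, C passes green + blue
    "Y": {"R": 1.0, "G": 1.0, "B": 0.0},
    "C": {"R": 0.0, "G": 1.0, "B": 1.0},
}

for channel in ("R", "G", "B"):
    total = sum(region_area[r] * transmittance[r][channel] for r in region_area)
    print(f"{channel} sensor receives {total:.0%} of the unfiltered light")
# R: 50%, G: 100%, B: 50% -- the common color (green) passes through the whole
# aperture, which is why the G image stays close to the reference image.
```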
  • The above-described combination of the first filter region 111 and the second filter region 112 is provided merely for exemplary purpose and not limiting. For another example, the first filter region 111 may be a filter region called a “Y filter” that transmits red light and green light, and the second filter region 112 may be a filter region called a “magenta (M) filter” that transmits red light and blue light. In this example, the common color is red. In addition, the first filter region 111 may be the M filter that transmits red light and blue light, and the second filter region 112 may be the C filter that transmits green light and blue light. In this example, the common color is blue. In the description of the present embodiment, the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example. Typically, complementary-color filters of Y, C, and M are known to transmit a larger amount of light in comparison with primary-color filters of R, G, and B having the same transmittance in the same wavelength band.
  • FIG. 3 is a diagram for explaining exemplary filter regions according to the first embodiment. As illustrated in FIG. 3, the filter 110 of the lens 10 includes the first filter region 111 (the area represented with vertical lines) that is the Y filter; and the second filter region 112 (the area represented with horizontal lines) that is the C filter. As described above, the entire area of the filter 110 is constituted of a plurality of regions not overlapping with each other. At the opening of the lens 10, the first filter region 111 is preferably arranged such that it matches the second filter region if it is rotated around the optical center of the lens 10. If the first filter region 111 and the second filter region 112 do not constitute the entire opening of the lens 10, they are preferably arranged at positions evenly away from the optical center of the lens 10, in addition to being arranged in the above-described manner.
  • Returning to FIG. 2, the sensor 120 is an image sensor and receives the light that has passed through the opening of the lens 10. The first sensor 121 receives red light and generates a captured image (an R image). The second sensor 122 receives green light and generates a captured image (a G image). The third sensor 123 receives blue light and generates a captured image (a B image). The generated captured images (the R image, the G image, and the B image) are output to the image processor 130. If the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example, the first sensor 121 receives red light out of the light that has passed through the first filter region 111; the second sensor 122 receives green light out of the light that has passed through the first filter region 111 and the second filter region 112; and the third sensor 123 receives blue light out of the light that has passed through the second filter region 112.
  • As described above, the second sensor 122 receives green light that has passed through both the first filter region 111 and the second filter region 112. Thus, the green light is less influenced by the absorption of light in the entire filter 110. This configuration can provide a brighter and less noisy G image in comparison with other captured images. This configuration allows the green light to pass through a plurality of filter regions, whereby the G image is less influenced by installation of the filter 110. The generated G image is therefore almost an ideal image obtained when no filter 110 is installed (referred to as a “reference image”). Out of the captured images, the R image and the B image are generated from the received light that has passed through one of the first filter region 111 and the second filter region 112. The R image and the B image are more likely to change than the reference image or the G image.
  • FIG. 4 is a diagram for explaining exemplary changes of the captured images according to the first embodiment. The arrangement of the filter regions in FIG. 4 is the same as the example illustrated in FIG. 3. The upper diagram in FIG. 4 illustrates a subject being at a position nearer than the position of a focal distance. The middle diagram in FIG. 4 illustrates a subject being at a position farther than the position of the focal distance. The lower diagram in FIG. 4 illustrates a subject being at the focal distance.
  • As illustrated in the upper diagram in FIG. 4, if the subject is positioned nearer than an object being focused, the R image (a cylinder represented with positive slopes) is displaced in the first direction (e.g., to the right) relative to the reference image or the G image (that is, a phase shift occurs). As illustrated in the middle diagram in FIG. 4, if the subject is positioned farther than the object being focused, the R image is displaced in the second direction (e.g., to the left) opposite to the first direction relative to the reference image or the G image. By contrast, as illustrated in the upper diagram in FIG. 4, if the subject is positioned nearer than the object being focused, the B image (a cylinder represented with negative slopes) is displaced in the second direction (e.g., to the left) relative to the reference image or the G image. As illustrated in the middle diagram in FIG. 4, if the subject is positioned farther than the object being focused, the B image is displaced in the first direction (e.g., to the right) relative to the reference image or the G image. That is, in scenes captured in RGB images that are captured images, an image can be observed at displaced positions to the right or left in accordance with the depth relative to the subject at the focal distance. This phase shift is inverted between an image nearer than the position of the focal distance and an image farther than the position of the focal distance. In the area where the phase has shifted, the image blurs. The following describes calculation of the depth of scenes by utilizing such characteristics of captured images.
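  • The opposite displacement of the R image and the B image can be mimicked with a toy simulation. The sketch below is not the patent's processing; it merely builds an RGB image in which a scene point appearing at column x in the G (reference) image appears at column x−d in the R image and at column x+d in the B image, following the correspondence used in the depth calculation below; which physical direction a positive d corresponds to depends on how the filter regions are oriented.

```python
import numpy as np

def simulate_color_shift(g_image: np.ndarray, d: int) -> np.ndarray:
    """Toy model of the depth-dependent color displacement: the R and B
    channels are the G channel shifted so that R at column x - d and B at
    column x + d both equal G at column x."""
    r = np.roll(g_image, -d, axis=1)   # R at column x - d equals G at column x
    b = np.roll(g_image, d, axis=1)    # B at column x + d equals G at column x
    return np.stack([r, g_image, b], axis=-1)

# Synthetic "subject": a bright vertical bar on a dark background.
g = np.zeros((8, 16))
g[:, 7:9] = 1.0
in_focus = simulate_color_shift(g, d=0)   # subject at the focal distance
farther = simulate_color_shift(g, d=2)    # subject farther than the focal distance
nearer = simulate_color_shift(g, d=-2)    # subject nearer than the focal distance
print(in_focus.shape, farther.shape, nearer.shape)  # (8, 16, 3) each
```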
  • The image processor 130 calculates the depth to the subject based on RGB images that are captured images. The image processor 130 then generates an image with distance information in accordance with the calculated depth. The image with distance information is a two-dimensional image associated with the depth information in the space. The image processor 130 generates a display image to be presented to a user based on the R image, the G image, and the B image, which are the captured images, and the generated image with distance information. The input unit 131 inputs the captured images. For example, the input unit 131 receives the captured images (the R image, the G image, and the B image) from the sensor 120 and inputs them to the depth calculator 132 and the image generator 133.
• The depth calculator 132 calculates the depth to the subject based on the captured images and then generates an image with distance information in accordance with the calculated depth. More specifically, the depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the captured images (the R image, the G image, and the B image) input by the input unit 131. The depth calculator 132 then calculates the distance to the object captured at each pixel position, thereby generating the image with distance information. The color displacement amounts are calculated with reference to the image of the common color. As described above, the pixels in the R image and the B image are shifted in the first direction or the second direction (e.g., to the right or to the left) in accordance with the depth to the captured subject, whereas the G image has little color displacement relative to the reference image. For this reason, the depth calculator 132 calculates the color displacement amounts of the respective images on the basis of the G image.
• For example, the pixel of the G image that is the target for calculating the distance is defined as YG(x, y), the pixel of the R image corresponding to YG(x, y) is defined as YR(x, y), and the pixel of the B image corresponding to YG(x, y) is defined as YB(x, y). The displacement amount in accordance with the depth is represented by d. The corresponding pixel in the R image can therefore be written as YR(x−d, y) and, in the same manner, the corresponding pixel in the B image can be written as YB(x+d, y). The value of the displacement amount d is 0 if the subject is at the focal distance, positive if the subject is farther than the focal distance, and negative if the subject is nearer than the focal distance. The magnitude |d| of the displacement amount increases monotonically with the difference between the distance to the subject and the focal distance. Thus, the distance to the object captured in the pixels can be uniquely calculated from the displacement amount d. The distance to the subject is determined by the focal distance, the displacement amount, the size of the opening of the lens, and the f-number of the lens, for example, and these pieces of information can be obtained by measurement in advance.
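• As an illustration of this pixel-correspondence model, the following sketch (not from the patent; the calibration table values are invented for the example, and linear interpolation over a pre-measured table is an assumption) maps a hypothesized displacement d to matching pixel coordinates and to a subject distance:

```python
import numpy as np

# Pixel-correspondence model from the text: for a G-image pixel at (x, y),
# the matching R pixel lies at (x - d, y) and the matching B pixel at (x + d, y).
def matching_coordinates(x, y, d):
    return {"R": (x - d, y), "G": (x, y), "B": (x + d, y)}

# The text states that |d| grows monotonically with the difference between the
# subject distance and the focal distance and that the mapping can be measured
# in advance. The calibration table below is invented for this example only.
CALIB_D = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])   # displacement in pixels
CALIB_Z = np.array([0.8, 1.0, 1.5, 2.5, 4.0])     # subject distance in metres (d = 0 at focus)

def distance_from_displacement(d):
    """Look up the subject distance for a displacement d by linear interpolation."""
    return float(np.interp(d, CALIB_D, CALIB_Z))

if __name__ == "__main__":
    print(matching_coordinates(120, 80, d=2))   # {'R': (118, 80), 'G': (120, 80), 'B': (122, 80)}
    print(distance_from_displacement(2.0))      # 2.5 with the example table
```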
• Conventional matching methods can be used for calculating the displacement amount d. Examples include a method based on the luminance difference between images and a method based on the color distribution, which exploits the characteristic that the color components of natural images have a locally linear relation. In the matching method based on the color distribution, the displacement amount d is calculated by utilizing the linear characteristic of the color distribution of the pixel values in the local area around the pixel (x, y). In this method, assuming a displacement amount d, an image is virtually defined from the local areas around YG(x, y), YR(x−d, y), and YB(x+d, y). Based on the dispersion of the color distribution of this defined image, an index L is calculated (refer to Equation (1)).

• L(x, y; d) = λ0·λ1·λ2 / (σR²·σG²·σB²)   (1)
• where λ0, λ1, and λ2 represent the dispersion along the principal component axes of the color distribution in the image (the eigenvalues of the covariance matrix of the image); and σR², σG², and σB² represent the dispersion along the R, G, and B axes of the color distribution. It is publicly known that the value of L increases with the amount of color displacement in the image. Therefore, the depth at the coordinates (x, y) can be obtained by finding the value of d that minimizes L(x, y; d).
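• A minimal sketch of this color-distribution matching is shown below (assumptions: the local window size, the border handling, and the small constant guarding against division by zero are choices made for the example, not values from the patent):

```python
import numpy as np

def color_alignment_cost(r_img, g_img, b_img, x, y, d, half_win=4):
    """Evaluate L(x, y; d) of Equation (1): the product of the eigenvalues of the
    covariance matrix of the local color distribution divided by the product of
    the per-channel variances. A smaller L means the hypothesized displacement d
    aligns the colors better. Assumes (x, y) and d keep the windows inside the image."""
    rows = slice(y - half_win, y + half_win + 1)
    r = r_img[rows, x - d - half_win: x - d + half_win + 1].ravel()
    g = g_img[rows, x - half_win: x + half_win + 1].ravel()
    b = b_img[rows, x + d - half_win: x + d + half_win + 1].ravel()
    colors = np.stack([r, g, b]).astype(np.float64)
    cov = np.cov(colors)                      # 3x3 covariance of the local color distribution
    eigvals = np.linalg.eigvalsh(cov)         # lambda_0, lambda_1, lambda_2
    variances = np.diag(cov)                  # sigma_R^2, sigma_G^2, sigma_B^2
    return float(np.prod(eigvals) / (np.prod(variances) + 1e-12))

def estimate_displacement(r_img, g_img, b_img, x, y, d_range=range(-8, 9)):
    """Return the displacement d that minimizes L(x, y; d), as described in the text."""
    return min(d_range, key=lambda d: color_alignment_cost(r_img, g_img, b_img, x, y, d))
```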
• The image generator 133 generates a display image based on the captured images and the image with distance information. The display image may be the image with distance information or the RGB images, for example. The display image may also be a color-displacement-corrected image in which the displacement of the R image and the B image among the RGB images is corrected by the per-pixel displacement amount d corresponding to the distance. The display image may also be an image obtained by removing the focal blurring from the color-displacement-corrected image in accordance with the distance, or an image obtained by adding focal blurring to the color-displacement-corrected image in accordance with the distance. If the lens 10 includes a focusing mechanism, the depth information can be output to the focusing mechanism of the lens 10 and used for improving the focusing speed.
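• The color-displacement correction mentioned above could look roughly like the following sketch (assumptions: nearest-pixel resampling, border clamping, and the array layout are simplifications for illustration, not the patent's implementation):

```python
import numpy as np

def correct_color_displacement(r_img, g_img, b_img, d_map):
    """Undo the depth-dependent color displacement: for each pixel (x, y), fetch the
    R sample from (x - d, y) and the B sample from (x + d, y), following the model
    above. Nearest-pixel resampling and border clamping keep the sketch short."""
    h, w = g_img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    d = np.rint(d_map).astype(int)
    xr = np.clip(xs - d, 0, w - 1)   # where each R sample actually landed
    xb = np.clip(xs + d, 0, w - 1)   # where each B sample actually landed
    return np.stack([r_img[ys, xr], g_img, b_img[ys, xb]], axis=-1)
```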
  • The following describes a process executed by the image capturing device 100 according to the first embodiment with reference to FIG. 5. FIG. 5 is a flowchart illustrating an exemplary process executed by the image capturing device 100 according to the first embodiment.
  • As illustrated in FIG. 5, the first filter region 111 (the Y filter) transmits red light and green light (Step S101). The second filter region 112 (the C filter) transmits green light and blue light (Step S102). The first sensor 121 receives red light and generates an R image that is one of captured images (Step S103). The second sensor 122 receives green light and generates a G image that is one of the captured images (Step S104).
  • The third sensor 123 receives blue light and generates a B image that is one of the captured images (Step S105). The R image generated by the first sensor 121, the G image generated by the second sensor 122, and the B image generated by the third sensor 123 are output to the input unit 131. The input unit 131 in turn inputs the R image, the G image, and the B image to the depth calculator 132 and the image generator 133.
  • The depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the R image, the G image, and the B image input by the input unit 131. The depth calculator 132 then calculates the distance to an object captured at the position of the pixels, thereby generating an image with distance information (Step S106). The image generator 133 generates a display image to be presented to a user based on the R image, the G image, and the B image input by the input unit 131 and the image with distance information generated by the depth calculator 132 (Step S107).
  • According to the present embodiment, the image of the subject is captured through the lens 10 in which two kinds of filter regions having different optical transparencies and transmitting the common color light are disposed. In this configuration, a brighter and less noisy image can be acquired and an image with distance information is generated on the basis of the image corresponding to the common color. As a result, the distance information can be acquired with high accuracy.
  • Second Embodiment
  • FIG. 6 is a block diagram illustrating an example of the functional configuration of an image capturing device according to a second embodiment. In the description of the second embodiment, common numerals are assigned to similar components to those in the first embodiment, and detailed explanation thereof may be omitted. The function, the configuration, and the processing in the second embodiment are similar to those in the first embodiment except for a filter 210 (a third filter region 213) described below.
• As illustrated in FIG. 6, an image capturing device 200 includes the lens 10 including the filter 210 at its opening, the sensor 120, and the image processor 130. The filter 210 includes the first filter region 111, the second filter region 112, and the third filter region 213. The sensor 120 includes the first sensor 121, the second sensor 122, and the third sensor 123. The image processor 130 includes the input unit 131, the depth calculator 132, and the image generator 133. The whole or a part of the image processor 130 may be implemented with software (a computer program) or hardware circuitry.
  • The filter 210 is a color filter that transmits the light in a particular wavelength band. For example, the entire area of the filter 210 is constituted of a plurality of regions not overlapping with each other. In the second embodiment, the regions are the first filter region 111, the second filter region 112, and the third filter region 213. That is, the entire area of the filter 210 is constituted of the first filter region 111, the second filter region 112, and the third filter region 213. The entire area of the opening constituted of such filter regions transmits a common color representing a color that is common to the filter regions. It is noted that the filter regions have different optical transparencies. The plane of the filter 210 is preferably arranged in parallel with the image pickup plane of the sensor 120 so that a larger amount of light passes through the filter. In the description of the second embodiment, in the same manner as the first embodiment, the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example. That is, the common color is green in the second embodiment.
• The third filter region 213 transmits the common color out of the colors of the light received by the sensor 120. For example, the third filter region 213 is a G filter that transmits green light, which is the color common to the light passing through the first filter region 111 and the second filter region 112. In other words, the third filter region 213 blocks the light other than the light of the common color (e.g., the red light and the blue light if the common color is green). In the above description, the common color is green and the third filter region 213 is the G filter; this is merely an example and is not limiting. If the common color is another color (e.g., red or blue), the third filter region 213 is a filter corresponding to that color.
• FIGS. 7A and 7B are diagrams for explaining exemplary filter regions according to the second embodiment. As illustrated in FIGS. 7A and 7B, the filter 210 of the lens 10 includes the first filter region 111 (the area represented with vertical lines) that is the Y filter; the second filter region 112 (the area represented with horizontal lines) that is the C filter; and the third filter region 213 (the area represented with shading) that is the G filter. As described above, the entire area of the filter 210 is constituted of a plurality of regions not overlapping with each other. The first filter region 111 is preferably arranged symmetrically with the second filter region 112 with respect to a straight line that passes through the optical center of the lens 10 and is perpendicular to the line connecting the respective centers of gravity of the filter regions. The shape of the first filter region 111 and the second filter region 112 is not limited to the circle or the ellipse illustrated in FIG. 7A or 7B.
• The position, shape, and size of the filter regions may be defined arbitrarily. The shape of the first filter region 111 and the second filter region 112 controls the blurring state of the R image and the B image generated by the sensor 120. Specifically, changing the shape of the filter regions can arbitrarily control the point spread function (PSF), that is, the blur function, of the sensor 120, and thereby the blurring state of the R image and the B image. In FIGS. 7A and 7B, the blur functions of the R image and the B image are identical to each other. Controlling the blurring of the two images to be symmetrical with each other prevents the display image from being perceived as unnatural, and identical blurring of the R image and the B image facilitates matching between them, thereby improving the accuracy of the depth estimation. The displacement of the R image and the B image is the same as that in the first embodiment.
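• The point that the filter-region layout can make the R and B blur functions identical can be illustrated with a toy aperture model. In the sketch below, the one-dimensional layout and the region masks are assumptions loosely following FIG. 7A; the per-channel aperture is used only as a rough stand-in for the per-channel blur function:

```python
import numpy as np

def channel_apertures(region_masks):
    """Combine per-region aperture masks into per-channel apertures. `region_masks`
    maps a region name to (binary mask, set of transmitted colors); each channel's
    aperture is the union of the regions that transmit that color."""
    shape = next(iter(region_masks.values()))[0].shape
    channels = {c: np.zeros(shape, dtype=bool) for c in "RGB"}
    for mask, colors in region_masks.values():
        for c in colors:
            channels[c] |= mask
    return channels

if __name__ == "__main__":
    # Toy one-dimensional slice of the aperture: Y region on the left, G region in
    # the middle, C region on the right (layout is an assumption for the example).
    y_mask = np.array([1, 1, 0, 0, 0], dtype=bool)
    g_mask = np.array([0, 0, 1, 0, 0], dtype=bool)
    c_mask = np.array([0, 0, 0, 1, 1], dtype=bool)
    ch = channel_apertures({"Y": (y_mask, {"R", "G"}),
                            "G": (g_mask, {"G"}),
                            "C": (c_mask, {"G", "B"})})
    # The R and B apertures mirror each other, so their blur is symmetric.
    print(np.array_equal(ch["R"], ch["B"][::-1]))   # True for this layout
```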
  • The processing executed by the sensor 120 is similar to that in the first embodiment. The second sensor 122 receives green light that has passed through the filter regions and generates a captured image (the G image). That is, the second sensor 122 generates the G image serving as the basis of generating the image with distance information by receiving the green light that has passed through the three filter regions, namely, the first filter region 111, the second filter region 112, and the third filter region 213. This processing can generate the G image that is more similar to an ideal reference image.
  • The following describes a process executed by the image capturing device 200 according to the second embodiment with reference to FIG. 8. FIG. 8 is a flowchart illustrating an exemplary process executed by the image capturing device 200 according to the second embodiment.
  • As illustrated in FIG. 8, the first filter region 111 (the Y filter) transmits red light and green light (Step S201). The second filter region 112 (the C filter) transmits green light and blue light (Step S202). The third filter region 213 (the G filter) transmits green light (Step S203). The first sensor 121 receives red light and generates an R image that is one of captured images (Step S204). The second sensor 122 receives green light and generates a G image that is one of the captured images (Step S205).
  • The third sensor 123 receives blue light and generates a B image that is one of the captured images (Step S206). The R image generated by the first sensor 121, the G image generated by the second sensor 122, and the B image generated by the third sensor 123 are output to the input unit 131. The input unit 131 in turn inputs the R image, the G image, and the B image to the depth calculator 132 and the image generator 133.
• The depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the R image, the G image, and the B image input by the input unit 131. The depth calculator 132 then calculates the distance to the object captured at the position of the pixels, thereby generating an image with distance information (Step S207). The image generator 133 generates a display image to be presented to a user based on the R image, the G image, and the B image input by the input unit 131 and the image with distance information generated by the depth calculator 132 (Step S208).
  • According to the present embodiment, the image of the subject is captured through the lens 10 including one kind of filter region that transmits light of a common color and is capable of controlling the blur function of the image to be acquired in addition to two kinds of filter regions that have different optical transparencies and transmit the common color light. In this configuration, the distance information can be obtained with high accuracy and a display image to be presented to a user is prevented from being recognized as unnatural.
  • Third Embodiment
  • FIG. 9 is a block diagram illustrating an example of the functional configuration of an image capturing device according to a third embodiment. In the description of the third embodiment, common numerals are assigned to similar components to those in the first embodiment, and detailed explanation thereof may be omitted. The function, configuration, and processing in the third embodiment are similar to those in the first embodiment except for a filter 310 (a fourth filter region 314) described below.
• As illustrated in FIG. 9, an image capturing device 300 includes the lens 10 including the filter 310 at its opening, the sensor 120, and the image processor 130. The filter 310 includes the first filter region 111, the second filter region 112, and the fourth filter region 314. The sensor 120 includes the first sensor 121, the second sensor 122, and the third sensor 123. The image processor 130 includes the input unit 131, the depth calculator 132, and the image generator 133. The whole or a part of the image processor 130 may be implemented with software (a computer program) or hardware circuitry.
  • The filter 310 is a color filter that transmits the light in a particular wavelength band. For example, the entire area of the filter 310 is constituted of a plurality of regions not overlapping with each other. In the third embodiment, the regions are the first filter region 111, the second filter region 112, and the fourth filter region 314. That is, the entire area of the filter 310 is constituted of the first filter region 111, the second filter region 112, and the fourth filter region 314. The entire area of the opening constituted of such filter regions transmits a common color representing a color that is common to the filter regions. It is noted that the filter regions have different optical transparencies. The plane of the filter 310 is preferably arranged in parallel with the image pickup plane of the sensor 120 so that a larger amount of light passes through the filter. In the description of the third embodiment, in the same manner as the first embodiment, the first filter region 111 is the Y filter and the second filter region 112 is the C filter, for example. That is, the common color is green in the third embodiment.
  • The fourth filter region 314 transmits light of all colors of the light received by the sensor 120. For example, the fourth filter region 314 is a filter region called a “transparent filter” that transmits light of all colors of the light coming to the filter 310. The fourth filter region 314 that is a transparent filter transmits green light, which is a common color of the light passing through the first filter region 111 and the second filter region 112. The light that has passed through the fourth filter region 314 is received by all of the image sensors included in the sensor 120. No filter may be provided at the position of the fourth filter region 314.
  • Specifically, the first sensor 121 receives red light out of the light that has passed through the first filter region 111 and the fourth filter region 314 and generates a captured image (the R image). The second sensor 122 receives green light out of the light that has passed through the first filter region 111, the second filter region 112, and the fourth filter region 314 and generates a captured image (the G image). The third sensor 123 receives blue light out of the light that has passed through the second filter region 112 and the fourth filter region 314 and generates a captured image (the B image). That is, the image sensors generating the R image, the G image, and the B image, respectively, receive the light that has passed through the fourth filter region 314 that is a transparent filter. This configuration allows a larger amount of light to be received. As a result, the image capturing device 300 can generate a brighter and less noisy display image.
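• The gain in received light can be illustrated with a back-of-the-envelope light budget. In the sketch below, the area fractions of the filter regions are assumptions chosen only for the example; the point is that every channel's effective aperture grows when the transparent region is added:

```python
# Back-of-the-envelope light budget for the third embodiment. The area fractions
# below are assumptions for illustration, not values from the patent.
areas = {"Y": 0.35, "C": 0.35, "T": 0.30}   # yellow, cyan, and transparent regions

# Filter regions contributing to each channel (the transparent region feeds all three).
contributes = {"R": ["Y", "T"], "G": ["Y", "C", "T"], "B": ["C", "T"]}

for ch, regions in contributes.items():
    with_t = sum(areas[r] for r in regions)
    without_t = sum(areas[r] for r in regions if r != "T")
    print(f"{ch}: {with_t:.2f} of the aperture (vs {without_t:.2f} without the transparent region)")
```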
• FIGS. 10A, 10B, 10C, and 10D are diagrams for explaining exemplary filter regions according to the third embodiment. As illustrated in FIGS. 10A to 10D, the filter 310 of the lens 10 includes the first filter region 111 (the area represented with vertical lines) that is the Y filter; the second filter region 112 (the area represented with horizontal lines) that is the C filter; and the fourth filter region 314 (the area represented in white) that is the transparent filter. As described above, the entire area of the filter 310 is constituted of a plurality of regions not overlapping with each other. The position, shape, and size of the filter regions may be defined arbitrarily. The fourth filter region 314 is preferably arranged symmetrically with respect to the straight line that perpendicularly bisects the line connecting the respective centers of gravity of the filter regions. The fourth filter region 314 may have the shape illustrated in FIG. 10C rather than the ellipse illustrated in FIGS. 10A and 10B, and may be divided into a plurality of regions as illustrated in FIG. 10D. The displacement of the R image and the B image is the same as that in the first embodiment.
  • The following describes a process executed by the image capturing device 300 according to the third embodiment with reference to FIG. 11. FIG. 11 is a flowchart illustrating an exemplary process executed by the image capturing device 300 according to the third embodiment.
  • As illustrated in FIG. 11, the first filter region 111 (the Y filter) transmits red light and green light (Step S301). The second filter region 112 (the C filter) transmits green light and blue light (Step S302). The fourth filter region 314 (the transparent filter) transmits light of all colors (Step S303). The first sensor 121 receives red light and generates an R image that is one of captured images (Step S304). The second sensor 122 receives green light and generates a G image that is one of the captured images (Step S305).
  • The third sensor 123 receives blue light and generates a B image that is one of the captured images (Step S306). The R image generated by the first sensor 121, the G image generated by the second sensor 122, and the B image generated by the third sensor 123 are output to the input unit 131. The input unit 131 in turn inputs the R image, the G image, and the B image to the depth calculator 132 and the image generator 133.
• The depth calculator 132 calculates the color displacement amounts of the respective images relative to the G image out of the R image, the G image, and the B image input by the input unit 131. The depth calculator 132 then calculates the distance to the object captured at the position of the pixels, thereby generating an image with distance information (Step S307). The image generator 133 generates a display image to be presented to a user based on the R image, the G image, and the B image input by the input unit 131 and the image with distance information generated by the depth calculator 132 (Step S308).
  • According to the present embodiment, the image of the subject is captured through the lens 10 including one kind of filter region that transmits light of all colors in addition to two kinds of filter regions that have different optical transparencies and transmit the common color light. In this configuration, a brighter and less noisy image can be acquired and the distance information can be obtained with high accuracy.
  • Fourth Embodiment
  • While certain embodiments of the lens and the image capturing device have been described, the technology disclosed herein may be embodied in other different forms. The following describes other different embodiments in terms of (1) the number and types of filter regions, (2) configuration, and (3) computer program.
  • (1) The Number and Types of Filter Regions
• The first embodiment describes the lens 10 including the filter 110 constituted of two kinds of filter regions (the first filter region 111 and the second filter region 112) that transmit light of a common color and have different optical transparencies. The second embodiment describes the lens 10 including the filter 210 constituted of the two kinds of filter regions (the first filter region 111 and the second filter region 112) that transmit light of a common color and have different optical transparencies and one kind of filter region (the third filter region 213) that transmits the light of the common color. The third embodiment describes the lens 10 including the filter 310 constituted of the two kinds of filter regions (the first filter region 111 and the second filter region 112) that transmit light of a common color and have different optical transparencies and one kind of filter region (the fourth filter region 314) that transmits light of all colors. In the present embodiment, the number and types of the filter regions are not limited to the examples described above.
  • Specifically, any number and types of filter regions may be disposed in addition to the first filter region 111 and the second filter region 112. For example, any number and types of filter regions may be disposed as an additional filter region out of the R filter, the G filter, the B filter, the Y filter, the C filter, the M filter, and the transparent filter in addition to the first filter region 111 and the second filter region 112. In the same manner as the above-described embodiments, the filter regions transmit the light of a common color.
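• The requirement that all filter regions transmit light of a common color can be checked mechanically. The sketch below (the ideal-filter transmission sets are assumptions) verifies the combinations used in the embodiments above and the filter pairs listed in claim 18:

```python
# Colors transmitted by each candidate filter type (ideal-filter assumption).
FILTERS = {
    "R": {"R"}, "G": {"G"}, "B": {"B"},
    "Y": {"R", "G"}, "C": {"G", "B"}, "M": {"R", "B"},
    "transparent": {"R", "G", "B"},
}

def common_colors(names):
    """Return the colors transmitted by every one of the named filter regions."""
    return set.intersection(*(FILTERS[n] for n in names))

if __name__ == "__main__":
    print(common_colors(["Y", "C"]))                 # {'G'}: first embodiment
    print(common_colors(["Y", "C", "G"]))            # {'G'}: second embodiment
    print(common_colors(["Y", "C", "transparent"]))  # {'G'}: third embodiment
    print(common_colors(["C", "M"]))                 # {'B'}: another pair from claim 18
    print(common_colors(["M", "Y"]))                 # {'R'}: another pair from claim 18
```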
• In the above-described embodiments, a plurality of filter regions are disposed across the entire area of the opening of the lens 10. This is merely an example and is not limiting; the filter regions need not cover the entire area of the opening of the lens 10. Specifically, the entire area of the opening of the lens 10 may be constituted of a filter region and a region without a filter. Even in this case, the entire area of the opening of the lens 10 transmits the light of a common color.
  • (2) Configuration
• In addition, processing or controlling procedures, specific names, and information including various types of data and parameters may be modified in any manner unless specified otherwise. Each of the components of the apparatuses illustrated herein is merely a depiction of concepts or functionality, and is not necessarily configured physically in the manner illustrated in the drawings. In other words, specific configurations in which the apparatuses are distributed or integrated are not limited to those illustrated in the drawings, and the whole or a part of the apparatuses may be distributed or integrated functionally or physically in any units depending on various loads or utilization.
  • (3) Computer Program
  • The image processor 130 according to the embodiments can be implemented by using a general-purpose computer as basic hardware, for example. A computer program to be executed has a module configuration including the above-described functions. The computer program to be executed may be recorded and provided in a computer-readable recording medium such as a compact disc read only memory (CD-ROM), a compact disc recordable (CD-R), and a digital versatile disc (DVD) as an installable or executable file. In addition, the computer program to be executed may be embedded and provided in a read only memory (ROM), for example.
  • While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.

Claims (15)

1-12. (canceled)
13. An image capturing device comprising:
a lens that transmits light;
a filter that transmits light;
an image sensor for receiving light transmitted through the lens and the filter; and
one or more processors, wherein
the filter includes a first filter region and a second filter region, the first filter region being a region for transmitting light in a first wavelength band, the second filter region being a region for transmitting light in a second wavelength band, and a common wavelength band being included in the first wavelength band and the second wavelength band,
the image sensor generates a reference image by receiving light in the common wavelength band and generates a first image by receiving at least a part of light in the first wavelength band, and
the one or more processors calculate a distance to an object by calculating a color deviation of the first image with respect to the reference image.
14. The image capturing device according to claim 13, wherein
a number of sensors for receiving light in the common wavelength band is larger than a number of sensors for receiving light in a wavelength band differing from the common wavelength band in the first wavelength band.
15. The image capturing device according to claim 13, wherein
the filter further includes a third filter region for transmitting light in a third wavelength band, and
the third wavelength band includes a part of the first waveband and a part of the second waveband.
16. The image capturing device according to claim 13, wherein the third filter is one of a red filter, a green filter, and a blue filter.
17. The image capturing device according to claim 13, wherein
the first wavelength band includes two out of red, green, and blue, the second wavelength band includes two out of red, green, and blue, and
one of the two included in the second wavelength band is included in the first wavelength band, and the other of the two included in the second wavelength band is not included in the first wavelength band.
18. The image capturing device according to claim 13, wherein
a combination of the first filter region and the second filter region is any one of:
a combination of a yellow filter and a cyan filter,
a combination of a cyan filter and a magenta filter, and
a combination of a magenta filter and a yellow filter.
19. The image capturing device according to claim 13, wherein the lens is arranged between the image sensor and the filter.
20. The image capturing device according to claim 13, wherein the filter is arranged between the image sensor and the lens.
21. The image capturing device according to claim 13, wherein
a shape of the first filter region is point-symmetric to a shape of the second filter region with respect to an optical center of the lens.
22. The image capturing device according to claim 13, wherein
a distance between the first filter region and an optical center of the lens is equal to a distance between the second filter region and the optical center of the lens.
23. The image capturing device according to claim 13, wherein
the filter further includes a fourth filter region for transmitting light corresponding to light received by the image sensor, and
the fourth filter region is symmetrically arranged with respect to a line perpendicularly bisecting a line which connects a center of gravity of the first filter region and a center of gravity of the second filter region.
24. The image capturing device according to claim 13, wherein
the filter further includes fourth filter regions for transmitting light corresponding to light received by the image sensor, and
the fourth filter regions are symmetrically arranged with respect to a line perpendicularly bisecting a line which connects a center of gravity of the first filter region and a center of gravity of the second filter region.
25. An image capturing device comprising:
a lens that transmits light;
a filter that transmits light;
an image sensor for receiving light transmitted through the lens and the filter; and
one or more processors, wherein
the filter includes a first filter region and a second filter region, the first filter region being a region for transmitting light in a first combination of colors, the second filter region being a region for transmitting light in a second combination of colors, and the first filter region and the second filter region transmitting light in a common color,
the image sensor generates a reference image by receiving light in the common color and generates a first image by receiving light in other colors than the common color among the first combination of colors, and
the one or more processors calculate a distance to an object by calculating a color deviation of the first image with respect to the reference image.
26. An image processing device comprising:
one or more processors, wherein
the one or more processors obtain a reference image and a first image from an image sensor,
the image sensor receives light that has passed through a filter which includes a first filter region and a second filter region,
the reference image is generated from light in a common wavelength band that has passed through both the first filter region and the second filter region,
the first image is generated from at least a part of light in a wavelength band that has passed through the first filter region, and
the one or more processors calculate a distance to an object by calculating a color deviation of the first image with respect to the reference image.
US16/179,904 2014-11-28 2018-11-03 Lens device and image capturing device Abandoned US20190072697A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/179,904 US20190072697A1 (en) 2014-11-28 2018-11-03 Lens device and image capturing device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014241744A JP2016102733A (en) 2014-11-28 2014-11-28 Lens and image capturing device
JP2014-241744 2014-11-28
US14/950,591 US10145994B2 (en) 2014-11-28 2015-11-24 Lens device and image capturing device for acquiring distance information at high accuracy
US16/179,904 US20190072697A1 (en) 2014-11-28 2018-11-03 Lens device and image capturing device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/950,591 Continuation US10145994B2 (en) 2014-11-28 2015-11-24 Lens device and image capturing device for acquiring distance information at high accuracy

Publications (1)

Publication Number Publication Date
US20190072697A1 true US20190072697A1 (en) 2019-03-07

Family

ID=56079077

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/950,591 Expired - Fee Related US10145994B2 (en) 2014-11-28 2015-11-24 Lens device and image capturing device for acquiring distance information at high accuracy
US16/179,904 Abandoned US20190072697A1 (en) 2014-11-28 2018-11-03 Lens device and image capturing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/950,591 Expired - Fee Related US10145994B2 (en) 2014-11-28 2015-11-24 Lens device and image capturing device for acquiring distance information at high accuracy

Country Status (2)

Country Link
US (2) US10145994B2 (en)
JP (1) JP2016102733A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11796722B2 (en) 2019-10-30 2023-10-24 Fujifilm Corporation Optical element, optical device, and imaging apparatus for acquiring multispectral images

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10382684B2 (en) 2015-08-20 2019-08-13 Kabushiki Kaisha Toshiba Image processing apparatus and image capturing apparatus
JP6608763B2 (en) 2015-08-20 2019-11-20 株式会社東芝 Image processing apparatus and photographing apparatus
JP6585006B2 (en) 2016-06-07 2019-10-02 株式会社東芝 Imaging device and vehicle
JP6699898B2 (en) * 2016-11-11 2020-05-27 株式会社東芝 Processing device, imaging device, and automatic control system
CN108076266A (en) * 2016-11-11 2018-05-25 株式会社东芝 Processing unit and photographic device
JP6699897B2 (en) 2016-11-11 2020-05-27 株式会社東芝 Imaging device, automatic control system and system
JP2018084571A (en) * 2016-11-11 2018-05-31 株式会社東芝 Processing device, imaging device, and automatic control system
JP6778636B2 (en) * 2017-02-27 2020-11-04 アイホン株式会社 Distance measuring device
US20180270413A1 (en) * 2017-03-15 2018-09-20 Kabushiki Kaisha Toshiba Processing apparatus and processing system
JP7030431B2 (en) * 2017-06-26 2022-03-07 株式会社東芝 Inspection support system and inspection support control program
JP2019011971A (en) 2017-06-29 2019-01-24 株式会社東芝 Estimation system and automobile
JP2019015575A (en) 2017-07-05 2019-01-31 株式会社東芝 Image processor, distance measuring device, and processing system
JP6878219B2 (en) 2017-09-08 2021-05-26 株式会社東芝 Image processing device and ranging device
WO2019215211A1 (en) 2018-05-09 2019-11-14 Trinamix Gmbh Detector for optically detecting at least one object
JP6971934B2 (en) * 2018-08-10 2021-11-24 株式会社東芝 Image processing device
JP6989466B2 (en) 2018-09-13 2022-01-05 株式会社東芝 Optical filter, image pickup device and ranging device
JP7263493B2 (en) * 2018-09-18 2023-04-24 株式会社東芝 Electronic devices and notification methods
JP7021036B2 (en) 2018-09-18 2022-02-16 株式会社東芝 Electronic devices and notification methods
JP7204586B2 (en) 2019-06-17 2023-01-16 株式会社東芝 LEARNING METHOD, PROGRAM AND IMAGE PROCESSING DEVICE
JP7446985B2 (en) 2020-12-15 2024-03-11 株式会社東芝 Learning method, program and image processing device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130250109A1 (en) * 2012-03-21 2013-09-26 Ricoh Company, Ltd. Multi-lens camera system, vehicle mounting the multi-lens camera system, and range-finding method executed by the multi-lens camera system

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4437112A (en) * 1980-02-15 1984-03-13 Canon Kabushiki Kaisha Solid-state color imaging apparatus
DE3312962A1 (en) * 1982-04-12 1983-10-13 Canon K.K., Tokyo Image pick-up device
DE3855833T2 (en) * 1987-04-30 1997-07-31 Toshiba Kawasaki Kk Color image sensor and its manufacturing process
JP3212335B2 (en) 1991-12-25 2001-09-25 セコム株式会社 Passive type distance measuring device
KR0168451B1 (en) * 1994-03-31 1999-01-15 다까노 야스아끼 Color solid image sensing device
US6657663B2 (en) * 1998-05-06 2003-12-02 Intel Corporation Pre-subtracting architecture for enabling multiple spectrum image sensing
US6807295B1 (en) 1999-06-29 2004-10-19 Fuji Photo Film Co., Ltd. Stereoscopic imaging apparatus and method
JP3863319B2 (en) 1999-06-29 2006-12-27 富士フイルムホールディングス株式会社 Parallax image capturing apparatus and camera
JP2008288629A (en) * 2007-05-15 2008-11-27 Sony Corp Image signal processing apparatus, imaging device, image signal processing method, and computer program
US7675024B2 (en) * 2008-04-23 2010-03-09 Aptina Imaging Corporation Method and apparatus providing color filter array with non-uniform color filter sizes
JP2009276294A (en) 2008-05-16 2009-11-26 Toshiba Corp Image processing method
US8384818B2 (en) * 2008-06-18 2013-02-26 Panasonic Corporation Solid-state imaging device including arrays of optical elements and photosensitive cells
GB2463480A (en) * 2008-09-12 2010-03-17 Sharp Kk Camera Having Large Depth of Field
JP5237998B2 (en) * 2010-07-12 2013-07-17 パナソニック株式会社 Solid-state imaging device, imaging device, and signal processing method
US8902293B2 (en) 2011-01-17 2014-12-02 Panasonic Corporation Imaging device
US9628776B2 (en) * 2011-04-07 2017-04-18 Panasonic Intellectual Property Management Co., Ltd. Three-dimensional imaging device, image processing device, image processing method, and image processing program
JP2013003159A (en) 2011-06-10 2013-01-07 Olympus Corp Imaging apparatus
JP2013017138A (en) * 2011-07-06 2013-01-24 Olympus Corp Imaging device and image processing device
JP2013037294A (en) 2011-08-10 2013-02-21 Olympus Corp Image pickup apparatus
CN103052914B (en) * 2011-08-11 2016-09-28 松下知识产权经营株式会社 Three-dimensional image pickup device
JP2013044806A (en) 2011-08-22 2013-03-04 Olympus Corp Imaging apparatus
JP2013057761A (en) 2011-09-07 2013-03-28 Olympus Corp Distance measuring device, imaging device, and distance measuring method
JP2013097154A (en) * 2011-10-31 2013-05-20 Olympus Corp Distance measurement device, imaging apparatus, and distance measurement method
JP2013246052A (en) 2012-05-25 2013-12-09 Olympus Corp Distance measuring apparatus
JP2014026050A (en) 2012-07-25 2014-02-06 Olympus Corp Image capturing device and image processing device
JP2014026051A (en) 2012-07-25 2014-02-06 Olympus Corp Image capturing device and image processing device
JP2014038151A (en) 2012-08-13 2014-02-27 Olympus Corp Imaging apparatus and phase difference detection method
WO2016003253A1 (en) 2014-07-04 2016-01-07 Samsung Electronics Co., Ltd. Method and apparatus for image capturing and simultaneous depth extraction

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130250109A1 (en) * 2012-03-21 2013-09-26 Ricoh Company, Ltd. Multi-lens camera system, vehicle mounting the multi-lens camera system, and range-finding method executed by the multi-lens camera system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11796722B2 (en) 2019-10-30 2023-10-24 Fujifilm Corporation Optical element, optical device, and imaging apparatus for acquiring multispectral images

Also Published As

Publication number Publication date
US10145994B2 (en) 2018-12-04
JP2016102733A (en) 2016-06-02
US20160154152A1 (en) 2016-06-02

Similar Documents

Publication Publication Date Title
US20190072697A1 (en) Lens device and image capturing device
US10997696B2 (en) Image processing method, apparatus and device
US9154697B2 (en) Camera selection based on occlusion of field of view
US20190327417A1 (en) Image processing apparatus and image capturing apparatus
JP6173156B2 (en) Image processing apparatus, imaging apparatus, and image processing method
WO2016067541A1 (en) Data processing apparatus, imaging apparatus and data processing method
US20150358542A1 (en) Image processing apparatus, image capturing apparatus, image processing method, image capturing method, and non-transitory computer-readable medium for focus bracketing
CN102685511B (en) Image processing apparatus and image processing method
US10395348B2 (en) Image pickup apparatus, image processing apparatus, and control method of image pickup apparatus
JP5246078B2 (en) Object location program and camera
CN102227746A (en) Stereoscopic image processing device, method, recording medium and stereoscopic imaging apparatus
JP5406151B2 (en) 3D imaging device
CN106296625A (en) Image processing apparatus and image processing method, camera head and image capture method
US9148552B2 (en) Image processing apparatus, image pickup apparatus, non-transitory storage medium storing image processing program and image processing method
US20160035099A1 (en) Depth estimation apparatus, imaging device, and depth estimation method
CN110998228A (en) Distance measuring camera
JP2013061850A (en) Image processing apparatus and image processing method for noise reduction
JP6034197B2 (en) Image processing apparatus, three-dimensional imaging apparatus, image processing method, and image processing program
JP2013097154A (en) Distance measurement device, imaging apparatus, and distance measurement method
US9113088B2 (en) Method and apparatus for photographing an image using light from multiple light sources
KR20190051371A (en) Camera module including filter array of complementary colors and electronic device including the camera module
US20150009394A1 (en) Focusing control method using colour channel analysis
KR101747844B1 (en) method for processing image of camera module
US11595625B2 (en) Mechanical infrared light filter
JP6352150B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION