WO2016038886A1 - Imaging device, image processing device, and imaging method - Google Patents

Imaging device, image processing device, and imaging method Download PDF

Info

Publication number
WO2016038886A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
imaging
segmented
segmented images
Prior art date
Application number
PCT/JP2015/004566
Other languages
French (fr)
Inventor
Norikatsu Niinami
Original Assignee
Ricoh Company, Limited
Application filed by Ricoh Company, Limited
Publication of WO2016038886A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/35Categorising the entire scene, e.g. birthday party or wedding scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Definitions

  • the present invention relates to an imaging device, an image processing device, and an imaging method.
  • Patent Literature 1 discloses, for example, an invention of an image processing device that can highly accurately detect a backlight area from a whole celestial sphere image and appropriately correct backlight.
  • an imaging device comprising: an imaging unit that captures an image; a segmentation unit that segments the image into a plurality of segmented images; a determination unit that determines a scene of each of the segmented images; and a processing unit that processes each of the segmented images by a correction parameter according to the scene.
  • FIG. 1 is a pattern diagram illustrating an exemplary imaging device of an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an exemplary configuration of the imaging device of the embodiment.
  • FIG. 3 is a diagram illustrating an exemplary whole celestial sphere image of the embodiment.
  • FIG. 4 is a diagram illustrating an exemplary segmented image of the embodiment.
  • FIG. 5 is a diagram illustrating an exemplary correction parameter of the embodiment.
  • FIG. 6 is a diagram illustrating an exemplary scene selected by the imaging device of the embodiment.
  • FIG. 7 is a flowchart illustrating an exemplary method for operating the imaging device of the embodiment.
  • FIG. 8 is a flowchart illustrating an exemplary indoor image/outdoor image determining method of the embodiment.
  • FIG. 9 is a flowchart illustrating an exemplary boundary image determining method of the embodiment.
  • FIG. 10 is a flowchart illustrating an exemplary image processing method of the embodiment.
  • FIG. 11 is a diagram illustrating an exemplary hardware configuration of the imaging device of the embodiment.
  • an imaging device, an image processing device, and an imaging method will be described below with reference to the attached figures.
  • the device for imaging a whole celestial sphere, using two imaging units, will be described.
  • the present invention can be applied when a captured image includes a plurality of categories and scenes. That is, the present invention can also be applied to an imaging device such as an ordinary single-lens reflex camera and a smartphone, as long as an image can be segmented into a plurality of areas, and a category and a scene can be set for each segmented area.
  • FIG. 1 is a pattern diagram illustrating an imaging device 100 of the embodiment.
  • the imaging device 100 of the embodiment includes an imaging element 101, a fish-eye lens 102, an imaging element 103, a fish-eye lens 104, a housing 121, and an imaging switch 122.
  • the imaging element 101 converts light, having passed through the fish-eye lens 102 having an angle of view of 180 degrees or more in the front of the imaging device 100, into an electric signal.
  • the imaging element 103 converts light, having passed through the fish-eye lens 104 having an angle of view of 180 degrees or more in the back of the imaging device 100, into an electric signal.
  • the imaging elements 101 and 103 include a complementary metal oxide semiconductor (CMOS) sensor, for example.
  • the imaging switch 122 is provided on one side of the housing 121.
  • the imaging device 100 also includes various types of operation buttons, a power switch, a touch panel, and the like.
  • FIG. 2 is a diagram illustrating an exemplary configuration of the imaging device 100 of the embodiment.
  • the imaging device 100 of the embodiment includes a first imaging unit 1, a second imaging unit 2, a storage unit 3, a combining unit 4, a first detection unit 5, a first correction unit 6, a segmentation unit 7, a determining unit 8, a determination unit 9, a processing unit 10, a second detection unit 11, a calculation unit 12, a second correction unit 13, a display unit 14, and a selection unit 15.
  • the first imaging unit 1 captures a first image.
  • the first image is an image including an object in the front (0 to 180 degrees) of the imaging device 100.
  • an angle of view photographed by the first imaging unit 1 is 180 degrees or more (preferably 180 degrees or more and 190 degrees or less).
  • This first image photographed by the first imaging unit 1 has an area overlapping a second image in the back which will be described later.
  • the first imaging unit 1 stores the first image in the storage unit 3.
  • the second imaging unit 2 captures the second image.
  • the second image is an image including an object in the back (180 to 360 degrees) of the imaging device 100.
  • an angle of view photographed by the second imaging unit 2 is 180 degrees or more (preferably 180 degrees or more and 190 degrees or less).
  • This second image photographed by the second imaging unit 2 has an area overlapping the first image in the front described above.
  • the second imaging unit 2 stores the second image in the storage unit 3.
  • the storage unit 3 stores the first and second images.
  • the storage unit 3 also combines the first and second images and stores the generated whole celestial sphere image which will be described later. Furthermore, the storage unit 3 stores a category and a scene for each segmented image (image corresponding to a part of the areas of the whole celestial sphere image) which will be described later.
  • the scene indicates the type of the segmented image, determined in accordance with an object included in the segmented image.
  • the category is information on the segmented image classified in accordance with the scene.
  • the category of the segmented image includes an outdoor image obtained by imaging outdoors, an indoor image obtained by imaging indoors, or a boundary image positioned at the boundary between the outdoor image and the indoor image.
  • the storage unit 3 also stores the (type of) category and the (type of) scene included therein. In addition, the storage unit 3 stores a correction parameter used by the processing unit 10 which will be described later. The storage unit 3 stores the correction parameter for each scene. An example of the correction parameter stored for each scene will be described later with reference to FIG. 5.
  • the combining unit 4 generates the whole celestial sphere image showing images in all directions from the imaging spot by combining the first and second images by superimposing the overlapping areas.
  • the combining unit 4 stores the whole celestial sphere image in the storage unit 3.
  • FIG. 3 is a diagram illustrating an example of the whole celestial sphere image 23 of the embodiment.
  • FIG. 3 is an example of a case where the whole celestial sphere image 23 including the objects in all directions from the imaging spot is generated by combining the first image 21 and the second image 22, using the overlapping areas of the two images as a reference for alignment.
  • the whole celestial sphere image 23 is created by the equidistant cylindrical projection, Mercator projection, or the like. When this whole celestial sphere image 23 is pasted and displayed on a spherical object, the images in all directions will be displayed.
  • the whole celestial sphere image 23 may be a moving image or a still image.
  • the first detection unit 5 detects a zenith direction of the imaging device 100 at the time of imaging (inclination of the imaging device 100 at the time of imaging).
  • the first detection unit 5 detects the zenith direction of the imaging device 100 at the time of imaging by an acceleration sensor, for example.
  • the first detection unit 5 inputs, into the first correction unit 6, information on zenith direction that indicates the zenith direction at the time of imaging.
  • the first correction unit 6 receives the information on zenith direction from the first detection unit 5.
  • the first correction unit 6 reads the whole celestial sphere image from the storage unit 3.
  • the first correction unit 6 corrects the whole celestial sphere image based on the information on zenith direction. Specifically, the first correction unit 6 corrects the zenith direction of the whole celestial sphere image by the rotation of the whole celestial sphere image or the movement of the position thereof. Accordingly, a deviation in the zenith direction of the whole celestial sphere image, attributable to the inclination of the imaging device 100 at the time of imaging, is corrected.
  • the first correction unit 6 overwrites the whole celestial sphere image in the storage unit 3 with the whole celestial sphere image after correction.
  • the segmentation unit 7 reads, from the storage unit 3, the whole celestial sphere image with the deviation in the zenith direction corrected.
  • the segmentation unit 7 segments the whole celestial sphere image into a plurality of segmented images.
  • FIG. 4 is a diagram illustrating an example of the segmented images of the embodiment.
  • the determining unit 8 reads, from the storage unit 3, the segmented images in the upper half of the whole celestial sphere image 23 (segmented images Fun (1 ≦ n ≦ N/2) and Run (1 ≦ n ≦ N/2)).
  • the determining unit 8 determines a category of segmented image for each of the segmented images. Determining processing is processing of determining the category (outdoor, indoor, or boundary) of the segmented image, indicating any one of outdoor images obtained by imaging outdoors, indoor images obtained by imaging indoors, and boundary images positioned at the boundary between the outdoor images and the indoor images. Details of the determining processing will be described later with reference to FIGS. 8 and 9.
  • the determining unit 8 determines, by the categories of the segmented images in the upper half, the category of each of the segmented images in the bottom half (segmented images Fbn (1 ≦ n ≦ N/2) and segmented images Rbn (1 ≦ n ≦ N/2)), the position in the horizontal direction of which is the same as that of the upper half.
  • the determining unit 8 associates the categories determined for the segmented images and stores them in the storage unit 3.
  • the determination unit 9 determines the scene of segmented image for each segmented image. Specifically, the determination unit 9 reads the segmented images and the categories thereof from the storage unit 3. The determination unit 9 determines the scene of the segmented image by recognizing the optimum scene among a plurality of scenes associated with the category. The scene is determined in order for the processing unit 10 described later to appropriately process each segmented image by using a correction parameter according to an object. Scenes in the indoor category include, for example, cooking, furniture, and the like. Scenes in the outdoor category include autumn leaf, snow, cloud, wood, and the like. In a case where the category of segmented image is boundary, the determination unit 9 does not determine the scene of the segmented image. The determination unit 9 stores the scene determined for each segmented image in the storage unit 3. Therefore, in the storage unit 3, the category and the scene are associated and stored for each segmented image. In a case where the category is boundary, the scene is stored as Null.
  • the processing unit 10 processes each segmented image by a correction parameter according to the scene.
  • the processing unit 10 processes the image by an intermediate-value correction parameter between a correction parameter used for the outdoor image adjacent to the boundary image and a correction parameter used for the indoor image adjacent to the boundary image.
  • for example, in processing the boundary image between the outdoor image (autumn leaf) and the indoor image (cooking), an intermediate-value correction parameter between a correction parameter used in a case where the scene is autumn leaf, and a correction parameter used in a case where the scene is cooking, is used.
  • the processing unit 10 overwrites the segmented image in the storage unit 3 (some area of the whole celestial sphere image) with the segmented image after image processing.
  • FIG. 5 is a diagram illustrating an example of the correction parameter of the embodiment.
  • in a case where the scene is cooking, for example, the processing unit 10 processes the image by a correction parameter related to exposure setting to make the segmented image brighter and a correction parameter related to color matrix parameter (color reproduction correction) to increase the saturation of the segmented image.
  • the actual values of the correction parameters may be set at appropriate values in accordance with, for example, photographing conditions.
  • the second detection unit 11 reads the segmented image after image processing from the storage unit 3.
  • the second detection unit 11 detects the luminance of the segmented image.
  • the second detection unit 11 inputs luminance information indicating the luminance of the segmented image into the calculation unit 12.
  • the calculation unit 12 receives luminance information of each of the segmented images from the second detection unit 11.
  • the calculation unit 12 calculates, for each of the segmented images in the upper half and in the bottom half, a difference in luminance of the segmented images between the upper half and the bottom half, the positions in the horizontal direction of which are the same.
  • the calculation unit 12 inputs, into the second correction unit 13, luminance-difference information indicating a difference in luminance of the segmented images between the upper half and the bottom half, the positions in the horizontal direction of which are the same.
  • the second correction unit 13 receives luminance-difference information from the calculation unit 12.
  • the second correction unit 13 makes different corrections for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half, in a case where the number of sets of the segmented images in the upper half and the segmented images in the bottom half, the luminance difference between which is greater than a first threshold, is greater than a second threshold.
  • a whole celestial sphere image with a wider D range of luminance can be obtained.
  • the second correction unit 13 overwrites the whole celestial sphere image in the storage unit 3 with the whole celestial sphere image where different corrections have been made for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half.
  • the display unit 14 displays, for example, the conditions of the imaging device 100.
  • the display unit 14 displays a monitoring screen which displays the conditions of all directions of the imaging spot at the time of imaging by the imaging device 100, for example.
  • the processing in the first imaging unit 1, the second imaging unit 2, the storage unit 3, the combining unit 4, the first detection unit 5, the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13 is executed in real time during monitoring.
  • the whole celestial sphere image obtained as a result of the processing is displayed (reproduced) on the display unit 14.
  • the display unit 14 also displays one or more types of scenes (autumn leaf, cooking, and the like) determined by the determination unit 9.
  • the selection unit 15 selects a scene from among one or more types of scenes displayed on the display unit 14 based on the operation by a user of the imaging device 100. At this time, the display unit 14 displays (reproduces), as one image, one or more segmented images recognized as the scenes.
  • the storage unit 3 stores, as one image, one or more segmented images recognized as the scenes. As a result, only images including scenes the user of the imaging device 100 wishes to image can be displayed and stored.
  • FIG. 6 is a pattern diagram illustrating an exemplary scene selected by the imaging device 100.
  • FIG. 6 illustrates a case where a dog is selected as a scene the user of the imaging device 100 wishes to image.
  • the display unit 14 displays, as one image, one or more segmented images including the dog as an object (four images in the example of FIG. 6).
  • the storage unit 3 stores, as one image, one or more segmented images including the dog as an object.
  • FIG. 7 is a flowchart illustrating an exemplary method for operating the imaging device 100 of the embodiment.
  • the first imaging unit 1 captures the first image, and the second imaging unit 2 captures the second image (step S1).
  • the combining unit 4 generates the whole celestial sphere image showing images in all directions of the imaging spot by combining the first and second images (step S2).
  • the first correction unit 6 corrects the zenith direction of the whole celestial sphere image by the rotation of the whole celestial sphere image or the movement of the position thereof in accordance with the zenith direction (step S3).
  • the segmentation unit 7 segments the whole celestial sphere image into a plurality of segmented images (step S4).
  • the determining unit 8 determines the category (indoor image/outdoor image) of segmented image for each of the segmented images (step S5). Details of the indoor image/outdoor image determining processing in step S5 will be described later with reference to FIG. 8.
  • the determining unit 8 determines the category (boundary image) of segmented image for each of the segmented images (step S6). Details of the boundary image determining processing in step S6 will be described later with reference to FIG. 9.
  • the determination unit 9 determines the scene of each segmented image by recognizing the optimum scene among a plurality of scenes associated with the category of the segmented image, and the processing unit 10 processes each segmented image by the correction parameter according to the scene (step S7). Details of the image processing in step S7 will be described later with reference to FIG. 10.
  • the second correction unit 13 makes corrections for gradation characteristics of the whole celestial sphere image (step S8). Specifically, the second detection unit 11 first detects the luminance of the segmented image. Next, the calculation unit 12 calculates, for each of the segmented images in the upper half and in the bottom half, a difference in luminance of the segmented images between the upper half and the bottom half, the positions in the horizontal direction of which are the same.
  • the second correction unit 13 makes different corrections for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half, in a case where the number of sets of the segmented images in the upper half and the segmented images in the bottom half, the luminance difference between which is greater than a first threshold, is greater than a second threshold.
  • Next, details of the indoor image/outdoor image determining method in step S5 will be described.
  • FIG. 8 is a flowchart illustrating an example of the indoor image/outdoor image determining method of the embodiment.
  • the indoor image/outdoor image determining method of the segmented images Fun (1 ≦ n ≦ N/2) corresponding to the area in the upper half of the first image 21 and the segmented images Fbn (1 ≦ n ≦ N/2) corresponding to the area in the bottom half of the first image 21 will be described.
  • the determining unit 8 determines whether or not an image showing a blue sky is included in the segmented images Fun (1 ≦ n ≦ N/2) (step S21).
  • the determining unit 8 compares, for example, the segmented images Fun with an image showing a blue sky stored in advance, and determines whether or not the image showing the blue sky is included in the segmented images Fun based on the similarity between the two images. If the image showing the blue sky is included (step S21, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
  • If the image showing the blue sky is not included (step S21, No), the determining unit 8 determines whether or not an image showing a cloud is included in the segmented images Fun (step S22). If the image showing the cloud is included (step S22, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
  • If the image showing the cloud is not included (step S22, No), the determining unit 8 determines whether or not an image showing a sunrise glow or a sunset glow is included in the segmented images Fun (step S23). If the image showing the sunrise glow or the sunset glow is included (step S23, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
  • If the image showing the sunrise glow or the sunset glow is not included (step S23, No), the determining unit 8 determines whether an image showing a night sky is included in the segmented images Fun (step S24). If the image showing the night sky is included (step S24, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
  • If the image showing the night sky is not included (step S24, No), the determining unit 8 determines that the category of the segmented images Fun is indoor (step S25).
  • the flowchart of FIG. 8 also applies to segmented images Run (1 ≦ n ≦ N/2) corresponding to the area in the upper half of the second image 22 and segmented images Rbn (1 ≦ n ≦ N/2) corresponding to the area in the bottom half of the second image 22.
  • Next, details of the boundary image determining method in step S6 will be described.
  • FIG. 9 is a flowchart illustrating an example of the boundary image determining method of the embodiment.
  • a case will be described, where the boundary of the area in the upper half of the first image 21 (segmented images Fun (1 ≦ n ≦ N/2)) is determined.
  • the flowchart of FIG. 9 also applies to the area in the bottom half of the first image 21 (segmented images Fbn (1 ≦ n ≦ N/2)), the area in the upper half of the second image 22 (segmented images Run (1 ≦ n ≦ N/2)), and the area in the bottom half of the second image 22 (segmented images Rbn (1 ≦ n ≦ N/2)).
  • Next, details of the image processing method in step S7 will be described.
  • In FIG. 10, a case will be described where the image of the area in the upper half of the first image 21 (segmented images Fun (1 ≦ n ≦ N/2)) is processed.
  • the flowchart of FIG. 10 also applies to the area in the bottom half of the first image 21 (segmented images Fbn (1 ≦ n ≦ N/2)), the area in the upper half of the second image 22 (segmented images Run (1 ≦ n ≦ N/2)), and the area in the bottom half of the second image 22 (segmented images Rbn (1 ≦ n ≦ N/2)).
  • FIG. 10 is a flowchart illustrating an example of the image processing method of the embodiment.
  • the determination unit 9 assigns a value of 1 to a variable x (step S61).
  • Fux(S) indicates the category (outdoor, indoor, or boundary) of the segmented image Fux, determined by the flowcharts of FIGS. 8 and 9.
  • the processing unit 10 subjects the segmented image Fux to image processing optimized for each scene (step S64). Specifically, the processing unit 10 processes the segmented image Fux by using a correction parameter according to the scene determined in step S62. Next, the processing proceeds to step S69.
  • the processing unit 10 subjects the segmented image Fux to image processing optimized for each scene (step S67). Specifically, the processing unit 10 processes the segmented image Fux by using a correction parameter according to the scene determined in step S66. Next, the processing proceeds to step S69.
  • If the category of the segmented image Fux is boundary (step S65, Yes), the processing unit 10 processes the segmented image Fux by the intermediate-value correction parameter between the correction parameters used for processing the adjacent segmented images.
  • the processing unit 10 processes the image, for example, by the intermediate-value correction parameter between the correction parameter used for the outdoor image Fux-1 adjacent to the boundary image Fux and the correction parameter used for the indoor image Fux+1 adjacent to the boundary image (step S68). Next, the processing proceeds to step S69.
  • FIG. 11 is a diagram illustrating an example of the hardware configuration of the imaging device 100 of the embodiment.
  • the imaging device 100 of the embodiment includes the imaging element 101, the fish-eye lens 102, the imaging element 103, the fish-eye lens 104, an image processing device 105, a CPU 106, an acceleration sensor 107, a display device 108, an operation button 109, and a storage medium 111.
  • the imaging device 100 can also communicate with a smart device 110.
  • the imaging element 101 forms an image with light entering through the fish-eye lens 102.
  • a target image formed on the imaging element 101 is converted into an image signal (electric signal) showing the first image 21.
  • the imaging element 101 inputs the image signal into the image processing device 105.
  • the imaging element 101 and the fish-eye lens 102 correspond to the first imaging unit 1 (see FIG. 2).
  • the imaging element 103 forms an image with light entering through the fish-eye lens 104.
  • a target image formed on the imaging element 103 is converted into an image signal (electric signal) showing the second image 22.
  • the imaging element 103 inputs the image signal into the image processing device 105.
  • the imaging element 103 and the fish-eye lens 104 correspond to the second imaging unit 2 (see FIG. 2).
  • the image processing device 105 and the CPU 106 correspond to the combining unit 4, the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13 (see FIG. 2).
  • a function block realized by hardware is realized by the image processing device 105.
  • a function block realized by software is realized by the CPU 106.
  • Of the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13, which function block is to be realized by hardware and which function block is to be realized by software can be determined arbitrarily.
  • the acceleration sensor 107 detects the zenith direction of the imaging device 100 at the time of imaging. Information detected by the acceleration sensor 107 is processed by the first correction unit 6 realized by the image processing device 105 or the CPU 106.
  • the display device 108, the operation button 109, and the smart device 110 are user interfaces for operating the imaging device 100.
  • the display device 108, the operation button 109, and the smart device 110 correspond to the display unit 14 and the selection unit 15 (see FIG. 2).
  • the storage medium 111 stores, for example, the first image 21, the second image 22, and the whole celestial sphere image 23.
  • the storage medium 111 corresponds to the storage unit 3 (see FIG. 2).
  • In a case where the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13 are realized as a program, a modularly-configured program including a part or all of these units is stored in the storage medium 111.
  • the program may be recorded and provided in a computer readable recording medium such as a CD-ROM, a flexible disc (FD), a CD-R, a digital versatile disk (DVD), and a universal serial bus (USB) in a file in an installable form or in an executable form.
  • the program may be provided or distributed via a network such as the Internet.
  • the segmentation unit 7 segments the whole celestial sphere image into a plurality of segmented images.
  • the determination unit 9 determines the scene of each segmented image.
  • the processing unit 10 can process each segmented image by using the correction parameter according to the scene. Therefore, even when a plurality of scenes is included in one whole celestial sphere image, an optimum quality whole celestial sphere image can be generated.
  • the segmentation unit 7 segments an image into a plurality of segmented images.
  • the determination unit 9 determines the scene of each segmented image.
  • the processing unit 10 can process each segmented image by using the correction parameter according to the scene. Therefore, even when a plurality of scenes is included in one image, an optimum quality image can be generated.
  • an optimum quality image can be generated even when a plurality of scenes is included in an image.
  • Reference signs: 1 first imaging unit; 2 second imaging unit; 3 storage unit; 4 combining unit; 5 first detection unit; 6 first correction unit; 7 segmentation unit; 8 determining unit; 9 determination unit; 10 processing unit; 11 second detection unit; 12 calculation unit; 13 second correction unit; 14 display unit; 15 selection unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

  An imaging device of an embodiment includes an imaging unit that captures an image, a segmentation unit that segments the image into a plurality of segmented images, a determination unit that determines a scene of each of the segmented images, and a processing unit that processes each of the segmented images by a correction parameter according to the scene.

Description

IMAGING DEVICE, IMAGE PROCESSING DEVICE, AND IMAGING METHOD
      The present invention relates to an imaging device, an image processing device, and an imaging method.
      A technique to image all directions of an imaging spot and form them into one image (hereinafter referred to as the “whole celestial sphere image”) is known.  Patent Literature 1 discloses, for example, an invention of an image processing device that can highly accurately detect a backlight area from a whole celestial sphere image and appropriately correct backlight.
      In the related art, however, it has been difficult to generate an optimum quality image when a plurality of scenes is included in an image.
      In view of the above, there is a need to provide an imaging device, an image processing device, and an imaging method which can generate an optimum quality image even when a plurality of scenes is included in an image.
      According to an embodiment of the present invention, there is provided an imaging device comprising: an imaging unit that captures an image; a segmentation unit that segments the image into a plurality of segmented images; a determination unit that determines a scene of each of the segmented images; and a processing unit that processes each of the segmented images by a correction parameter according to the scene.
      The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
FIG. 1 is a pattern diagram illustrating an exemplary imaging device of an embodiment of the present invention.
FIG. 2 is a diagram illustrating an exemplary configuration of the imaging device of the embodiment.
FIG. 3 is a diagram illustrating an exemplary whole celestial sphere image of the embodiment.
FIG. 4 is a diagram illustrating an exemplary segmented image of the embodiment.
FIG. 5 is a diagram illustrating an exemplary correction parameter of the embodiment.
FIG. 6 is a diagram illustrating an exemplary scene selected by the imaging device of the embodiment.
FIG. 7 is a flowchart illustrating an exemplary method for operating the imaging device of the embodiment.
FIG. 8 is a flowchart illustrating an exemplary indoor image/outdoor image determining method of the embodiment.
FIG. 9 is a flowchart illustrating an exemplary boundary image determining method of the embodiment.
FIG. 10 is a flowchart illustrating an exemplary image processing method of the embodiment.
FIG. 11 is a diagram illustrating an exemplary hardware configuration of the imaging device of the embodiment.
      An embodiment of an imaging device, an image processing device, and an imaging method will be described below with reference to the attached figures.  In the present embodiment, the device for imaging a whole celestial sphere, using two imaging units, will be described.  However, the present invention can be applied when a captured image includes a plurality of categories and scenes.  That is, the present invention can also be applied to an imaging device such as an ordinary single-lens reflex camera and a smartphone, as long as an image can be segmented into a plurality of areas, and a category and a scene can be set for each segmented area.
      FIG. 1 is a pattern diagram illustrating an imaging device 100 of the embodiment.  The imaging device 100 of the embodiment includes an imaging element 101, a fish-eye lens 102, an imaging element 103, a fish-eye lens 104, a housing 121, and an imaging switch 122.  The imaging element 101 converts light, having passed through the fish-eye lens 102 having an angle of view of 180 degrees or more in the front of the imaging device 100, into an electric signal.  The imaging element 103 converts light, having passed through the fish-eye lens 104 having an angle of view of 180 degrees or more in the back of the imaging device 100, into an electric signal.  The imaging elements 101 and 103 include a complementary metal oxide semiconductor (CMOS) sensor, for example.  The imaging switch 122 is provided on one side of the housing 121.  Although not illustrated in FIG. 1, the imaging device 100 also includes various types of operation buttons, a power switch, a touch panel, and the like.
      FIG. 2 is a diagram illustrating an exemplary configuration of the imaging device 100 of the embodiment.  The imaging device 100 of the embodiment includes a first imaging unit 1, a second imaging unit 2, a storage unit 3, a combining unit 4, a first detection unit 5, a first correction unit 6, a segmentation unit 7, a determining unit 8, a determination unit 9, a processing unit 10, a second detection unit 11, a calculation unit 12, a second correction unit 13, a display unit 14, and a selection unit 15.
      The first imaging unit 1 captures a first image.  The first image is an image including an object in the front (0 to 180 degrees) of the imaging device 100.  At this time, an angle of view photographed by the first imaging unit 1 is 180 degrees or more (preferably 180 degrees or more and 190 degrees or less).  This first image photographed by the first imaging unit 1 has an area overlapping a second image in the back which will be described later.  The first imaging unit 1 stores the first image in the storage unit 3.
      The second imaging unit 2 captures the second image.  The second image is an image including an object in the back (180 to 360 degrees) of the imaging device 100.  At this time, an angle of view photographed by the second imaging unit 2 is 180 degrees or more (preferably 180 degrees or more and 190 degrees or less).  This second image photographed by the second imaging unit 2 has an area overlapping the first image in the front described above.  The second imaging unit 2 stores the second image in the storage unit 3.
      The storage unit 3 stores the first and second images.  The storage unit 3 also combines the first and second images and stores the generated whole celestial sphere image which will be described later.  Furthermore, the storage unit 3 stores a category and a scene for each segmented image (image corresponding to a part of the areas of the whole celestial sphere image) which will be described later.  The scene indicates the type of the segmented image, determined in accordance with an object included in the segmented image.  The category is information on the segmented image classified in accordance with the scene.  The category of the segmented image includes an outdoor image obtained by imaging outdoors, an indoor image obtained by imaging indoors, or a boundary image positioned at the boundary between the outdoor image and the indoor image.  The storage unit 3 also stores the (type of) category and the (type of) scene included therein.  In addition, the storage unit 3 stores a correction parameter used by the processing unit 10 which will be described later.  The storage unit 3 stores the correction parameter for each scene.  An example of the correction parameter stored for each scene will be described later with reference to FIG. 5.
      The combining unit 4 generates the whole celestial sphere image showing images in all directions from the imaging spot by combining the first and second images by superimposing the overlapping areas.  The combining unit 4 stores the whole celestial sphere image in the storage unit 3.
      FIG. 3 is a diagram illustrating an example of the whole celestial sphere image 23 of the embodiment.  FIG. 3 is an example of a case where the whole celestial sphere image 23 including the objects in all directions from the imaging spot is generated by combining the first image 21 and the second image 22, using the overlapping areas of the two images as a reference for alignment.  The whole celestial sphere image 23 is created by the equidistant cylindrical projection, Mercator projection, or the like.  When this whole celestial sphere image 23 is pasted and displayed on a spherical object, the images in all directions will be displayed.  The whole celestial sphere image 23 may be a moving image or a still image.
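      As an illustration of how the overlapping areas can be superimposed, the following is a minimal sketch in Python, assuming the two fish-eye images have already been reprojected into equirectangular halves that share a known number of overlapping columns and that a simple linear blend across that overlap is acceptable. The function and parameter names (blend_seam, overlap_px) are illustrative and not taken from the embodiment.

    import numpy as np

    def blend_seam(front_eq, rear_eq, overlap_px=32):
        # front_eq, rear_eq: H x (W_half + overlap_px) x 3 float arrays already
        # projected from the two fish-eye images into equirectangular
        # coordinates; the last overlap_px columns of front_eq show the same
        # directions as the first overlap_px columns of rear_eq.
        h, w_f, _ = front_eq.shape
        core = w_f - overlap_px                     # columns unique to each half
        alpha = np.linspace(1.0, 0.0, overlap_px)[None, :, None]
        seam = alpha * front_eq[:, core:] + (1.0 - alpha) * rear_eq[:, :overlap_px]
        # Only one of the two seams of the full sphere is blended here; the
        # seam on the opposite side would be treated in the same way.
        return np.concatenate(
            [front_eq[:, :core], seam, rear_eq[:, overlap_px:]], axis=1)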
      Referring back to FIG. 2, the first detection unit 5 detects a zenith direction of the imaging device 100 at the time of imaging (inclination of the imaging device 100 at the time of imaging).  The first detection unit 5 detects the zenith direction of the imaging device 100 at the time of imaging by an acceleration sensor, for example.  The first detection unit 5 inputs, into the first correction unit 6, information on zenith direction that indicates the zenith direction at the time of imaging.
      The first correction unit 6 receives the information on zenith direction from the first detection unit 5.  The first correction unit 6 reads the whole celestial sphere image from the storage unit 3.  The first correction unit 6 corrects the whole celestial sphere image based on the information on zenith direction.  Specifically, the first correction unit 6 corrects the zenith direction of the whole celestial sphere image by the rotation of the whole celestial sphere image or the movement of the position thereof.  Accordingly, a deviation in the zenith direction of the whole celestial sphere image, attributable to the inclination of the imaging device 100 at the time of imaging, is corrected.  The first correction unit 6 overwrites the whole celestial sphere image in the storage unit 3 with the whole celestial sphere image after correction.
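      The zenith correction can be pictured as follows: a minimal sketch that derives, from the acceleration-sensor reading, the rotation taking the upright camera's downward axis onto the measured gravity direction and then resamples the equirectangular image. The axis convention (x forward, y left, z up), the nearest-neighbour resampling, and the function name zenith_correct are assumptions made for illustration; the patent only states that the correction is done by rotation of the image or movement of its position.

    import numpy as np

    def zenith_correct(equi, accel):
        # equi: H x W x 3 equirectangular whole celestial sphere image.
        # accel: 3-vector from the acceleration sensor at the time of imaging,
        # i.e. gravity expressed in camera axes (x forward, y left, z up).
        h, w, _ = equi.shape
        g = np.asarray(accel, dtype=float)
        g /= np.linalg.norm(g)
        down = np.array([0.0, 0.0, -1.0])           # "down" for an upright camera
        v, c = np.cross(down, g), float(np.dot(down, g))
        vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
        rot = np.eye(3) + vx + vx @ vx / (1.0 + c)  # rotates `down` onto `g`
        # (the degenerate case c == -1, camera exactly upside down, is ignored)
        # Direction seen by every pixel of the corrected output image.
        lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
        lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
        lon, lat = np.meshgrid(lon, lat)
        d = np.stack([np.cos(lat) * np.cos(lon),
                      np.cos(lat) * np.sin(lon),
                      np.sin(lat)], axis=-1)
        src = d @ rot.T                             # same directions in camera axes
        lon_s = np.arctan2(src[..., 1], src[..., 0])
        lat_s = np.arcsin(np.clip(src[..., 2], -1.0, 1.0))
        x = ((lon_s + np.pi) / (2 * np.pi) * w).astype(int) % w
        y = np.clip(((np.pi / 2 - lat_s) / np.pi * h).astype(int), 0, h - 1)
        return equi[y, x]                           # nearest-neighbour resampling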
      The segmentation unit 7 reads, from the storage unit 3, the whole celestial sphere image with the deviation in the zenith direction corrected.  The segmentation unit 7 segments the whole celestial sphere image into a plurality of segmented images.
      FIG. 4 is a diagram illustrating an example of the segmented images of the embodiment.  The segmentation unit 7 segments the whole celestial sphere image into 2N pieces of segmented images by segmenting the whole celestial sphere image 23 into N pieces (N is an integer equal to or greater than two) in a vertical direction, and two pieces in a horizontal direction.  Specifically, the segmentation unit 7 first segments an area corresponding to the first image 21 into n pieces in strips (n = N/2), and by further segmenting the area segmented into strips into two parts, an upper half and a bottom half, obtains segmented images Fun and Fbn (1 ≦ n ≦ N/2).  Similarly, the segmentation unit 7 segments an area corresponding to the second image 22 into n pieces in strips (n = N/2), and by further segmenting the area segmented into strips into two parts, an upper half and a bottom half, obtains segmented images Run and Rbn (1 ≦ n ≦ N/2).
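      A minimal sketch of this segmentation is given below, assuming the left half of the combined image corresponds to the first image 21 and the right half to the second image 22, with n = N/2 strips cut per captured image; the variable names follow the Fun/Fbn/Run/Rbn notation above, while the function name segment is illustrative.

    import numpy as np

    def segment(celestial, n_strips):
        # celestial: H x W x 3 whole celestial sphere image. The left half of
        # the width is treated as the area of the first image 21 and the right
        # half as the area of the second image 22 (an assumption of this sketch).
        h, w, _ = celestial.shape
        half_w, half_h = w // 2, h // 2
        strip = half_w // n_strips
        fu, fb, ru, rb = [], [], [], []
        for i in range(n_strips):
            x0, x1 = i * strip, (i + 1) * strip
            fu.append(celestial[:half_h, x0:x1])                    # Fun: front, upper half
            fb.append(celestial[half_h:, x0:x1])                    # Fbn: front, bottom half
            ru.append(celestial[:half_h, half_w + x0:half_w + x1])  # Run: rear, upper half
            rb.append(celestial[half_h:, half_w + x0:half_w + x1])  # Rbn: rear, bottom half
        return fu, fb, ru, rb   # 4 * n_strips segmented images (= 2N when n_strips = N/2)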
      Referring back to FIG. 2, the determining unit 8 reads, from the storage unit 3, the segmented images in the upper half of the whole celestial sphere image 23 (segmented images Fun (1 ≦ n ≦ N/2) and Run (1 ≦ n ≦ N/2)).  The determining unit 8 determines a category of segmented image for each of the segmented images.  Determining processing is processing of determining the category (outdoor, indoor, or boundary) of the segmented image, indicating any one of outdoor images obtained by imaging outdoors, indoor images obtained by imaging indoors, and boundary images positioned at the boundary between the outdoor images and the indoor images.  Details of the determining processing will be described later with reference to FIGS. 8 and 9.  The determining unit 8 determines, by the categories of the segmented images in the upper half, the category of each of the segmented images in the bottom half (segmented images Fbn (1 ≦ n ≦ N/2) and segmented images Rbn (1 ≦ n ≦ N/2)), the position in the horizontal direction of which is the same as that of the upper half.  The determining unit 8 associates the categories determined for the segmented images and stores them in the storage unit 3.
      The determination unit 9 determines the scene of segmented image for each segmented image.  Specifically, the determination unit 9 reads the segmented images and the categories thereof from the storage unit 3.  The determination unit 9 determines the scene of the segmented image by recognizing the optimum scene among a plurality of scenes associated with the category.  The scene is determined in order for the processing unit 10 described later to appropriately process each segmented image by using a correction parameter according to an object.  Scenes in the indoor category include, for example, cooking, furniture, and the like.  Scenes in the outdoor category include autumn leaf, snow, cloud, wood, and the like.  In a case where the category of segmented image is boundary, the determination unit 9 does not determine the scene of the segmented image.  The determination unit 9 stores the scene determined for each segmented image in the storage unit 3.  Therefore, in the storage unit 3, the category and the scene are associated and stored for each segmented image.  In a case where the category is boundary, the scene is stored as Null.
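      The embodiment does not specify how the optimum scene is recognized. As one possible reading, the sketch below compares colour histograms of the segmented image against reference images stored in advance and picks the closest scene among those associated with the category, determining no scene when the category is boundary; the SCENES table, the histogram signature, and the function names are assumptions.

    import numpy as np

    # Candidate scenes per category (the example scenes are taken from the text).
    SCENES = {"indoor": ["cooking", "furniture"],
              "outdoor": ["autumn leaf", "snow", "cloud", "wood"]}

    def color_hist(img, bins=16):
        # Per-channel histogram used as a simple image signature (an assumption).
        return np.concatenate(
            [np.histogram(img[..., c], bins, (0, 256), density=True)[0]
             for c in range(3)])

    def determine_scene(segment_img, category, references):
        # references: dict mapping a scene name to a reference image stored in advance.
        if category == "boundary":
            return None     # no scene is determined for a boundary image
        sig = color_hist(segment_img)
        return min(SCENES[category],
                   key=lambda s: np.linalg.norm(sig - color_hist(references[s])))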
      The processing unit 10 processes each segmented image by a correction parameter according to the scene.  In a case where the category of the segmented image is boundary, the processing unit 10 processes the image by an intermediate-value correction parameter between a correction parameter used for the outdoor image adjacent to the boundary image and a correction parameter used for the indoor image adjacent to the boundary image.  For example, in processing the boundary image between the outdoor image (autumn leaf) and the indoor image (cooking), an intermediate-value correction parameter between a correction parameter used in a case where the scene is autumn leaf, and a correction parameter used in a case where the scene is cooking, is used.  As a result, it is possible to prevent inconsistencies in the area of the boundary image, caused by the effects of image processing using different correction parameters.  The processing unit 10 overwrites the segmented image in the storage unit 3 (some area of the whole celestial sphere image) with the segmented image after image processing.
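      A sketch of this parameter selection is shown below; the parameter fields and numeric values are invented for illustration (FIG. 5 carries the actual examples), and averaging is used here as one way of forming an intermediate value between the parameters of the two adjacent images. For the autumn leaf / cooking example above, boundary_param("autumn leaf", "cooking") would give the parameter applied to the boundary image.

    from dataclasses import dataclass

    @dataclass
    class CorrectionParam:
        exposure_ev: float   # exposure setting correction
        saturation: float    # gain applied through the colour matrix (colour reproduction)

    # Per-scene correction parameters; the values are illustrative only.
    PARAMS = {"cooking":     CorrectionParam(exposure_ev=+0.7, saturation=1.3),
              "autumn leaf": CorrectionParam(exposure_ev=+0.0, saturation=1.2),
              "snow":        CorrectionParam(exposure_ev=+1.0, saturation=1.0)}

    def param_for(scene):
        return PARAMS[scene]

    def boundary_param(outdoor_scene, indoor_scene):
        # Intermediate value between the parameters used for the adjacent
        # outdoor and indoor images, so that no visible inconsistency appears
        # in the area of the boundary image.
        a, b = PARAMS[outdoor_scene], PARAMS[indoor_scene]
        return CorrectionParam((a.exposure_ev + b.exposure_ev) / 2,
                               (a.saturation + b.saturation) / 2)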
      Here, an example of the correction parameter according to the scene will be described.
      FIG. 5 is a diagram illustrating an example of the correction parameter of the embodiment.  For example, in a case where the scene is cooking, the processing unit 10 processes the image by a correction parameter related to exposure setting to make the segmented image brighter and a correction parameter related to color matrix parameter (color reproduction correction) to increase the saturation of the segmented image.  The actual values of the correction parameters may be set at appropriate values in accordance with, for example, photographing conditions.
      Referring back to FIG. 2, the second detection unit 11 reads the segmented image after image processing from the storage unit 3.  The second detection unit 11 detects the luminance of the segmented image.  The second detection unit 11 inputs luminance information indicating the luminance of the segmented image into the calculation unit 12.
      The calculation unit 12 receives luminance information of each of the segmented images from the second detection unit 11.  The calculation unit 12 calculates, for each of the segmented images in the upper half and in the bottom half, a difference in luminance of the segmented images between the upper half and the bottom half, the positions in the horizontal direction of which are the same.  The calculation unit 12 inputs, into the second correction unit 13, luminance-difference information indicating a difference in luminance of the segmented images between the upper half and the bottom half, the positions in the horizontal direction of which are the same.
      The second correction unit 13 receives luminance-difference information from the calculation unit 12.  The second correction unit 13 makes different corrections for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half, in a case where the number of sets of the segmented images in the upper half and the segmented images in the bottom half, the luminance difference between which is greater than a first threshold, is greater than a second threshold.  As a result, a whole celestial sphere image with a wider D range of luminance can be obtained.  The second correction unit 13 overwrites the whole celestial sphere image in the storage unit 3 with the whole celestial sphere image where different corrections have been made for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half.
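      The decision made by the calculation unit 12 and the second correction unit 13 can be sketched as follows; the luma formula, the threshold values, and the gamma-style tone curves applied to the two halves are assumptions, since the embodiment leaves the concrete gradation characteristics open.

    import numpy as np

    def mean_luma(img):
        # Mean Rec.601 luma of an RGB segmented image (0-255 range assumed).
        return float(np.mean(0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]))

    def correct_gradation(upper, bottom, first_threshold=40.0, second_threshold=3):
        # upper[i] and bottom[i] are the segmented images whose positions in the
        # horizontal direction are the same. Count the pairs whose luminance
        # difference exceeds the first threshold; if that count exceeds the
        # second threshold, apply different tone curves to the two halves so
        # that the luminance D range of the whole image becomes wider.
        diffs = [abs(mean_luma(u) - mean_luma(b)) for u, b in zip(upper, bottom)]
        if sum(d > first_threshold for d in diffs) > second_threshold:
            upper = [np.clip(255 * (u / 255.0) ** 0.8, 0, 255) for u in upper]    # brighten
            bottom = [np.clip(255 * (b / 255.0) ** 1.2, 0, 255) for b in bottom]  # darken
        return upper, bottom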
      The display unit 14 displays, for example, the conditions of the imaging device 100.  The display unit 14 displays a monitoring screen which displays the conditions of all directions of the imaging spot at the time of imaging by the imaging device 100, for example.  At this time, the processing in the first imaging unit 1, the second imaging unit 2, the storage unit 3, the combining unit 4, the first detection unit 5, the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13 is executed in real time during monitoring.  The whole celestial sphere image obtained as a result of the processing is displayed (reproduced) on the display unit 14.  The display unit 14 also displays one or more types of scenes (autumn leaf, cooking, and the like) determined by the determination unit 9.
      The selection unit 15 selects a scene from among one or more types of scenes displayed on the display unit 14 based on the operation by a user of the imaging device 100.  At this time, the display unit 14 displays (reproduces), as one image, one or more segmented images recognized as the scenes.  The storage unit 3 stores, as one image, one or more segmented images recognized as the scenes.  As a result, only images including scenes the user of the imaging device 100 wishes to image can be displayed and stored.
      FIG. 6 is a pattern diagram illustrating an exemplary scene selected by the imaging device 100.  FIG. 6 illustrates a case where a dog is selected as a scene the user of the imaging device 100 wishes to image.  At this time, the display unit 14 displays, as one image, one or more segmented images including the dog as an object (four images in the example of FIG. 6).  The storage unit 3 stores, as one image, one or more segmented images including the dog as an object.
      Next, a method for operating the imaging device 100 of the embodiment will be described.
      FIG. 7 is a flowchart illustrating an exemplary method for operating the imaging device 100 of the embodiment.
      First, the first imaging unit 1 captures the first image, and the second imaging unit 2 captures the second image (step S1).  Next, the combining unit 4 generates the whole celestial sphere image showing images in all directions of the imaging spot by combining the first and second images (step S2).  Next, the first correction unit 6 corrects the zenith direction of the whole celestial sphere image by the rotation of the whole celestial sphere image or the movement of the position thereof in accordance with the zenith direction (step S3).
      Next, the segmentation unit 7 segments the whole celestial sphere image into a plurality of segmented images (step S4).  Next, the determining unit 8 determines the category (indoor image/outdoor image) of segmented image for each of the segmented images (step S5).  Details of the indoor image/outdoor image determining processing in step S5 will be described later with reference to FIG. 8.
      Next, the determining unit 8 determines the category (boundary image) of segmented image for each of the segmented images (step S6).  Details of the boundary image determining processing in step S6 will be described later with reference to FIG. 9.
      Next, the determination unit 9 determines the scene of each segmented image by recognizing the optimum scene among a plurality of scenes associated with the category of the segmented image, and the processing unit 10 processes each segmented image by the correction parameter according to the scene (step S7).  Details of the image processing in step S7 will be described later with reference to FIG. 10.
      Next, the second correction unit 13 makes corrections for gradation characteristics of the whole celestial sphere image (step S8).  Specifically, the second detection unit 11 first detects the luminance of the segmented image.  Next, the calculation unit 12 calculates, for each of the segmented images in the upper half and in the bottom half, a difference in luminance of the segmented images between the upper half and the bottom half, the positions in the horizontal direction of which are the same.  The second correction unit 13 makes different corrections for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half, in a case where the number of sets of the segmented images in the upper half and the segmented images in the bottom half, the luminance difference between which is greater than a first threshold, is greater than a second threshold.
      Next, details of the indoor image/outdoor image determining method in step S5 will be described.
      FIG. 8 is a flowchart illustrating an example of the indoor image/outdoor image determining method of the embodiment.  In FIG. 8, the indoor image/outdoor image determining method of the segmented images Fun (1 ≦ n ≦ N/2) corresponding to the area in the upper half of the first image 21 and the segmented images Fbn (1 ≦ n ≦ N/2) corresponding to the area in the bottom half of the first image 21 will be described.
      First, the determining unit 8 determines whether or not an image showing a blue sky is included in the segmented images Fun (1 ≦ n ≦ N/2) (step S21).  The determining unit 8 compares, for example, the segmented images Fun with an image showing a blue sky stored in advance, and determines whether or not the image showing the blue sky is included in the segmented images Fun based on the similarity between the two images.  If the image showing the blue sky is included (step S21, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
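      By way of illustration only, one possible similarity measure for such a comparison is a normalized histogram intersection, as sketched below; the embodiment does not fix any particular similarity measure, and the reference image here is a hypothetical stand-in.

```python
import numpy as np

def histogram_similarity(img_a: np.ndarray, img_b: np.ndarray, bins: int = 32) -> float:
    """Normalized histogram intersection between two 8-bit images;
    values close to 1 mean similar brightness distributions."""
    ha, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return float(np.minimum(ha, hb).sum())

reference_blue_sky = np.full((16, 16), 200)   # stand-in for the reference image stored in advance
candidate = np.full((16, 16), 201)
print(histogram_similarity(candidate, reference_blue_sky))   # 1.0: both uniform patches fall in the same bin
```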
      If the image showing the blue sky is not included (step S21, No), the determining unit 8 determines whether or not an image showing a cloud is included in the segmented images Fun (step S22).  If the image showing the cloud is included (step S22, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
      If the image showing the cloud is not included (step S22, No), the determining unit 8 determines whether or not an image showing a sunrise glow or a sunset glow is included in the segmented images Fun (step S23).  If the image showing the sunrise glow or the sunset glow is included (step S23, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
      If the image showing the sunrise glow or the sunset glow is not included (step S23, No), the determining unit 8 determines whether an image showing a night sky is included in the segmented images Fun (step S24).  If the image showing the night sky is included (step S24, Yes), the determining unit 8 determines that the category of the segmented images Fun is outdoor (step S26).
      If the image showing the night sky is not included (step S24, No), the determining unit 8 determines that the category of the segmented images Fun is indoor (step S25).
      The determining unit 8 determines the category of each segmented image Fbn (1 ≦ n ≦ N/2) in the bottom half by using the segmented image Fun (1 ≦ n ≦ N/2) in the upper half located at the same horizontal position.  For example, if the segmented image Fu1 = outdoor, it is assumed that the segmented image Fb1 = outdoor, and if the segmented image Fu2 = indoor, it is assumed that the segmented image Fb2 = indoor.
      The flowchart of FIG. 8 also applies to segmented images Run (1 ≦ n ≦ N/2) corresponding to the area in the upper half of the second image 22 and segmented images Rbn (1 ≦ n ≦ N/2) corresponding to the area in the bottom half of the second image 22.
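      By way of illustration only, the decision chain of FIG. 8 and the category inheritance for the bottom half may be sketched as follows; the predicate functions stand in for the comparisons with pre-stored reference images and are hypothetical.

```python
from typing import Callable, List, Sequence

def classify_upper_segment(segment: dict, sky_checks: Sequence[Callable[[dict], bool]]) -> str:
    """Decision chain of FIG. 8: if any sky-like check (blue sky, cloud,
    sunrise/sunset glow, night sky) matches, the segment is classified as
    outdoor; otherwise it is classified as indoor."""
    return "outdoor" if any(check(segment) for check in sky_checks) else "indoor"

def classify_bottom_segments(upper_categories: List[str]) -> List[str]:
    """Each bottom-half segment Fbn inherits the category of the
    upper-half segment Fun at the same horizontal position."""
    return list(upper_categories)

# Dummy predicates standing in for the comparisons with reference images.
checks = [lambda s: s.get("blue_sky", False),
          lambda s: s.get("cloud", False),
          lambda s: s.get("glow", False),
          lambda s: s.get("night_sky", False)]
upper = [classify_upper_segment(s, checks) for s in ({"blue_sky": True}, {}, {"cloud": True})]
print(upper, classify_bottom_segments(upper))
# ['outdoor', 'indoor', 'outdoor'] ['outdoor', 'indoor', 'outdoor']
```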
      Next, details of a boundary image determining method in step S6 will be described.
      FIG. 9 is a flowchart illustrating an example of the boundary image determining method of the embodiment.  In FIG. 9, a case will be described where the boundary of the area in the upper half of the first image 21 (segmented images Fun (1 ≦ n ≦ N/2)) is determined.  The flowchart of FIG. 9 also applies to the area in the bottom half of the first image 21 (segmented images Fbn (1 ≦ n ≦ N/2)), the area in the upper half of the second image 22 (segmented images Run (1 ≦ n ≦ N/2)), and the area in the bottom half of the second image 22 (segmented images Rbn (1 ≦ n ≦ N/2)).
      First, the determining unit 8 assigns a value of 1 to a variable x (step S41).  Next, the determining unit 8 determines whether or not Fux(S) = indoor and Fux+1(S) = outdoor (step S42).  Here, Fux(S) indicates the category (outdoor or indoor) of the segmented image Fux, determined by the flowchart of FIG. 8.
      If Fux(S) = indoor and Fux+1(S) = outdoor (step S42, Yes), the determining unit 8 determines that Fux(S) = boundary (step S44).  Next, the processing proceeds to step S45.
      If Fux(S) = indoor and Fux+1(S) = outdoor do not apply (step S42, No), the determining unit 8 determines whether or not Fux(S) = outdoor and Fux+1(S) = indoor (step S43).
      If Fux(S) = outdoor and Fux+1(S) = indoor (step S43, Yes), the determining unit 8 determines that Fux(S) = boundary (step S44).  Next, the processing proceeds to step S45.
      If Fux(S) = outdoor and Fux+1(S) = indoor do not apply (step S43, No), the processing proceeds to step S45.
      Next, the determining unit 8 determines whether or not x + 1 = N/2 (step S45).  If x + 1 = N/2 does not apply (step S45, No), the determining unit 8 assigns x + 1 to x (step S46) and the processing returns to step S42.  If x + 1 = N/2 (step S45, Yes), the processing is finished.
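      By way of illustration only, the boundary determination of FIG. 9 may be sketched as the following loop over the category sequence; representing the categories as a list of strings is an assumption made for the sketch.

```python
from typing import List

def mark_boundaries(categories: List[str]) -> List[str]:
    """Loop of FIG. 9: whenever two horizontally adjacent segments switch
    between indoor and outdoor, the first segment of the pair is
    re-labelled as a boundary image."""
    result = list(categories)
    for x in range(len(categories) - 1):
        if {categories[x], categories[x + 1]} == {"indoor", "outdoor"}:
            result[x] = "boundary"
    return result

print(mark_boundaries(["indoor", "indoor", "outdoor", "outdoor"]))
# ['indoor', 'boundary', 'outdoor', 'outdoor']
```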
      Next, details of the image processing method in step S7 will be described.  In FIG. 10, a case will be described, where the image of the area in the upper half of the first image 21 (segmented images Fun (1 ≦ n ≦ N/2)) is processed.  The flowchart of FIG. 10 also applies to the area in the bottom half of the first image 21 (segmented images Fbn (1 ≦ n ≦ N/2)), the area in the upper half of the second image 22 (segmented images Run (1 ≦ n ≦ N/2)), and the area in the bottom half of the second image 22 (segmented images Rbn (1 ≦ n ≦ N/2)).
      FIG. 10 is a flowchart illustrating an example of the image processing method of the embodiment.  First, the determination unit 9 assigns a value of 1 to a variable x (step S61).  Next, the determination unit 9 determines whether or not Fux(S) = outdoor (step S62).  Here, Fux(S) indicates the category (outdoor, indoor, or boundary) of the segmented image Fux, determined by the flowcharts of FIGS. 8 and 9.
      If Fux(S) = outdoor (step S62, Yes), the determination unit 9 recognizes the optimum scene among the scenes in the outdoor category (step S63) and determines the scene of the segmented image Fux.  Scenes in the outdoor category include, for example, autumn leaves, snow, clouds, woods, and the like.  Next, the processing unit 10 subjects the segmented image Fux to image processing optimized for each scene (step S64).  Specifically, the processing unit 10 processes the segmented image Fux by using a correction parameter according to the scene determined in step S63.  Next, the processing proceeds to step S69.
      If Fux(S) = outdoor does not apply (step S62, No), the determination unit 9 determines whether or not Fux(S) = boundary (step S65).
      If Fux(S) = boundary does not apply (step S65, No), the determination unit 9 recognizes a scene among the scenes in the indoor category (step S66), and determines the scene of the segmented image Fux.  Scenes in the indoor category include, for example, cooking, furniture, and the like.  Next, the processing unit 10 subjects the segmented image Fux to image processing optimized for each scene (step S67).  Specifically, the processing unit 10 processes the segmented image Fux by using a correction parameter according to the scene determined in step S66.  Next, the processing proceeds to step S69.
      If Fux(S) = boundary (step S65, Yes), the processing unit 10 processes the segmented image Fux by the intermediate-value correction parameter between the correction parameters used for processing the adjacent segmented images.  The processing unit 10 processes the image, for example, by the intermediate-value correction parameter between the correction parameter used for the outdoor image Fux-1 adjacent to the boundary image Fux and the correction parameter used for the indoor image Fux+1 adjacent to the boundary image (step S68).  Next, the processing proceeds to step S69.
      Next, the determination unit 9 determines whether or not x + 1 = N/2 (step S69).  If x + 1 = N/2 does not apply (step S69, No), the determination unit 9 assigns x + 1 to x (step S70) and the processing returns to step S62.  If x + 1 = N/2 (step S69, Yes), the processing is finished.
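      By way of illustration only, the parameter selection of FIG. 10 may be sketched as follows; the scene-to-parameter tables contain made-up gain values, since the embodiment does not disclose concrete correction parameters.

```python
# Purely illustrative gain-style correction parameters per scene.
OUTDOOR_PARAMS = {"autumn leaf": 1.2, "snow": 1.4, "cloud": 1.1, "wood": 1.0}
INDOOR_PARAMS = {"cooking": 0.9, "furniture": 1.0}

def correction_parameter(categories, scenes, x):
    """FIG. 10, simplified: an outdoor or indoor segment takes the
    parameter of its recognized scene, and a boundary segment takes the
    intermediate value of its two neighbours' parameters (the neighbours
    are assumed here not to be boundary images themselves)."""
    category = categories[x]
    if category == "boundary":
        return (correction_parameter(categories, scenes, x - 1)
                + correction_parameter(categories, scenes, x + 1)) / 2.0
    table = OUTDOOR_PARAMS if category == "outdoor" else INDOOR_PARAMS
    return table[scenes[x]]

cats = ["outdoor", "boundary", "indoor"]
scns = ["snow", None, "cooking"]
print(correction_parameter(cats, scns, 1))   # (1.4 + 0.9) / 2 = 1.15
```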
      Finally, an exemplary hardware configuration of the imaging device 100 of the embodiment will be described.
      FIG. 11 is a diagram illustrating an example of the hardware configuration of the imaging device 100 of the embodiment.  The imaging device 100 of the embodiment includes the imaging element 101, the fish-eye lens 102, the imaging element 103, the fish-eye lens 104, an image processing device 105, a CPU 106, an acceleration sensor 107, a display device 108, an operation button 109, and a storage medium 111.  The imaging device 100 can also communicate with a smart device 110.
      The imaging element 101 forms an image with light entering through the fish-eye lens 102.  A target image formed on the imaging element 101 is converted into an image signal (electric signal) showing the first image 21.  The imaging element 101 inputs the image signal into the image processing device 105.  The imaging element 101 and the fish-eye lens 102 correspond to the first imaging unit 1 (see FIG. 2).
      The imaging element 103 forms an image with light entering through the fish-eye lens 104.  A target image formed on the imaging element 103 is converted into an image signal (electric signal) showing the second image 22.  The imaging element 103 inputs the image signal into the image processing device 105.  The imaging element 103 and the fish-eye lens 104 correspond to the second imaging unit 2 (see FIG. 2).
      The image processing device 105 and the CPU 106 correspond to the combining unit 4, the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13 (see FIG. 2).  Function blocks implemented in hardware are realized by the image processing device 105, and function blocks implemented in software (as a program) are realized by the CPU 106.  Which of these function blocks are realized by hardware and which by software can be determined arbitrarily.
      The acceleration sensor 107 detects the zenith direction of the imaging device 100 at the time of imaging.  Information detected by the acceleration sensor 107 is processed by the first correction unit 6 realized by the image processing device 105 or the CPU 106.
      The display device 108, the operation button 109, and the smart device 110 are user interfaces for operating the imaging device 100.  The display device 108, the operation button 109, and the smart device 110 correspond to the display unit 14 and the selection unit 15 (see FIG. 2).
      The storage medium 111 stores, for example, the first image 21, the second image 22, and the whole celestial sphere image 23.  The storage medium 111 corresponds to the storage unit 3 (see FIG. 2).
      When a part or all of the combining unit 4, the first correction unit 6, the segmentation unit 7, the determining unit 8, the determination unit 9, the processing unit 10, the second detection unit 11, the calculation unit 12, and the second correction unit 13 are realized as a program, a program modularly configured to include a part or all of these units is stored in the storage medium 111.  When the CPU 106 executes the program stored in the storage medium 111, a part or all of these units are loaded and realized on a RAM not illustrated in FIG. 11.
      The program may be recorded in a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disc (DVD), or a universal serial bus (USB) memory, and provided as a file in an installable or executable format.  The program may also be provided or distributed via a network such as the Internet.
      As described above, in the imaging device of the embodiment, the segmentation unit 7 segments the whole celestial sphere image into a plurality of segmented images.  The determination unit 9 determines the scene of each segmented image.  Thus, the processing unit 10 can process each segmented image by using the correction parameter according to the scene.  Therefore, even when a plurality of scenes is included in one whole celestial sphere image, an optimum quality whole celestial sphere image can be generated.
      Even when the present technique is applied to an imaging device such as a single-lens reflex camera or a smartphone, the segmentation unit 7 segments an image into a plurality of segmented images and the determination unit 9 determines the scene of each segmented image.  Thus, the processing unit 10 can process each segmented image by using the correction parameter according to the scene.  Therefore, even when a plurality of scenes is included in one image, an optimum quality image can be generated.
      According to the embodiment of the present invention, an optimum quality image can be generated even when a plurality of scenes is included in an image.
      Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
1  first imaging unit
2  second imaging unit
3  storage unit
4  combining unit
5  first detection unit
6  first correction unit
7  segmentation unit
8  determining unit
9  determination unit
10  processing unit
11  second detection unit
12  calculation unit
13  second correction unit
14  display unit
15  selection unit
Japanese Laid-open Patent Publication No. 2013-198070

Claims (11)

  1.       An imaging device comprising:
          an imaging unit that captures an image;
          a segmentation unit that segments the image into a plurality of segmented images;
          a determination unit that determines a scene of each of the segmented images; and
          a processing unit that processes each of the segmented images by a correction parameter according to the scene.
  2.       The imaging device according to claim 1, further comprising a determining unit that determines a category of the segmented image showing any one of an outdoor image obtained by imaging outdoors, an indoor image obtained by imaging indoors, and a boundary image positioned at a boundary between the outdoor image and the indoor image, wherein
          the determination unit determines the scene by recognizing an optimum scene among a plurality of scenes associated with the category.
  3.       The imaging device according to claim 1 or 2, wherein
          the segmentation unit segments the image into 2N pieces of the segmented images by segmenting the image into N pieces (N is an integer equal to or greater than two) in a vertical direction and two pieces in a horizontal direction, and
          the determining unit uses each of the segmented images in the upper half and determines a category of each of the segmented images in the bottom half, a position in the horizontal direction of which is the same.
  4.       The imaging device according to claim 2 or 3, wherein
          the processing unit processes an image by an intermediate-value correction parameter between the correction parameter used for the outdoor image adjacent to the boundary image and the correction parameter used for the indoor image adjacent to the boundary image, when a category of the segmented image is the boundary image.
  5.       The imaging device according to claim 3 or 4, further comprising:
          a second detection unit that detects luminance of the segmented image;
          a calculation unit that calculates, for each of the segmented images in the upper half and in the bottom half, a difference in luminance between the segmented images in the upper half and those in the bottom half where positions in the horizontal direction are the same; and
          a second correction unit that makes different corrections for gradation characteristics between the segmented images in the upper half and the segmented images in the bottom half, in a case where the number of sets of the segmented images in the upper half and the segmented images in the bottom half, the luminance difference between which is greater than a first threshold, is greater than a second threshold.
  6.       The imaging device according to any one of claims 2 to 5, further comprising:
          a display unit that displays one or more types of the scenes included in the image;
          a selection unit that selects the scene among the one or more types of the scenes; and
          a storage unit that stores one or more of the segmented images of the selected scene.
  7.       The imaging device according to claim 6, wherein
          the display unit displays one or more of the segmented images of the scene selected by the selection unit.
  8.       The imaging device according to any one of claims 1 to 7, wherein
          the imaging unit comprises:
          a first imaging unit that captures a first image; and
          a second imaging unit that captures a second image,
          and the imaging device further comprises a combining unit that generates a whole celestial sphere image showing images in all directions of an imaging spot by combining the first and second images.
  9.       The imaging device according to claim 8, further comprising:
          a first detection unit that detects a zenith direction of the imaging device at the time of imaging; and
          a first correction unit that corrects a deviation in a zenith direction of the whole celestial sphere image based on the zenith direction of the imaging device.
  10.       An image processing device comprising:
          a segmentation unit that segments an image into a plurality of segmented images;
          a determination unit that determines a scene of each of the segmented images; and
          a processing unit that processes each of the segmented images by a correction parameter according to the scene.
  11.       An imaging method, comprising the steps of:
          capturing an image by an imaging unit;
          segmenting, by a segmentation unit, the image into a plurality of segmented images;
          determining, by a determination unit, a scene of each of the segmented images; and
          processing, by a processing unit, each of the segmented images by a correction parameter according to the scene.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014182630A JP2016058840A (en) 2014-09-08 2014-09-08 Imaging apparatus, image processing system, imaging method, and program
JP2014-182630 2014-09-08

Publications (1)

Publication Number Publication Date
WO2016038886A1 true WO2016038886A1 (en) 2016-03-17

Family

ID=55458656

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/004566 WO2016038886A1 (en) 2014-09-08 2015-09-08 Imaging device, image processing device, and imaging method

Country Status (2)

Country Link
JP (1) JP2016058840A (en)
WO (1) WO2016038886A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108462833A (en) * 2018-03-26 2018-08-28 北京小米移动软件有限公司 Image pickup method, device and computer readable storage medium
US10666863B2 (en) 2018-05-25 2020-05-26 Microsoft Technology Licensing, Llc Adaptive panoramic video streaming using overlapping partitioned sections
US10764494B2 (en) 2018-05-25 2020-09-01 Microsoft Technology Licensing, Llc Adaptive panoramic video streaming using composite pictures
US10943328B2 (en) 2018-08-27 2021-03-09 Canon Kabushiki Kaisha Image capturing apparatus, method for controlling same, and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7447403B2 (en) * 2018-07-27 2024-03-12 大日本印刷株式会社 Information processing device, information processing system, information processing method and program
EP3719529A1 (en) 2019-03-20 2020-10-07 Ricoh Company, Ltd. Range finding device and range finding method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007336099A (en) * 2006-06-13 2007-12-27 Canon Inc Imaging apparatus and imaging method
US20130121578A1 (en) * 2011-11-10 2013-05-16 Canon Kabushiki Kaisha Image processing apparatus and control method therefor
WO2014042104A1 (en) * 2012-09-11 2014-03-20 Ricoh Company, Ltd. Imaging controller and imaging control method and program


Also Published As

Publication number Publication date
JP2016058840A (en) 2016-04-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15840024; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15840024; Country of ref document: EP; Kind code of ref document: A1)