WO2015166684A1 - Image processing apparatus and image processing method - Google Patents
- Publication number
- WO2015166684A1 (PCT/JP2015/053947)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- information
- image
- setting information
- illumination setting
- Prior art date
Classifications
- G03B15/02—Illuminating scene
- G06T15/506—Illumination models
- G06T5/92—
- G06T7/593—Depth or shape recovery from multiple images from stereo images
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
- H04N23/62—Control of parameters via user interfaces
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
- G03B35/08—Stereoscopic photography by simultaneous recording
- G06T2207/10012—Stereo images
- G06T2207/20092—Interactive image processing based on input by user
- G06T7/70—Determining position or orientation of objects or cameras
- H04N13/133—Equalising the characteristics of different image components, e.g. their average brightness or colour balance
Definitions
- This technology relates to an image processing apparatus and an image processing method, and makes it possible to easily set an illumination environment in relighting.
- An image processing technique called relighting, which constructs an illumination environment different from the one at the time of photographing and performs illumination and drawing processing again, is used in image editing and the like.
- The information necessary for relighting is roughly divided into the shape of the subject, the reflection characteristics of the subject, and the position of the light source, which are the same elements used for rendering general three-dimensional computer graphics.
- For the shape of the subject, for example, a method is used in which the depth is estimated by stereo matching using a plurality of cameras and a model is restored from the set of three-dimensional coordinates corresponding to each pixel.
- For the reflection characteristics of the subject and the light source position, for example, a method using the reflection of the light source in a spherical mirror has been proposed.
- Patent Document 1 discloses that only the intensity of the light source is automatically adjusted.
- Patent Document 2 discloses that randomly generated illuminations are listed and used for rendering in a low frequency region.
- However, the techniques of Patent Document 1 and Patent Document 2 are not applicable to the construction of an arbitrary illumination environment that includes not only the light intensity of the light sources but also their arrangement.
- an object of this technology is to provide an image processing apparatus and an image processing method that can easily set an illumination environment in relighting.
- The first aspect of this technology is an image processing apparatus including: a subject information acquisition unit that acquires captured subject information indicating attributes related to illumination of the subject from a captured image; an illumination setting information selection unit that selects illumination setting information according to a user operation; and an illumination setting information adjustment unit that adjusts the illumination setting information selected by the illumination setting information selection unit into illumination setting information corresponding to the subject, based on the captured subject information acquired by the subject information acquisition unit.
- the subject information acquisition unit acquires, as captured subject information, attributes related to illumination in the subject, such as the three-dimensional shape and reflection characteristics of the subject, from the captured images at different viewpoint positions acquired by the image acquisition unit.
- The lighting setting information selection unit displays a setting selection image related to the lighting setting information and takes the lighting setting information corresponding to the setting selection image selected by the user as the selected lighting setting information. For example, preset information based on the illumination setting information is displayed as the setting selection image, and the illumination setting information corresponding to the preset information selected by the user is used. Alternatively, a preset image is displayed as the setting selection image, and the illumination setting information is acquired from the preset image selected by the user. Alternatively, metadata associated with the illumination setting information is displayed as the setting selection image, and the illumination setting information corresponding to the metadata selected by the user is selected.
- the illumination setting information selection unit selects illumination setting information having imaging mode information that matches the imaging mode information for the subject acquired by the subject information acquisition unit.
- the illumination setting information adjustment unit adjusts the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the imaging subject information acquired by the subject information acquisition unit.
- the illumination setting information is generated using, for example, a preset coordinate system, and the illumination setting information adjustment unit adjusts the illumination setting information to illumination setting information corresponding to the subject by associating the coordinate system with the subject.
- the illumination setting information includes light source information including information on the arrangement of the light source and illumination light, and reference subject information including information on the size and arrangement of the reference subject illuminated by the light source indicated by the light source information.
- The illumination setting information adjustment unit associates the coordinate system with the subject so that the size and arrangement of the reference subject coincide with the subject of the captured image.
- the illumination setting information adjustment unit adjusts the illumination setting information so that the subject image obtained by performing the drawing process using the imaging subject information and the adjusted illumination setting information does not cause saturation.
- The image generation unit performs a drawing process based on the captured subject information acquired by the subject information acquisition unit and the illumination setting information adjusted by the illumination setting information adjustment unit, thereby obtaining a subject image under the illumination environment selected by the user.
- The second aspect of this technology is an image processing method including: a step of acquiring, in the subject information acquisition unit, captured subject information indicating attributes related to illumination of the subject from a captured image; a step of selecting, in the illumination setting information selection unit, illumination setting information according to a user operation; and a step of adjusting, in the illumination setting information adjustment unit, the selected illumination setting information into illumination setting information corresponding to the subject, based on the acquired captured subject information.
- the subject information acquisition unit acquires captured subject information indicating an attribute related to illumination in the subject from the captured image.
- the illumination setting information selection unit selects the illumination setting information according to a user operation.
- The illumination setting information adjustment unit performs processing for converting the selected illumination setting information into illumination setting information corresponding to the subject based on the captured subject information. Thus, when the user selects illumination setting information, the selection is adjusted to suit the subject, and by performing the drawing process using the adjusted illumination setting information it is possible to generate an image corresponding to the case where the image was captured under the desired illumination environment. Therefore, the illumination environment in relighting can be set easily. Note that the effects described in this specification are merely examples; the technology is not limited to them and may have additional effects.
- FIG. 1 is a diagram illustrating the configuration of an image processing apparatus according to the first embodiment. FIG. 2 is a diagram for explaining a method of acquiring the three-dimensional shape of a subject.
- the image processing apparatus of this technology includes a subject information acquisition unit, an illumination setting information selection unit, and an illumination setting information adjustment unit.
- the subject information acquisition unit acquires captured subject information indicating attributes related to illumination in the subject from the captured image.
- the illumination setting information selection unit selects illumination setting information according to a user operation. For example, the illumination setting information selection unit displays a setting selection image associated with the illumination setting information, and sets the illumination setting information corresponding to the setting selection image selected by the user as the illumination setting information selected.
- The illumination setting information adjustment unit adjusts the illumination setting information selected by the illumination setting information selection unit into illumination setting information corresponding to the subject, based on the captured subject information acquired by the subject information acquisition unit.
- In the first embodiment, the case will be described in which preset information based on the illumination setting information is displayed as the setting selection image and the illumination setting information corresponding to the preset information selected by the user is used as the selected illumination setting information.
- FIG. 1 illustrates the configuration of the image processing apparatus according to the first embodiment.
- the image processing apparatus 10 includes an image acquisition unit 11, a subject information acquisition unit 12, a preset information selection unit 21, an illumination setting information adjustment unit 31, an image generation unit 41, and an image display unit 45.
- the preset information selection unit 21 corresponds to an illumination setting information selection unit.
- the image acquisition unit 11 acquires an image capable of estimating the three-dimensional shape of the subject, the reflection characteristics of the subject, and the position of the light source.
- The image acquisition unit 11 acquires captured images used for the estimation of the three-dimensional shape of the subject performed by the subject information acquisition unit 12 described later. For example, when the depth corresponding to each pixel position is obtained from parallax, the image acquisition unit 11 acquires captured images of the subject using a pair of cameras of known specifications arranged side by side at a known interval (a so-called stereo camera).
- For the estimation of the light source position performed by the subject information acquisition unit 12 described later, the image acquisition unit 11 uses, for example, the bright spots generated at different positions on the spherical surface in the right-viewpoint image and the left-viewpoint image of the stereo camera.
- In this case, the captured images of the subject are acquired with the spherical mirror included in the imaging range of the stereo camera.
- The image acquisition unit 11 is not limited to a configuration using a stereo camera; it may be configured to read a captured image acquired by a stereo camera from a recording medium, or to acquire captured-image data from an external device via a network or the like.
- the subject information acquisition unit 12 acquires imaging subject information indicating attributes related to illumination in the subject from the captured image acquired by the image acquisition unit 11.
- the subject information acquisition unit 12 acquires, for example, information indicating the three-dimensional shape of the subject (hereinafter referred to as “subject shape information”) and information indicating the reflection characteristics (hereinafter referred to as “reflection characteristic information”) as the imaging subject information.
- As a method for acquiring the subject shape information, the subject information acquisition unit 12 calculates the parallax of each pixel of the subject captured by the stereo camera and obtains the depth corresponding to each pixel position from the parallax.
- FIG. 2 is a diagram for explaining a method for acquiring a three-dimensional shape of a subject.
- the stereo camera is set so that the optical axes of the right camera and the left camera are parallel to each other.
- A point (X, Y, Z) on the subject is projected to the point (xL, yL) on the imaging surface of the left camera and to the point (xR, yR) on the imaging surface of the right camera.
- Using the point (xL, yL) and the point (xR, yR), the point (X, Y, Z) can be expressed by equations (1) to (3): X = B·xL/(xL - xR) (1), Y = B·yL/(xL - xR) (2), Z = B·f/(xL - xR) (3).
- Equation (1) to (3) “f” represents the focal length of the camera, and “B” represents the distance between the base lines (the distance between the optical axes of the right camera and the left camera).
- This horizontal coordinate shift amount (xL - xR) between the left and right camera imaging surfaces is called parallax.
- Since the depth can be expressed from the parallax using equations (1) to (3), the depth of the subject can be obtained if the parallax can be obtained for all pixels on the imaging surface.
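Assuming the parallel-axis stereo geometry above, equations (1) to (3) reduce to a few lines of code; the function name and units below are illustrative, not from the patent.

```python
def triangulate(xL, yL, xR, f, B):
    """Recover (X, Y, Z) from a matched point pair on a rectified stereo pair.

    xL, yL: point on the left imaging surface; xR: its x-coordinate on the
    right imaging surface; f: focal length in pixels; B: baseline distance.
    """
    d = xL - xR        # parallax (horizontal shift between the two images)
    Z = f * B / d      # depth, equation (3)
    X = xL * Z / f     # lateral position, equation (1)
    Y = yL * Z / f     # vertical position, equation (2)
    return X, Y, Z
```

A larger parallax d means the point is closer to the cameras; as d approaches zero, the estimated depth grows without bound, which is why distant points are hard to measure with a short baseline.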
- For this purpose, a block BR of peripheral pixels is set around the reference point for which the parallax is to be obtained (here, the point (xR, yR) on the right camera image).
- A block BL of the same size as the block BR is set on the left camera image, and a method is generally used in which the position (xL, yL) corresponding to the point (xR, yR) is found by searching for the position of the block BL with the highest similarity to the block BR.
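A minimal sketch of such block matching, using a sum-of-absolute-differences (SAD) similarity over a rectified pair; the SAD criterion, parameter names, and search range are illustrative choices, not specified by the patent.

```python
import numpy as np

def match_block(right_img, left_img, xR, yR, half=2, max_disp=16):
    """Return the parallax at (xR, yR) by searching along the same scanline."""
    # Block BR around the reference point on the right camera image.
    block = right_img[yR - half:yR + half + 1, xR - half:xR + half + 1]
    best, best_x = float("inf"), xR
    # Slide block BL along the left image scanline and keep the best match.
    for xL in range(xR, min(xR + max_disp, left_img.shape[1] - half)):
        cand = left_img[yR - half:yR + half + 1, xL - half:xL + half + 1]
        sad = np.abs(block.astype(int) - cand.astype(int)).sum()
        if sad < best:
            best, best_x = sad, xL
    return best_x - xR  # parallax (xL - xR)
```

Real implementations add sub-pixel refinement and left-right consistency checks; this sketch only shows the search itself.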
- the subject information acquisition unit 12 acquires light source information used when acquiring reflection characteristic information.
- the subject information acquisition unit 12 acquires the position of the light source by using, for example, the bright spot of the spherical mirror included in the captured image obtained by the stereo camera.
- For example, the technique disclosed in the document “Light Source Estimation from Spherical Reflections” (Schnieders, 2011) or the like can be used.
- FIG. 4 illustrates imaging of a spherical mirror using a camera.
- FIG. 5 schematically shows a bright spot when a spherical mirror is imaged with a stereo camera.
- FIG. 5A shows the bright spot in the left camera image, and FIG. 5B shows the bright spot in the right camera image.
- Because the camera positions differ, the bright spots appear at different positions on the spherical surface in each captured image.
- Note that the depth of the spherical mirror, and thus its three-dimensional position, can be obtained using the size of the circle extracted by circle detection on the captured image.
- FIG. 6 is a diagram for explaining the relationship between the bright spot of the spherical mirror and the light source direction.
- the vector VE is a ray vector from the viewpoint EY toward the bright spot
- the vector VN is a normal vector of the sphere at the bright spot P.
- the incident light vector VL indicating the direction of the incident light from the light source can be calculated based on Expression (4).
- FIG. 6 shows that the angle formed by the ray vector VE and the normal vector VN and the angle formed by the normal vector VN and the incident light vector VL are both equal to the angle Ag.
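One common reading of Expression (4) is a mirror reflection of the view ray about the sphere normal at the bright spot; the sign convention below (VE pointing from the viewpoint toward the surface, the result pointing away from the surface toward the light) is an assumption, since the patent does not reproduce the expression here.

```python
import numpy as np

def incident_light_vector(VE, VN):
    """Reflect the view ray VE about the unit normal VN (law of reflection).

    The returned vector makes the same angle Ag with VN as VE does,
    consistent with the equal-angle relation shown in FIG. 6.
    """
    VE, VN = np.asarray(VE, float), np.asarray(VN, float)
    return VE - 2.0 * np.dot(VE, VN) * VN
```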
- The three-dimensional positions PMR and PML can be obtained as the intersections of the ray vectors from the cameras toward the bright spots with the mirror surface. The position of the light source Lm can then be expressed by the simultaneous equations (6) and (7) using the three-dimensional positions PMR and PML.
- the normal vector VNR is a sphere normal vector at the bright spot PR
- the normal vector VNL is a sphere normal vector at the bright spot PL.
- By determining the coefficients kR and kL in equations (6) and (7) according to the principle of triangulation, the position of the light source can be calculated. Further, the intensity and color of the light source can be obtained from the color of each bright spot on the mirror.
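Solving the simultaneous equations (6) and (7) amounts to triangulating the point where the two reflected rays (one per camera) come closest. A least-squares formulation is one way to do it; the function and parameter names below are illustrative.

```python
import numpy as np

def light_position(PMR, VR, PML, VL):
    """Triangulate the light source from two reflected rays.

    PMR, PML: 3D bright-spot positions on the mirror (right/left camera);
    VR, VL: unit direction vectors of the rays reflected toward the light.
    Solves PMR + kR*VR ≈ PML + kL*VL for kR, kL in least squares.
    """
    PMR, VR = np.asarray(PMR, float), np.asarray(VR, float)
    PML, VL = np.asarray(PML, float), np.asarray(VL, float)
    A = np.stack([VR, -VL], axis=1)          # 3x2 system matrix
    b = PML - PMR
    (kR, kL), *_ = np.linalg.lstsq(A, b, rcond=None)
    # Midpoint of the two closest points (identical when the rays intersect).
    return (PMR + kR * VR + PML + kL * VL) / 2.0
```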
- the subject information acquisition unit 12 acquires subject reflection characteristic information using the subject shape information and the light source information.
- the technique disclosed in the document “Principles of Appearance Acquisition and Representation, Weyrich et al., 2008” can be used.
- The reflection characteristics are generally estimated by assuming a reflection model of the object in advance and obtaining the parameters of that model.
- In the present technology, it is assumed that one object can be expressed by a uniform bidirectional reflectance distribution function (BRDF).
- For example, the Phong model is used. The Phong model expresses the reflection characteristics of an object as a superposition of an ambient light component, a diffuse reflection component, and a specular reflection component.
- The luminance Iu observed when a certain point u on the object surface is viewed is expressed by equation (8).
- the intensity ia is the intensity of the ambient light
- the intensity im, d is the intensity of the diffuse reflection light from the light source m
- the intensity im, s is the intensity of the specular reflection light from the light source m.
- the incident light vector VLm is a vector indicating the incidence from the light source m to the subject.
- the normal vector VN u is a normal vector of the subject at the point u
- the vector Vrm is a reflected light vector on the subject surface
- the vector VF u is a vector from the point u to the observation point.
- the coefficient q is a parameter for adjusting the intensity distribution of specular reflection.
- the coefficients ka, kd, and ks are the intensity coefficients of each component according to the material of the object, and in order to obtain the reflection characteristics, these intensity coefficients may be estimated.
- From the subject shape information, the vector VFu and the normal vector VNu can be calculated at each point u of the subject. Further, since the light source information clarifies the position of each light source and the color and intensity of its light, the incident light vector VLm and the vector Vrm can be calculated, and the intensities im,d and im,s can also be regarded as known. Since the intensity ia is generally expressed as the sum of the intensities of the light sources, it too can be calculated.
- The above equation is calculated for each point u on the subject, and under the assumption that the BRDF of the subject is uniform, the coefficients ka, kd, ks, and q can be solved simultaneously. That is, information indicating the reflection characteristics of the subject can be acquired by solving for these coefficients.
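The simultaneous solution above can be sketched as a linear least-squares fit once the specular exponent q is held fixed (q itself can then be found by a one-dimensional search); this simplification and the variable names are mine, not the patent's. Each sample corresponds to one observed point u, with the dot products of the Phong model precomputed from the shape and light source information.

```python
import numpy as np

def fit_phong(ia, id_, is_, cos_d, cos_s, q, I_obs):
    """Fit the Phong intensity coefficients (ka, kd, ks) for a fixed q.

    ia:    ambient intensity per sample
    id_:   diffuse source intensity im,d per sample
    is_:   specular source intensity im,s per sample
    cos_d: diffuse dot product (incident light vs. normal), clamped below
    cos_s: specular dot product (reflected light vs. view), clamped below
    I_obs: observed luminance Iu per sample
    """
    A = np.stack([
        ia,                                  # ambient column  -> ka
        id_ * np.maximum(cos_d, 0.0),        # diffuse column  -> kd
        is_ * np.maximum(cos_s, 0.0) ** q,   # specular column -> ks
    ], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, I_obs, rcond=None)
    return coeffs  # (ka, kd, ks)
```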
- Alternatively, information indicating the reflection characteristics according to the material may be obtained by specifying the material of the subject and adjusting the intensity of the diffuse reflection light and the intensity of the specular reflection light based on that material. For example, when the material is metal, the reflection characteristic information is acquired by increasing the intensity of the specular reflection light.
- the preset information selection unit 21 in FIG. 1 selects preset information according to a user operation.
- the preset information selection unit 21 stores one or more pieces of preset information indicating light source information in advance.
- the preset information selection unit 21 outputs the preset information selected according to the user operation to the illumination setting information adjustment unit 31.
- FIG. 7 shows an example of preset information elements as a list.
- the preset information includes, for example, light source information and reference subject information.
- the light source information includes light source number information and light source attribute information.
- the light source number information indicates the number of light sources used for illumination.
- the light source attribute information is provided for each number of light sources indicated by the light source number information.
- the light source attribute information includes, for example, position information, direction information, intensity information, color information, type information, color temperature information, and the like.
- the position information indicates the distance of the light source from the reference center Op for each reference coordinate axis xr, yr, zr in the preset coordinate system.
- the direction information indicates the emission direction of the irradiation light from the light source. For example, the direction is indicated by a normalized vector.
- the intensity information indicates the brightness of the light emitted from the light source.
- the brightness is indicated in units of lumens.
- the color information indicates the color of the irradiation light from the light source.
- the color is indicated as the value of each color component of the three primary color components.
- the type information indicates what type of light source the light source is, for example, a point light source, a surface light source, a spot light source, a parallel light source, a hemispheric environmental light source, or the like.
- the color temperature information indicates the color temperature of the illumination light from the light source. For example, white 6500K (Kelvin temperature) is used as the standard color temperature. When there is no color temperature setting, a standard color temperature is used.
- the reference subject information includes, for example, size information, gravity center information, coordinate axis information, material information, imaging mode information, and the like.
- The size information indicates the radius of a sphere circumscribing the subject assumed in the preset information. Note that, in order to distinguish it from the captured subject, the subject assumed in the preset information is called the reference subject.
- the center-of-gravity information is information indicating where the center of gravity of the reference subject is located in the preset coordinate system.
- the coordinate axis information is information indicating which direction in the preset coordinate system the main three axes of the reference subject are associated with. For example, in the preset coordinate system, the three main axes of the reference subject are indicated as vector quantities.
- The material information is information for setting the material as which the captured subject is to be relighted. If no material is defined, the material of the captured subject, that is, the material specified by the estimation of the reflection characteristics, is used.
- the imaging mode information is information used for determining whether the imaging mode corresponds to the captured subject.
- FIG. 8 shows the light source information in diagram form; for example, the reference center Op, the reference coordinate axes xr, yr, zr, and the position, type, and intensity of each light source are shown.
- the reference radius Rd is a radius of a sphere that circumscribes the reference subject when the center of gravity of the reference subject is the reference center Op of the preset coordinate system.
- FIG. 9 shows a specific example of preset information in which a person is relighted using two light sources.
- The light source 1 is provided at the position (1, 0, 1) [unit: m], and the illumination direction is set to (-0.707, 0, -0.707).
- the brightness of the light source 1 is 3000 lumens, and the illumination color is white.
- The light source 1 is a surface light source with a size of (0.3 × 0.3) [unit: m], and its color temperature is set to 6500 K.
- The light source 2, which is the same type of light source as the light source 1, is provided at the position (-0.8, 0, 1) [unit: m], and its illumination direction is set to (0.625, 0, -0.781).
- The center of gravity of the reference subject is placed at the reference center of the preset coordinate system, and the coordinate axis information matches the three main axes of the reference subject with the preset coordinate system.
- The size of the reference subject is set to 0.5 m.
- Further, when relighting a person, the material of the subject whose lighting environment is changed is skin, and the imaging mode is portrait.
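The preset of FIG. 9 can be represented with simple containers; the field names and types below are illustrative, not defined by the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LightSource:
    position: Tuple[float, float, float]   # [m], in the preset coordinate system
    direction: Tuple[float, float, float]  # normalized emission direction
    intensity: float                       # lumens
    color: Tuple[int, int, int] = (255, 255, 255)  # three primary components
    type: str = "surface"                  # point / surface / spot / parallel / ...
    color_temperature: float = 6500.0      # Kelvin (standard white when unset)

@dataclass
class Preset:
    lights: List[LightSource]
    subject_radius: float = 0.5            # circumscribing sphere radius [m]
    material: str = "skin"
    imaging_mode: str = "portrait"

# The two-light portrait preset described above (FIG. 9).
portrait = Preset(lights=[
    LightSource((1.0, 0.0, 1.0), (-0.707, 0.0, -0.707), 3000.0),
    LightSource((-0.8, 0.0, 1.0), (0.625, 0.0, -0.781), 3000.0),
])
```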
- the preset information selection unit 21 stores a plurality of such preset information in advance, and selects the preset information according to a user operation.
- FIG. 10 illustrates an icon image displayed in correspondence with the preset information on the image display unit 45 of the image processing apparatus 10.
- the icon image is, for example, an image schematically showing the reference subject.
- the icon image may include an image relating to the light source.
- when the reference subject is a person's face, as in FIG. 10A, an icon image is provided that models the person's face together with information on the light sources, for example their number and positions.
- the icon image may be an image that does not include information on the light source.
- FIG. 10C illustrates an icon image when the upper body of a person is imaged.
- FIG. 11 is a diagram for explaining the adjustment process of the illumination setting information performed by the illumination setting information adjusting unit 31.
- the illumination setting information adjustment unit 31 adjusts the preset coordinate system so that the size and arrangement of the reference subject indicated by the preset information match the imaged subject, and adjusts the information related to the light sources in accordance with the adjustment of the coordinate system. For the sake of simplicity, it is assumed that the center of gravity of the reference subject is the reference center of the preset coordinate system and that the coordinate system of the reference subject matches the preset coordinate system.
- as shown in FIG. 11B, the illumination setting information adjustment unit 31 arranges the light sources so that the reference center Op of the preset coordinate system (the center of gravity of the reference subject) coincides with the center of gravity of the imaged subject.
- the illumination setting information adjustment unit 31 scales the reference radius Rd indicated by the preset information (the radius of the sphere circumscribing the reference subject) so as to coincide with the radius of the sphere circumscribing the imaged subject. That is, the illumination setting information adjustment unit 31 changes the scale of the preset light source coordinate system itself, thereby changing the distance from the center of gravity of the imaged subject to each light source.
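The translation-and-scaling step above can be sketched as follows. `adjust_light_positions` is a hypothetical helper, and a uniform scale factor of subject radius over Rd is assumed:

```python
import numpy as np

def adjust_light_positions(preset_positions, reference_radius,
                           subject_center, subject_radius):
    """Move the preset origin to the imaged subject's center of gravity and
    scale the preset light coordinate system so that the reference radius Rd
    matches the radius of the sphere circumscribing the imaged subject."""
    scale = subject_radius / reference_radius
    return [np.asarray(subject_center, float) + scale * np.asarray(p, float)
            for p in preset_positions]

# A light placed 1 m from a 0.5 m reference subject ends up 2 m from a
# subject whose circumscribing sphere has a 1.0 m radius.
lights = adjust_light_positions([(1.0, 0.0, 1.0)], 0.5,
                                subject_center=(0.0, 0.0, 0.0),
                                subject_radius=1.0)
```

Because the scale is applied to the coordinate system itself, the relative arrangement of all light sources around the subject is preserved while their distances change together.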
- the illumination setting information adjustment unit 31 sets coordinate axes for the imaged subject based on the three-dimensional shape of the imaged subject and the imaging mode of the captured image, and associates the coordinate axes of the preset coordinate system with these coordinate axes.
- the illumination setting information adjustment unit 31, for example, extracts a person's face from the captured image and sets the front direction of the face as the reference coordinate axis zr of the preset coordinate system, as illustrated in the figure.
- the illumination setting information adjusting unit 31 determines the vertical upward direction as the reference coordinate axis yr, and the vector orthogonal to these two axis vectors, that is, the direction of the outer product vector of both axes, as the reference coordinate axis xr.
- when the camera front direction is used as the main coordinate axis direction, the illumination setting information adjustment unit 31 sets the camera front direction as the reference coordinate axis zr, the vertical upward direction as the reference coordinate axis yr, and the direction of the vector orthogonal to both axis vectors as the reference coordinate axis xr.
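The axis construction described above amounts to taking a front direction (the face front or the camera front) and the vertical upward direction, and completing the frame with a cross product. A minimal sketch, assuming a right-handed axis convention that the excerpt does not state explicitly:

```python
import numpy as np

def set_reference_axes(front_dir, up_dir):
    """Build the reference coordinate axes: zr is the front direction,
    yr the vertical upward direction, and xr the cross product of the
    two (a right-handed convention is assumed here)."""
    zr = np.asarray(front_dir, float)
    zr /= np.linalg.norm(zr)
    yr = np.asarray(up_dir, float)
    yr /= np.linalg.norm(yr)
    xr = np.cross(yr, zr)
    xr /= np.linalg.norm(xr)
    return xr, yr, zr
```

With the front direction along +z and up along +y, xr comes out along +x, which matches the reference axes xr, yr, zr of FIG. 8.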
- by performing such processing, the illumination setting information adjustment unit 31 adjusts the arrangement of the light sources in accordance with the imaged subject. Furthermore, the illumination setting information adjustment unit 31 or the image generation unit 41 performs a drawing process based on the adjusted light source arrangement.
- the subject information acquisition unit 12 acquires subject shape information and reflection characteristic information.
- the illumination setting information adjustment unit 31 arranges the light sources included in the preset information selected by the user in correspondence with the imaged subject. Therefore, the drawing process is performed, with the same method as conventional computer graphics, using the acquired subject shape information and reflection characteristic information, the adjusted light source arrangement, and the color and intensity of each light source included in the selected preset information. Note that the Phong model is used as the reflection model of the subject, as in the reflection characteristic estimation.
- the illumination setting information adjustment unit 31 calculates the color and brightness of the entire image using the image generated by the drawing process.
- when it determines from the calculated color and luminance of the entire image that the luminance or color is saturated, the illumination setting information adjustment unit 31 adjusts the intensity of the light sources so that the luminance and color are not saturated. For example, when the luminance value of the image generated by the drawing process is saturated, the illumination setting information adjustment unit 31 scales the intensity im of each light source m so that the maximum luminance value IMAX over all pixels matches the maximum value IRANGE of the image dynamic range. Specifically, the corrected intensity imc, which is the intensity after scaling, is calculated based on Equation (9).
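Equation (9) is not reproduced in this excerpt; the proportional form below is one plausible reading of the scaling just described, shown as a sketch rather than the patented formula:

```python
def rescale_intensity(i_m, i_max, i_range):
    """Scale the intensity i_m of light source m so that the maximum
    luminance value IMAX over all pixels lands on the maximum value
    IRANGE of the image dynamic range (assumed form of Equation (9))."""
    return i_m * i_range / i_max

# A source whose rendered peak luminance is 320 on an 8-bit range
# (maximum 255) is attenuated by the factor 255/320.
corrected = rescale_intensity(3000.0, i_max=320.0, i_range=255.0)
```

Applying the same factor to every light source keeps their relative balance while pulling the brightest pixel back inside the dynamic range.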
- the light source information can be automatically optimized based on the preset information selected by the preset information selection unit 21 and the captured image.
- the illumination setting information adjustment unit 31 outputs adjusted preset information optimized according to the captured subject to the image generation unit 41.
- the image generation unit 41 performs a drawing process.
- the image generation unit 41 draws based on the captured image acquired by the image acquisition unit 11, the captured subject information acquired by the subject information acquisition unit 12, and the adjusted preset information supplied from the illumination setting information adjustment unit 31.
- the image generation unit 41 performs a drawing process to generate a captured image in an illumination environment based on the selected preset information and the captured subject, and outputs the captured image to the image display unit 45.
- the image display unit 45 displays the image generated by the image generation unit 41 on the screen.
- the image display unit 45 is associated with the preset information selection unit 21 and may display an icon image or the like indicating preset information that can be selected by the preset information selection unit 21.
- a user operation input means such as a touch panel is provided on the screen of the image display unit 45 to allow the user to select any icon image from the icon images displayed on the screen of the image display unit 45.
- the preset information selection unit 21 selects preset information corresponding to the icon image selected by the user.
- an image display function may be provided in the preset information selection unit 21 to display an icon image or the like indicating preset information that can be selected by the preset information selection unit 21.
- FIG. 12 is a flowchart showing the operation of the first embodiment.
- step ST1 the image processing apparatus 10 acquires an image.
- the image acquisition unit 11 of the image processing apparatus 10 acquires an image capable of estimating the three-dimensional shape of the subject, the reflection characteristics of the subject, and the position of the light source.
- the image acquisition unit 11 acquires, for example, a captured image obtained directly from a stereo camera or a captured image obtained by a stereo camera and recorded on a recording medium. Further, the image acquisition unit 11 may acquire these images via a network or the like.
- the image processing apparatus 10 acquires an image and proceeds to step ST2.
- step ST2 the image processing apparatus 10 calculates the three-dimensional shape of the subject.
- the subject information acquisition unit 12 of the image processing apparatus 10 calculates, for example, the parallax of each pixel of the subject using the image acquired by the image acquisition unit 11, and obtains the depth corresponding to each pixel position based on the parallax to calculate the three-dimensional shape of the imaged subject.
- the image processing apparatus 10 calculates the three-dimensional shape of the subject and proceeds to step ST3.
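The embodiment only states that depth is obtained from the parallax of each pixel; the textbook stereo relation Z = f·B/d is assumed in this sketch:

```python
def depth_from_parallax(disparity_px, focal_length_px, baseline_m):
    """Textbook stereo relation Z = f * B / d: depth is focal length
    (pixels) times baseline (meters) over disparity (pixels).  The
    exact formula used by the embodiment is an assumption here."""
    return focal_length_px * baseline_m / disparity_px

# 10 px of disparity with a 1000 px focal length and a 0.1 m baseline
# corresponds to a depth of 10 m.
z = depth_from_parallax(10.0, 1000.0, 0.1)
```

Evaluating this relation per pixel yields the depth map from which the three-dimensional shape of the imaged subject is built.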
- step ST3 the image processing apparatus 10 calculates the light source position and intensity with respect to the subject.
- the subject information acquisition unit 12 of the image processing apparatus 10 estimates the light source positions based on, for example, the bright spots of the spherical mirror included in the captured image, and calculates the color and intensity of each light source from the color of each bright spot on the spherical mirror.
- the image processing apparatus 10 calculates the light source position and intensity with respect to the subject and proceeds to step ST4.
- step ST4 the image processing apparatus 10 calculates the reflection characteristics of the subject.
- the subject information acquisition unit 12 of the image processing apparatus 10 calculates the reflection characteristics based on the three-dimensional shape of the subject calculated in step ST2 and the light source position and intensity calculated in step ST3, assuming a reflection model of the subject in advance.
- the vector VFu to the observation point and the normal vector VNu of the subject are calculated at each point u of the subject.
- the incident light vector VLm and the reflected light vector Vrm can be calculated.
- the diffuse intensity im,d and the specular intensity im,s are regarded as known, and based on the assumption that the BRDF of the subject is uniform, the intensity coefficients ka, kd, and ks and the coefficient q indicating the specular reflection parameter are calculated using Equation (8).
- the image processing apparatus 10 thus calculates the reflection characteristics of the subject and proceeds to step ST5.
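The excerpt names the Phong model but does not reproduce Equation (8), so the standard textbook form is shown here; the coefficient names ka, kd, ks, and q follow the text, while the function itself is an assumed sketch:

```python
import numpy as np

def phong_intensity(ka, kd, ks, q, ia, id_, is_, L, N, R, V):
    """Generic Phong reflection model: ambient + diffuse + specular terms.
    L is the incident light vector, N the surface normal, R the reflected
    light vector, and V the vector to the observation point."""
    ambient = ka * ia
    diffuse = kd * id_ * max(float(np.dot(L, N)), 0.0)
    specular = ks * is_ * max(float(np.dot(R, V)), 0.0) ** q
    return ambient + diffuse + specular

# Light, normal, reflection, and viewer all aligned: every term
# contributes fully, giving ka*ia + kd*id + ks*is.
i = phong_intensity(0.1, 0.5, 0.3, 2.0, 1.0, 1.0, 1.0,
                    L=np.array([0.0, 0.0, 1.0]), N=np.array([0.0, 0.0, 1.0]),
                    R=np.array([0.0, 0.0, 1.0]), V=np.array([0.0, 0.0, 1.0]))
```

Fitting such a model at many points u, with the light intensities regarded as known, is what allows ka, kd, ks, and q to be solved for under the uniform-BRDF assumption.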
- step ST5 the image processing apparatus 10 selects preset information.
- the preset information selection unit 21 of the image processing apparatus 10 selects preset information from preset information stored in advance according to a user operation.
- for example, as shown in FIG. 13, preset information JS1 to JS3 is displayed on the screen of the image display unit 45, and any of the preset information is selected by the user.
- the image processing apparatus 10 selects preset information in accordance with a user operation, and proceeds to step ST6.
- step ST6 the image processing apparatus 10 adjusts the illumination setting information.
- the illumination setting information adjusting unit 31 of the image processing apparatus 10 adjusts the illumination setting information, that is, the preset information selected in step ST5 according to the three-dimensional shape of the imaged subject. For example, scaling is performed so that the center position of the preset coordinate system matches the center of gravity of the imaged subject, and the reference radius Rd determined by the preset information corresponds to the size of the imaged subject. Furthermore, the coordinate axis of the coordinate system of the preset information is made to correspond to the coordinate axis corresponding to the imaged subject, and the arrangement of the light sources indicated by the preset information is adjusted according to the imaged subject.
- the image processing apparatus 10 adjusts the illumination setting information in this way, and proceeds to step ST7.
- step ST7 the image processing apparatus 10 performs a drawing process.
- the image generation unit 41 of the image processing apparatus 10 performs a drawing process based on the captured image acquired in step ST1, the three-dimensional shape of the subject calculated in step ST2, the reflection characteristics calculated in step ST4, and the preset information adjusted in step ST6.
- the drawing process is performed using a method similar to that of conventional computer graphics.
- the image processing apparatus 10 performs drawing processing to generate an image in which the illumination environment is changed, and proceeds to step ST8.
- step ST8 the image processing apparatus 10 determines whether there is image saturation.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 calculates the color and brightness of the entire image using the image generated by the drawing process.
- the image processing apparatus 10 proceeds to step ST9 when it determines, based on the calculated color and luminance of the entire image, that the luminance or color is saturated, and ends the process when there is no luminance or color saturation.
- step ST9 the image processing apparatus 10 adjusts the light source.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 adjusts the light source so that the luminance and color of the image generated by the drawing process are not saturated. For example, when the luminance is saturated, the illumination setting information adjustment unit 31 performs scaling of the intensity of each light source so that the maximum luminance value in all pixels matches the maximum value of the image dynamic range. Similarly, for the color, the intensity of each light source is scaled. The image processing apparatus 10 performs such light source adjustment and returns to step ST7.
- as described above, the lighting environment is automatically set according to the preset information selected by the user and the imaged subject. Therefore, the image processing apparatus can easily set the lighting environment in relighting, and the user can obtain an image similar to one captured in the desired lighting environment simply by selecting the desired preset information.
- FIG. 14 illustrates the configuration of the image processing apparatus according to the second embodiment.
- the image processing apparatus 10 includes an image acquisition unit 11, a subject information acquisition unit 12, a preset image selection unit 22, an illumination setting information acquisition unit 23, an illumination setting information adjustment unit 31, an image generation unit 41, and an image display unit 45. Note that the preset image selection unit 22 and the illumination setting information acquisition unit 23 correspond to an illumination setting information selection unit.
- the image acquisition unit 11 acquires an image capable of estimating the three-dimensional shape of the subject, the reflection characteristics of the subject, and the position of the light source.
- the image acquisition unit 11 acquires a captured image of the subject using a pair of cameras of the same specifications arranged horizontally at a known interval (a so-called stereo camera).
- so that the subject information acquisition unit 12 can use the bright spots generated at different positions on the spherical surface in the right viewpoint image and the left viewpoint image of the stereo camera when estimating the position of the light source, the image acquisition unit 11 acquires the captured image of the subject with a spherical mirror included in the imaging range of the stereo camera.
- the image acquisition unit 11 may be configured to read a captured image acquired by a stereo camera from a recording medium, or may acquire the above-described captured image data from an external device via a network or the like.
- the subject information acquisition unit 12 acquires captured subject information, for example, subject shape information and reflection characteristic information, from the captured image acquired by the image acquisition unit 11.
- the subject information acquisition unit 12 acquires subject shape information by, for example, calculating the parallax of each pixel of the subject captured in each stereo camera and obtaining the depth corresponding to each pixel position based on the parallax.
- the subject information acquisition unit 12 obtains light source information such as the position, intensity, and color of the light source using the bright spot of the spherical mirror included in the captured image obtained by the stereo camera, for example.
- the subject information acquisition unit 12 acquires reflection characteristic information indicating the reflection characteristics of the captured subject based on the subject shape information and the light source information.
- the preset image selection unit 22 selects a preset image according to a user operation.
- the preset image selection unit 22 stores in advance one or a plurality of images taken in different illumination environments as preset images.
- the preset image selection unit 22 outputs the preset image selected according to the user operation to the illumination setting information acquisition unit 23. For example, when a user operation for selecting a preset image displayed on the image display unit 45 is performed, the preset image selection unit 22 outputs the preset image selected by the user to the illumination setting information acquisition unit 23.
- the illumination setting information acquisition unit 23 extracts the light source position and intensity from the preset image selected by the preset image selection unit 22.
- for this extraction, for example, the technique disclosed in the document "Single image based illumination estimation for lighting virtual object in real scene" (Chen et al., 2011) can be used.
- the illumination setting information acquisition unit 23 extracts, for example, three feature amounts from the preset image, that is, a rough three-dimensional structure of the scene, shadow of the subject, and reflection characteristics of the subject.
- the rough three-dimensional structure of the scene is obtained by comparing the color distribution, edge, and texture information of the entire image, as feature quantities, with a prepared machine learning database of scene structures.
- the shadow and reflection characteristics of the subject can be obtained by separating the color information in the image into a shadow component and a reflection component and checking them against a machine learning database. Further, the position and intensity of the light source are obtained using the rough three-dimensional structure and the shadow and reflection characteristics of the subject.
- the illumination setting information acquisition unit 23 sets the center of gravity of the three-dimensional structure as the reference center Op, and determines the reference radius Rd from the distances from the reference center Op to the vertices of the three-dimensional structure.
- the illumination setting information acquisition unit 23 regards each light source as being located uniformly at the distance of the reference radius Rd from the reference center Op, and thereby obtains from the preset image the same kind of information as the preset information of the first embodiment as the illumination setting information.
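The placement step above can be sketched as follows. "Uniformly located" is interpreted here as "all at the same distance Rd along each estimated light direction"; the embodiment gives no further detail, so this helper is an assumption:

```python
import numpy as np

def place_lights_at_reference_radius(directions, reference_center,
                                     reference_radius):
    """Place each estimated light source at the distance of the reference
    radius Rd from the reference center Op, keeping only its estimated
    direction (assumed reading of 'uniformly located')."""
    placed = []
    for d in directions:
        d = np.asarray(d, float)
        d /= np.linalg.norm(d)   # keep direction, discard estimated distance
        placed.append(np.asarray(reference_center, float)
                      + reference_radius * d)
    return placed
```

The resulting positions, together with the estimated intensities, form illumination setting information with the same shape as the preset information of the first embodiment, so the same adjustment unit 31 can consume it unchanged.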
- the illumination setting information adjustment unit 31 adjusts the illumination setting information acquired by the illumination setting information acquisition unit 23 according to the imaged subject. Moreover, the illumination setting information adjustment unit 31 or the image generation unit 41 performs a drawing process based on the adjusted illumination setting information. As described above, the subject information acquisition unit 12 acquires subject shape information and reflection characteristic information, and the illumination setting information adjustment unit 31 adjusts the illumination setting information acquired from the preset image selected by the user according to the imaged subject. Therefore, the drawing is performed using the same method as conventional computer graphics, using the subject shape information and reflection characteristics acquired by the subject information acquisition unit 12 and the illumination setting information adjusted by the illumination setting information adjustment unit 31.
- the illumination setting information adjustment unit 31 calculates the color and luminance of the entire image using the image generated by the drawing process, and, when it determines that the luminance or color of the entire image is saturated, adjusts the intensity of the light sources so that the luminance and color are not saturated.
- the image generation unit 41 performs a drawing process.
- the image generation unit 41 performs a drawing process based on the captured image acquired by the image acquisition unit 11, the captured subject information acquired by the subject information acquisition unit 12, and the adjusted illumination setting information supplied from the illumination setting information adjustment unit 31. The image generation unit 41 thereby generates a captured image in an illumination environment in which the illumination environment at the time of generating the selected preset image has been adjusted according to the imaged subject, and outputs the captured image to the image display unit 45.
- the image display unit 45 displays the image generated by the image generation unit 41 on the screen.
- the image display unit 45 is associated with the preset image selection unit 22 and may display preset images that can be selected by the preset image selection unit 22.
- a user operation input means such as a touch panel is provided on the screen of the image display unit 45 to allow the user to select any preset image from the preset images displayed on the screen of the image display unit 45.
- the preset image selection unit 22 selects a preset image selected by the user.
- the preset image selection unit 22 may be provided with an image display function to display preset images that can be selected by the preset image selection unit 22.
- FIG. 15 is a flowchart showing the operation of the second embodiment.
- step ST11 the image processing apparatus 10 acquires an image.
- the image acquisition unit 11 of the image processing apparatus 10 acquires an image capable of estimating the three-dimensional shape of the subject, the reflection characteristics of the subject, and the position of the light source.
- the image acquisition unit 11 acquires, for example, a captured image obtained directly from a stereo camera or a captured image obtained by a stereo camera and recorded on a recording medium, and the process proceeds to step ST12.
- the image acquisition unit 11 may acquire these images via a network or the like.
- step ST12 the image processing apparatus 10 calculates the three-dimensional shape of the subject.
- the subject information acquisition unit 12 of the image processing apparatus 10 calculates, for example, the parallax of each pixel of the subject using the image acquired by the image acquisition unit 11, obtains the depth corresponding to each pixel position based on the parallax to calculate the three-dimensional shape of the imaged subject, and proceeds to step ST13.
- step ST13 the image processing apparatus 10 calculates the light source position and intensity with respect to the subject.
- the subject information acquisition unit 12 of the image processing apparatus 10 estimates the light source positions based on, for example, the bright spots of the spherical mirror included in the captured image, and calculates the color and intensity of each light source from the color of each bright spot on the spherical mirror.
- the image processing apparatus 10 calculates the light source position and intensity with respect to the subject and proceeds to step ST14.
- step ST14 the image processing apparatus 10 calculates the reflection characteristics of the subject.
- the subject information acquisition unit 12 of the image processing apparatus 10 calculates the reflection characteristics based on the three-dimensional shape of the subject calculated in step ST12 and the light source position and intensity calculated in step ST13, assuming a reflection model of the subject in advance, and then proceeds to step ST15.
- step ST15 the image processing apparatus 10 selects a preset image.
- the preset image selection unit 22 of the image processing apparatus 10 selects a preset image according to a user operation from preset images stored in advance, and proceeds to step ST16.
- step ST16 the image processing apparatus 10 acquires illumination setting information.
- the illumination setting information acquisition unit 23 of the image processing apparatus 10 acquires illumination setting information such as the light source position and intensity from the preset image selected in step ST15.
- the illumination setting information acquisition unit 23 acquires the rough three-dimensional structure of the scene, the shadow of the subject, and the reflection characteristics of the subject as, for example, three feature amounts from the preset image.
- the illumination setting information acquisition unit 23 calculates the position and intensity of the light source using the acquired three-dimensional structure and the shadow and reflection characteristics of the subject, and proceeds to step ST17.
- step ST17 the image processing apparatus 10 adjusts the illumination setting information.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 adjusts the illumination setting information acquired in step ST16, for example, the position and intensity of the light source according to the three-dimensional shape of the imaged subject, and proceeds to step ST18.
- step ST18 the image processing apparatus 10 performs a drawing process.
- the image generation unit 41 of the image processing apparatus 10 performs a drawing process based on the captured image acquired in step ST11, the three-dimensional shape of the subject calculated in step ST12, the reflection characteristics calculated in step ST14, and the illumination setting information adjusted in step ST17. The drawing process is performed as in conventional computer graphics.
- the image processing apparatus 10 performs drawing processing to generate an image in which the illumination environment is changed, and proceeds to step ST19.
- step ST19 the image processing apparatus 10 determines whether there is image saturation.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 calculates the color and brightness of the entire image using the image generated by the drawing process.
- the image processing apparatus 10 proceeds to step ST20 when it determines, based on the calculated color and luminance of the entire image, that the luminance or color is saturated, and ends the process when there is no luminance or color saturation.
- step ST20 the image processing apparatus 10 adjusts the light source.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 adjusts the light source so that the luminance and color of the image generated by the drawing process are not saturated. For example, when the luminance is saturated, the illumination setting information adjustment unit 31 performs scaling of the intensity of each light source so that the maximum luminance value in all pixels matches the maximum value of the image dynamic range. Similarly, for the color, the intensity of each light source is scaled. The image processing apparatus 10 performs such light source adjustment and returns to step ST18.
- as described above, the lighting environment is automatically set according to the preset image selected by the user and the imaged subject. Therefore, the image processing apparatus can easily set the lighting environment in relighting, and the user can obtain an image similar to one captured in the desired lighting environment simply by selecting the desired preset image.
- since the illumination setting information is generated from the preset image, it is not necessary to generate in advance preset information composed of information about the light sources as in the first embodiment. Accordingly, preset images are stored in advance, and the user can obtain an image similar to one obtained when the captured subject is imaged in an illumination environment equivalent to that of the selected preset image, simply by selecting a preset image of the desired lighting environment.
- FIG. 16 illustrates the configuration of the image processing apparatus according to the third embodiment.
- the image processing apparatus 10 includes an image acquisition unit 11, a subject information acquisition unit 12, a metadata selection unit 24, a correspondence information selection unit 25, an illumination setting information adjustment unit 31, an image generation unit 41, and an image display unit 45.
- the metadata selection unit 24 and the correspondence information selection unit 25 correspond to an illumination setting information selection unit.
- the image acquisition unit 11 acquires an image capable of estimating the three-dimensional shape of the subject, the reflection characteristics of the subject, and the position of the light source.
- the image acquisition unit 11 acquires a captured image of the subject using a pair of cameras of the same specifications arranged horizontally at a known interval (a so-called stereo camera).
- so that the subject information acquisition unit 12 can use the bright spots generated at different positions on the spherical surface in the right viewpoint image and the left viewpoint image of the stereo camera when estimating the position of the light source, the image acquisition unit 11 acquires the captured image of the subject with a spherical mirror included in the imaging range of the stereo camera.
- the image acquisition unit 11 may be configured to read a captured image acquired by a stereo camera from a recording medium, or may acquire the above-described captured image data from an external device via a network or the like.
- the subject information acquisition unit 12 acquires captured subject information, for example, subject shape information and reflection characteristic information, from the captured image acquired by the image acquisition unit 11.
- the subject information acquisition unit 12 acquires subject shape information by, for example, calculating the parallax of each pixel of the subject captured in each stereo camera and obtaining the depth corresponding to each pixel position based on the parallax.
- the subject information acquisition unit 12 obtains light source information such as the position, intensity, and color of the light source using the bright spot of the spherical mirror included in the captured image obtained by the stereo camera, for example.
- the subject information acquisition unit 12 acquires reflection characteristic information indicating the reflection characteristics of the captured subject based on the subject shape information and the light source information.
- the subject information acquisition unit 12 acquires imaging mode information from the captured image acquired by the image acquisition unit 11.
- the subject information acquisition unit 12 determines, using the captured subject information, the imaging mode to which the captured image corresponds, and acquires the imaging mode information.
- the subject information acquisition unit 12 determines, for example, the portrait mode when the subject is a person and the macro mode when the subject is a flower.
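The mode determination described above amounts to a mapping from the detected subject class to an imaging mode. A minimal sketch, in which the class names and the fallback mode are illustrative assumptions rather than values from the patent:

```python
# Hypothetical rule following the example in the text
# (person -> portrait, flower -> macro).
def imaging_mode(subject_class):
    modes = {"person": "portrait", "flower": "macro"}
    return modes.get(subject_class, "standard")  # fall back to a default mode

print(imaging_mode("person"))  # portrait
```

In practice the subject class itself would come from a detector or classifier applied to the captured subject information.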
- the metadata selection unit 24 selects metadata according to a user operation.
- The metadata selection unit 24 lets the user select metadata associated with the illumination setting information stored in advance in the correspondence information selection unit 25. For example, when a photographer's name is used as metadata as described above, the correspondence information selection unit 25 stores in advance illumination setting information for reproducing the illumination environment used by the photographer with the name indicated by the metadata.
- the metadata selection unit 24 outputs the metadata selected by the user to the correspondence information selection unit 25.
- The correspondence information selection unit 25 selects the illumination setting information corresponding to the metadata supplied from the metadata selection unit 24 and outputs it to the illumination setting information adjustment unit 31. In addition, when illumination setting information is provided for each item of imaging mode information, the correspondence information selection unit 25 outputs, from among the illumination setting information corresponding to the metadata, the illumination setting information corresponding to the imaging mode information acquired by the subject information acquisition unit 12 to the illumination setting information adjustment unit 31.
- When selecting the illumination setting information corresponding to the imaging mode information from among the illumination setting information corresponding to the metadata, the correspondence information selection unit 25 stores in advance illumination setting information in which the preset information is provided with an element indicating the metadata (for example, the photographer's name). The correspondence information selection unit 25 may then select, from the stored illumination setting information, the illumination setting information whose metadata and subject imaging mode both match.
- The illumination setting information may instead be stored in an external device. Moreover, the configuration may allow new illumination setting information to be added to the correspondence information selection unit 25.
- FIG. 17 is a diagram for explaining selection of lighting setting information.
- the database 251 provided in the correspondence information selection unit 25 stores illumination setting information for each imaging mode for each photographer. For example, for the photographer CMa, illumination setting information (indicated by circles) of the imaging modes MP1 to MP3 is stored. Similarly, the illumination setting information of the imaging modes MP1 to MP3 is stored for the photographer CMb, and the illumination setting information of the imaging modes MP1 to MP2 is stored for the photographer CMc.
- Assume that the imaging mode information acquired by the subject information acquisition unit 12 is "imaging mode MP1" and the metadata selected by the user via the metadata selection unit 24 is "photographer CMa".
- the correspondence information selection unit 25 selects the illumination setting information (indicated by a bold circle) of the imaging mode MP1 from the illumination setting information of the photographer CMa and outputs it to the illumination setting information adjustment unit 31.
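The selection illustrated in FIG. 17 can be sketched as a lookup keyed by photographer and imaging mode. All keys and setting values below are illustrative placeholders, not values from the patent:

```python
# Hypothetical database of illumination setting information,
# keyed by (photographer, imaging mode) as in Fig. 17.
database = {
    ("CMa", "MP1"): {"light_pos": (1.0, 2.0, 3.0), "intensity": 0.8},
    ("CMa", "MP2"): {"light_pos": (0.0, 2.5, 1.0), "intensity": 1.0},
    ("CMb", "MP1"): {"light_pos": (-1.0, 2.0, 2.0), "intensity": 0.6},
}

def select_illumination(photographer, mode):
    # Returns None when no entry matches the photographer and mode.
    return database.get((photographer, mode))

setting = select_illumination("CMa", "MP1")  # the bold-circle entry in Fig. 17
```

The selected entry is what would be passed on to the illumination setting information adjustment unit 31.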
- The illumination setting information adjustment unit 31 adjusts the illumination setting information supplied from the correspondence information selection unit 25 according to the captured subject. Further, the illumination setting information adjustment unit 31 or the image generation unit performs a drawing process based on the adjusted illumination setting information.
- the subject information acquisition unit 12 acquires subject shape information and reflection characteristic information.
- The illumination setting information adjustment unit 31 adjusts the illumination setting information related to the metadata selected by the user according to the captured subject. Drawing is then performed by the same method as conventional computer graphics, using the subject shape information and reflection characteristics acquired by the subject information acquisition unit 12 and the illumination setting information adjusted by the illumination setting information adjustment unit 31.
- The illumination setting information adjustment unit 31 calculates the color and luminance of the entire image using the image generated by the drawing process and, when it determines that the luminance or color of the entire image is saturated, adjusts the intensity of the light sources so that saturation does not occur.
- the image generation unit 41 performs a drawing process.
- The image generation unit 41 performs the drawing process based on the captured image acquired by the image acquisition unit 11, the captured subject information acquired by the subject information acquisition unit 12, and the adjusted illumination setting information supplied from the illumination setting information adjustment unit 31.
- Through this drawing process, the image generation unit 41 generates a captured image under an illumination environment in which the illumination environment at the time of generating the selected preset image has been adjusted to the captured subject, and outputs it to the image display unit 45.
- the image display unit 45 displays the image generated by the image generation unit on the screen.
- The image display unit 45 operates in cooperation with the metadata selection unit 24 and may display the metadata that can be selected by the metadata selection unit 24.
- A user operation input means such as a touch panel may be provided on the screen of the image display unit 45 so that the user can select any of the metadata displayed on the screen.
- the metadata selection unit 24 may be provided with an image display function to display metadata that can be selected by the metadata selection unit 24.
- FIG. 18 is a flowchart showing the operation of the third embodiment.
- step ST31 the image processing apparatus 10 acquires an image.
- the image acquisition unit 11 of the image processing apparatus 10 acquires an image capable of estimating the three-dimensional shape of the subject, the reflection characteristics of the subject, and the position of the light source.
- The image acquisition unit 11 acquires, for example, a captured image acquired by a stereo camera, or such a captured image recorded on a recording medium, and the process proceeds to step ST32.
- the image acquisition unit 11 may acquire these images via a network or the like.
- step ST32 the image processing apparatus 10 calculates the three-dimensional shape of the subject.
- The subject information acquisition unit 12 of the image processing apparatus 10 calculates, for example, the parallax of each pixel of the subject using the images acquired by the image acquisition unit 11, calculates the three-dimensional shape of the captured subject by obtaining the depth corresponding to each pixel position based on the parallax, and proceeds to step ST33.
- step ST33 the image processing apparatus 10 calculates the light source position and intensity with respect to the subject.
- The subject information acquisition unit 12 of the image processing apparatus 10 estimates the light source position based on, for example, the bright spot of the spherical mirror included in the captured image, and calculates the color and intensity of the light source from the color of each bright spot on the spherical mirror.
- the image processing apparatus 10 calculates the light source position and intensity with respect to the subject and proceeds to step ST34.
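The patent does not give formulas for the bright-spot-based estimation, but one common way to realize it is the standard mirror-sphere light-probe computation: reflect the camera's viewing ray about the sphere's surface normal at the bright spot to obtain the direction toward the light. The sketch below assumes that computation and an orthographic viewing ray along −z:

```python
import numpy as np

def light_direction(spot, center, view_dir=(0.0, 0.0, -1.0)):
    """spot: 3-D position of the bright spot on the mirror sphere.
    center: sphere centre. view_dir: unit ray from camera toward the scene.
    Returns the unit vector from the spot toward the light source."""
    n = np.asarray(spot, float) - np.asarray(center, float)
    n /= np.linalg.norm(n)                 # surface normal at the bright spot
    v = np.asarray(view_dir, float)
    v /= np.linalg.norm(v)
    r = v - 2.0 * np.dot(v, n) * n         # reflect the camera ray about the normal
    return r / np.linalg.norm(r)

# A spot at the centre of the visible sphere face implies a light near the camera.
print(light_direction((0.0, 0.0, 1.0), (0.0, 0.0, 0.0)))
```

With a stereo camera, the two viewpoints yield two such direction rays from slightly different spot positions, and the light's 3-D position can be triangulated from their intersection.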
- step ST34 the image processing apparatus 10 calculates the reflection characteristics of the subject.
- The subject information acquisition unit 12 of the image processing apparatus 10 calculates the reflection characteristics based on the three-dimensional shape of the subject calculated in step ST32 and the light source position and intensity calculated in step ST33, assuming a reflection model for the object in advance, and proceeds to step ST35.
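If the assumed reflection model is Lambertian (one possible choice; the patent leaves the model open), the reflection characteristic reduces to a per-pixel albedo recoverable from the shape and light-source estimates:

```python
import numpy as np

def lambertian_albedo(observed, normal, light_dir, light_intensity=1.0):
    """Invert the Lambertian model: observed = albedo * intensity * max(n.l, 0)."""
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    l = np.asarray(light_dir, float)
    l /= np.linalg.norm(l)
    shading = max(float(np.dot(n, l)), 0.0)
    if shading == 0.0:
        return None  # surface not lit by this source; albedo unrecoverable here
    return observed / (light_intensity * shading)
```

For glossier subjects the same inversion idea applies with a richer model (e.g. adding a specular term), at the cost of fitting more parameters per pixel.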
- step ST35 the image processing apparatus 10 selects metadata.
- the metadata selection unit 24 of the image processing apparatus 10 selects metadata from the metadata stored in advance according to the user operation, and proceeds to step ST36.
- step ST36 the image processing apparatus 10 selects illumination setting information.
- The correspondence information selection unit 25 of the image processing apparatus 10 selects the illumination setting information associated with the metadata selected in step ST35 as the illumination setting information used for the relighting process, and proceeds to step ST37.
- At this time, the imaging mode information acquired by the subject information acquisition unit 12 may additionally be used to select the illumination setting information that matches the imaging mode indicated by that information.
- step ST37 the image processing apparatus 10 adjusts the illumination setting information.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 adjusts the illumination setting information selected in step ST36 according to the three-dimensional shape of the imaged subject, and proceeds to step ST38.
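The adjustment to the subject's three-dimensional shape can be sketched as mapping the preset's light positions from the reference-subject coordinate system onto the captured subject: scale by the relative subject size and re-center on the subject. The function interface below is an illustrative assumption, not the patent's API:

```python
import numpy as np

def adjust_lights(light_positions, ref_center, ref_size, subj_center, subj_size):
    """Map light positions defined around a reference subject onto the
    captured subject by uniform scaling and translation."""
    scale = subj_size / ref_size
    lights = np.asarray(light_positions, float)
    ref_center = np.asarray(ref_center, float)
    subj_center = np.asarray(subj_center, float)
    # Express lights relative to the reference subject, rescale, re-center.
    return (lights - ref_center) * scale + subj_center

adjusted = adjust_lights([[2.0, 0.0, 0.0]], [0, 0, 0], 1.0, [1, 1, 1], 2.0)
```

This keeps each light's position relative to the subject (e.g. "two subject-widths to the left") invariant, which is the intent of adjusting a preset illumination environment to a differently sized or placed subject.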
- step ST38 the image processing apparatus 10 performs a drawing process.
- The image generation unit 41 of the image processing apparatus 10 performs the drawing process in the same manner as in the conventional case, based on the captured image acquired in step ST31, the three-dimensional shape of the subject calculated in step ST32, the reflection characteristics calculated in step ST34, and the illumination setting information adjusted in step ST37.
- the image processing apparatus 10 performs drawing processing to generate an image in which the illumination environment is changed, and proceeds to step ST39.
- step ST39 the image processing apparatus 10 determines whether there is image saturation.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 calculates the color and brightness of the entire image using the image generated by the drawing process.
- The image processing apparatus 10 proceeds to step ST40 when it determines, based on the calculated color and luminance of the entire image, that luminance or color saturation has occurred, and ends the process when there is no saturation.
- step ST40 the image processing apparatus 10 adjusts the light source.
- the illumination setting information adjustment unit 31 of the image processing apparatus 10 adjusts the light source so that the luminance and color of the image generated by the drawing process are not saturated. For example, when the luminance is saturated, the illumination setting information adjustment unit 31 performs scaling of the intensity of each light source so that the maximum luminance value in all pixels matches the maximum value of the image dynamic range. Similarly, for the color, the intensity of each light source is scaled. The image processing apparatus 10 performs such light source adjustment and returns to step ST38.
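The scaling described in step ST40 can be sketched directly: if the rendered image's peak luminance exceeds the dynamic range, every light-source intensity is multiplied by one common factor so the peak lands at the top of the range. The 8-bit maximum of 255 is an illustrative assumption for the image dynamic range:

```python
import numpy as np

def rescale_intensities(rendered_luminance, intensities, max_value=255.0):
    """Scale all light intensities so the maximum rendered luminance
    matches the top of the dynamic range when saturation occurs."""
    peak = float(np.max(rendered_luminance))
    if peak <= max_value:
        return list(intensities)       # no saturation; leave the lights alone
    scale = max_value / peak           # one factor applied to every light
    return [i * scale for i in intensities]

print(rescale_intensities([510.0], [2.0, 4.0]))  # halves both intensities
```

Using a single factor preserves the relative balance between lights, so the adjusted result keeps the character of the selected illumination environment; re-rendering with the scaled lights (the loop back to step ST38) then confirms that saturation is gone.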
- As described above, the lighting environment is automatically set according to the metadata selected by the user and the captured subject. Therefore, the image processing apparatus can easily set the lighting environment for relighting, and the user can obtain an image similar to one captured in the desired lighting environment simply by selecting the desired metadata.
- Metadata and lighting setting information are stored in association with each other. Therefore, for example, by merely selecting metadata indicating the name of a desired photographer, it is possible to obtain an image similar to one obtained if the captured subject had been imaged in the illumination environment used by that photographer.
- In the above description, the photographer's name is used as the metadata.
- Alternatively, a word indicating the impression of the generated image may be used as the metadata, with the lighting setting information corresponding to that word stored in advance.
- In the above-described embodiments, the illumination setting information is selected according to a user selection operation on the preset information, the preset image, or the metadata.
- However, the metadata may also be superimposed on the image of the preset information and displayed.
- imaging mode information and a word indicating the impression of the generated image are superimposed and displayed on the preset information image.
- Metadata based on the light source position and intensity extracted from a preset image may also be displayed superimposed on the preset image.
- For example, when the light source position and intensity are similar to those of the portrait imaging mode, the imaging mode information (portrait) is superimposed on the preset image.
- Further, the imaging mode of the subject may be determined so that the preset information and preset images whose imaging mode information matches the determined imaging mode are preferentially presented to the user.
- the user can easily select desired preset information and preset images.
- The selection and adjustment of the lighting setting information used in the relighting process can also be applied to editing a general computer graphics model, so such a model can be edited easily without specialized knowledge or tools.
- the method of acquiring the captured subject information from the captured image and the method of acquiring the illumination setting information from the preset image are not limited to the methods described above, and other methods may be used.
- the series of processes described in the specification can be executed by hardware, software, or a combined configuration of both.
- a program in which a processing sequence is recorded is installed and executed in a memory in a computer incorporated in dedicated hardware.
- the program can be installed and executed on a general-purpose computer capable of executing various processes.
- the program can be recorded in advance on a hard disk, SSD (Solid State Drive), or ROM (Read Only Memory) as a recording medium.
- the program is a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto optical disc), a DVD (Digital Versatile Disc), a BD (Blu-Ray Disc (registered trademark)), a magnetic disk, or a semiconductor memory card. It can be stored (recorded) in a removable recording medium such as temporarily or permanently. Such a removable recording medium can be provided as so-called package software.
- the program may be transferred from the download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
- the image processing apparatus of the present technology can have the following configuration.
- (1) An image processing apparatus including:
a subject information acquisition unit that acquires, from the captured image, captured subject information indicating attributes related to illumination of the subject;
an illumination setting information selection unit that selects illumination setting information according to a user operation; and
an illumination setting information adjustment unit that adjusts the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the captured subject information acquired by the subject information acquisition unit.
- (2) The image processing apparatus according to (1), wherein the subject information acquisition unit acquires a three-dimensional shape and reflection characteristics of the subject from the captured image as the captured subject information.
- (3) The image processing apparatus according to (1) or (2), wherein the illumination setting information is generated using a preset coordinate system, and the illumination setting information is adjusted to illumination setting information corresponding to the subject by associating the coordinate system with the subject.
- (4) The image processing apparatus according to (3), wherein the illumination setting information includes light source information including information on the arrangement of light sources and the illumination light, and reference subject information including information on the size and arrangement of a reference subject illuminated by the light sources indicated in the light source information.
- (5) The image processing apparatus according to (4), wherein the illumination setting information adjustment unit associates the coordinate system with the subject so that the size and arrangement of the reference subject match the subject of the captured image.
- (6) The image processing apparatus according to any one of (1) to (5), wherein the illumination setting information adjustment unit adjusts the illumination setting information so that a subject image obtained by performing a drawing process using the captured subject information and the adjusted illumination setting information does not become saturated.
- (7) The image processing apparatus according to any one of (1) to (6), wherein the illumination setting information selection unit displays a setting selection image associated with the illumination setting information and uses the illumination setting information corresponding to the setting selection image selected by the user as the selected illumination setting information.
- (8) The image processing apparatus according to (7), wherein the illumination setting information selection unit displays preset information based on the illumination setting information as the setting selection image and selects the illumination setting information corresponding to the preset information selected by the user.
- (9) The image processing apparatus according to (7), wherein the illumination setting information selection unit displays a preset image as the setting selection image and acquires illumination setting information from the preset image selected by the user.
- (10) The image processing apparatus according to (7), wherein the illumination setting information selection unit displays metadata associated with illumination setting information as the setting selection image and selects the illumination setting information corresponding to the metadata selected by the user.
- (11) The image processing apparatus according to any one of (1) to (10), wherein the subject information acquisition unit acquires imaging mode information about the subject, and the illumination setting information selection unit selects illumination setting information having imaging mode information that matches the imaging mode information acquired by the subject information acquisition unit.
- (12) The image processing apparatus according to any one of (1) to (11), further including an image generation unit that performs a drawing process based on the captured subject information acquired by the subject information acquisition unit and the illumination setting information adjusted by the illumination setting information adjustment unit.
- (13) The image processing apparatus according to any one of (1) to (12), further including an image acquisition unit that acquires images obtained by capturing the subject from different viewpoint positions as the captured image.
- In this technology, subject information indicating attributes related to illumination of the subject is acquired from the captured image by the subject information acquisition unit.
- The illumination setting information selection unit selects illumination setting information according to a user operation.
- The illumination setting information adjustment unit converts the selected illumination setting information into illumination setting information corresponding to the subject based on the subject information. Consequently, when the user selects illumination setting information, the selected information is adjusted to match the subject, an image corresponding to one captured in the desired illumination environment can be generated, and the illumination environment for relighting can be set easily. The technology is therefore suitable for electronic devices having an imaging function, electronic devices that edit captured images, and the like.
Abstract
Description
The image processing apparatus includes:
a subject information acquisition unit that acquires, from a captured image, captured subject information indicating attributes related to illumination of the subject;
an illumination setting information selection unit that selects illumination setting information according to a user operation; and
an illumination setting information adjustment unit that adjusts the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the captured subject information acquired by the subject information acquisition unit.
The image processing method includes:
a step of acquiring, in a subject information acquisition unit, captured subject information indicating attributes related to illumination of the subject from a captured image;
a step of selecting, in an illumination setting information selection unit, illumination setting information according to a user operation; and
a step of adjusting, in an illumination setting information adjustment unit, the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the captured subject information acquired by the subject information acquisition unit.
1. First Embodiment
2. Second Embodiment
3. Third Embodiment
4. Other Embodiments
In the first embodiment, a case is described in which, in selecting illumination setting information, preset information based on the illumination setting information is displayed as a setting selection image, and the illumination setting information corresponding to the preset information selected by the user is used as the selected illumination setting information.
FIG. 1 illustrates the configuration of the image processing apparatus of the first embodiment. The image processing apparatus 10 includes an image acquisition unit 11, a subject information acquisition unit 12, a preset information selection unit 21, an illumination setting information adjustment unit 31, an image generation unit 41, and an image display unit 45. The preset information selection unit 21 corresponds to the illumination setting information selection unit.
Next, the second embodiment is described. In the second embodiment, a case is described in which, in selecting illumination setting information, a preset image is displayed as the setting selection image and the illumination setting information is acquired from the preset image selected by the user.
Next, the third embodiment is described. In the third embodiment, a case is described in which, in selecting illumination setting information, metadata associated with the illumination setting information is displayed as the setting selection image and the illumination setting information corresponding to the metadata selected by the user is selected. In the third embodiment, when a photographer's name, for example, is used as metadata, illumination setting information for reproducing the illumination environment used by the photographer with the name indicated by the metadata is stored in association with the metadata.
In the above embodiments, a case was described in which illumination setting information is selected according to a user selection operation on preset information, a preset image, or metadata; however, metadata may also be displayed superimposed on the image of the preset information. For example, imaging mode information or a word indicating the impression of the generated image is superimposed on the image of the preset information. Metadata based on the light source position and intensity extracted from a preset image may also be displayed superimposed on the preset image. For example, when the light source position and intensity are similar to those of the portrait imaging mode, the imaging mode information (portrait) is superimposed on the preset image. Displaying preset information, preset images, or metadata in combination in this way allows the user to easily select a preferred illumination environment.
(1) An image processing apparatus including: a subject information acquisition unit that acquires, from a captured image, captured subject information indicating attributes related to illumination of the subject; an illumination setting information selection unit that selects illumination setting information according to a user operation; and an illumination setting information adjustment unit that adjusts the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the captured subject information acquired by the subject information acquisition unit.
(2) The image processing apparatus according to (1), wherein the subject information acquisition unit acquires a three-dimensional shape and reflection characteristics of the subject from the captured image as the captured subject information.
(3) The image processing apparatus according to (1) or (2), wherein the illumination setting information is generated using a preset coordinate system, and the illumination setting information is adjusted to illumination setting information corresponding to the subject by associating the coordinate system with the subject.
(4) The image processing apparatus according to (3), wherein the illumination setting information includes light source information including information on the arrangement of light sources and the illumination light, and reference subject information including information on the size and arrangement of a reference subject illuminated by the light sources indicated in the light source information.
(5) The image processing apparatus according to (4), wherein the illumination setting information adjustment unit associates the coordinate system with the subject so that the size and arrangement of the reference subject match the subject of the captured image.
(6) The image processing apparatus according to any one of (1) to (5), wherein the illumination setting information adjustment unit adjusts the illumination setting information so that a subject image obtained by performing a drawing process using the captured subject information and the adjusted illumination setting information does not become saturated.
(7) The image processing apparatus according to any one of (1) to (6), wherein the illumination setting information selection unit displays a setting selection image associated with the illumination setting information and uses the illumination setting information corresponding to the setting selection image selected by the user as the selected illumination setting information.
(8) The image processing apparatus according to (7), wherein the illumination setting information selection unit displays preset information based on the illumination setting information as the setting selection image and selects the illumination setting information corresponding to the preset information selected by the user.
(9) The image processing apparatus according to (7), wherein the illumination setting information selection unit displays a preset image as the setting selection image and acquires illumination setting information from the preset image selected by the user.
(10) The image processing apparatus according to (7), wherein the illumination setting information selection unit displays metadata associated with illumination setting information as the setting selection image and selects the illumination setting information corresponding to the metadata selected by the user.
(11) The image processing apparatus according to any one of (1) to (10), wherein the subject information acquisition unit acquires imaging mode information about the subject, and the illumination setting information selection unit selects illumination setting information having imaging mode information that matches the imaging mode information acquired by the subject information acquisition unit.
(12) The image processing apparatus according to any one of (1) to (11), further including an image generation unit that performs a drawing process based on the captured subject information acquired by the subject information acquisition unit and the illumination setting information adjusted by the illumination setting information adjustment unit.
(13) The image processing apparatus according to any one of (1) to (12), further including an image acquisition unit that acquires images obtained by capturing the subject from different viewpoint positions as the captured image.
11 ... Image acquisition unit
12 ... Subject information acquisition unit
21 ... Preset information selection unit
22 ... Preset image selection unit
23 ... Illumination setting information acquisition unit
24 ... Metadata selection unit
25 ... Correspondence information selection unit
31 ... Illumination setting information adjustment unit
41 ... Image generation unit
45 ... Image display unit
Claims (14)
- An image processing apparatus including:
a subject information acquisition unit that acquires, from a captured image, captured subject information indicating attributes related to illumination of the subject;
an illumination setting information selection unit that selects illumination setting information according to a user operation; and
an illumination setting information adjustment unit that adjusts the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the captured subject information acquired by the subject information acquisition unit.
- The image processing apparatus according to claim 1, wherein the subject information acquisition unit acquires a three-dimensional shape and reflection characteristics of the subject from the captured image as the captured subject information.
- The image processing apparatus according to claim 1, wherein the illumination setting information is generated using a preset coordinate system, and the illumination setting information is adjusted to illumination setting information corresponding to the subject by associating the coordinate system with the subject.
- The image processing apparatus according to claim 3, wherein the illumination setting information includes light source information including information on the arrangement of light sources and the illumination light, and reference subject information including information on the size and arrangement of a reference subject illuminated by the light sources indicated in the light source information.
- The image processing apparatus according to claim 4, wherein the illumination setting information adjustment unit associates the coordinate system with the subject so that the size and arrangement of the reference subject match the subject of the captured image.
- The image processing apparatus according to claim 1, wherein the illumination setting information adjustment unit adjusts the illumination setting information so that a subject image obtained by performing a drawing process using the captured subject information and the adjusted illumination setting information does not become saturated.
- The image processing apparatus according to claim 1, wherein the illumination setting information selection unit displays a setting selection image associated with the illumination setting information and uses the illumination setting information corresponding to the setting selection image selected by the user as the selected illumination setting information.
- The image processing apparatus according to claim 7, wherein the illumination setting information selection unit displays preset information based on the illumination setting information as the setting selection image and selects the illumination setting information corresponding to the preset information selected by the user.
- The image processing apparatus according to claim 7, wherein the illumination setting information selection unit displays a preset image as the setting selection image and acquires illumination setting information from the preset image selected by the user.
- The image processing apparatus according to claim 7, wherein the illumination setting information selection unit displays metadata associated with illumination setting information as the setting selection image and selects the illumination setting information corresponding to the metadata selected by the user.
- The image processing apparatus according to claim 1, wherein the subject information acquisition unit acquires imaging mode information about the subject, and the illumination setting information selection unit selects illumination setting information having imaging mode information that matches the imaging mode information acquired by the subject information acquisition unit.
- The image processing apparatus according to claim 1, further including an image generation unit that performs a drawing process based on the captured subject information acquired by the subject information acquisition unit and the illumination setting information adjusted by the illumination setting information adjustment unit.
- The image processing apparatus according to claim 1, further including an image acquisition unit that acquires images obtained by capturing the subject from different viewpoint positions as the captured image.
- An image processing method including:
a step of acquiring, in a subject information acquisition unit, captured subject information indicating attributes related to illumination of the subject from a captured image;
a step of selecting, in an illumination setting information selection unit, illumination setting information according to a user operation; and
a step of adjusting, in an illumination setting information adjustment unit, the illumination setting information selected by the illumination setting information selection unit to illumination setting information corresponding to the subject based on the captured subject information acquired by the subject information acquisition unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/127,226 US10229483B2 (en) | 2014-04-30 | 2015-02-13 | Image processing apparatus and image processing method for setting an illumination environment |
EP15785695.6A EP3139589B1 (en) | 2014-04-30 | 2015-02-13 | Image processing apparatus and image processing method |
JP2016515872A JP6493395B2 (ja) | 2014-04-30 | 2015-02-13 | 画像処理装置と画像処理方法 |
CN201580021217.3A CN106256122B (zh) | 2014-04-30 | 2015-02-13 | 图像处理设备和图像处理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-093518 | 2014-04-30 | ||
JP2014093518 | 2014-04-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015166684A1 true WO2015166684A1 (ja) | 2015-11-05 |
Family
ID=54358425
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/053947 WO2015166684A1 (ja) | 2014-04-30 | 2015-02-13 | 画像処理装置と画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10229483B2 (ja) |
EP (1) | EP3139589B1 (ja) |
JP (1) | JP6493395B2 (ja) |
CN (1) | CN106256122B (ja) |
WO (1) | WO2015166684A1 (ja) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105447829A (zh) * | 2015-11-25 | 2016-03-30 | 小米科技有限责任公司 | 图像处理方法及装置 |
JP2017097729A (ja) * | 2015-11-26 | 2017-06-01 | 矢崎総業株式会社 | レンダリング計算方法および表示装置 |
JP2019083008A (ja) * | 2017-10-31 | 2019-05-30 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法、及びプログラム |
JP2019179463A (ja) * | 2018-03-30 | 2019-10-17 | キヤノン株式会社 | 画像処理装置、その制御方法、プログラム、記録媒体 |
JP2020524430A (ja) * | 2017-06-04 | 2020-08-13 | アップル インコーポレイテッドApple Inc. | ユーザインタフェースカメラ効果 |
JP2021018643A (ja) * | 2019-07-22 | 2021-02-15 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11330184B2 (en) | 2020-06-01 | 2022-05-10 | Apple Inc. | User interfaces for managing media |
US11350026B1 (en) | 2021-04-30 | 2022-05-31 | Apple Inc. | User interfaces for altering visual media |
US11468625B2 (en) | 2018-09-11 | 2022-10-11 | Apple Inc. | User interfaces for simulated depth effects |
US11641517B2 (en) | 2016-06-12 | 2023-05-02 | Apple Inc. | User interface for camera effects |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
JP7388771B1 (ja) | 2022-11-18 | 2023-11-29 | Lineヤフー株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
JP7421607B2 (ja) | 2020-11-09 | 2024-01-24 | キヤノン株式会社 | 画像処理装置およびその方法 |
US11895391B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Capturing and displaying images with multiple focal planes |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3057067B1 (en) * | 2015-02-16 | 2017-08-23 | Thomson Licensing | Device and method for estimating a glossy part of radiation |
JP6789877B2 (ja) * | 2017-04-28 | 2020-11-25 | キヤノン株式会社 | 情報処理装置、画像処理システム、制御システム及び情報処理方法 |
US11212459B2 (en) * | 2017-05-16 | 2021-12-28 | Technion Research & Development Foundation Limited | Computational imaging of the electric grid |
US11182638B2 (en) * | 2017-06-29 | 2021-11-23 | Sony Interactive Entertainment Inc. | Information processing device and material specifying method |
US20210134049A1 (en) * | 2017-08-08 | 2021-05-06 | Sony Corporation | Image processing apparatus and method |
US10520424B2 (en) * | 2018-04-03 | 2019-12-31 | Hiwin Technologies Corp. | Adaptive method for a light source for inspecting an article |
CN108986128A (zh) * | 2018-04-19 | 2018-12-11 | 芜湖圣美孚科技有限公司 | 一种用于机器视觉的直射式照明系统 |
JP6934565B2 (ja) | 2018-05-08 | 2021-09-15 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および被写体情報取得方法 |
WO2019215820A1 (ja) * | 2018-05-08 | 2019-11-14 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および被写体情報取得方法 |
CN111489448A (zh) * | 2019-01-24 | 2020-08-04 | 宏达国际电子股份有限公司 | 检测真实世界光源的方法、混合实境系统及记录介质 |
EP3934231A4 (en) * | 2019-03-18 | 2022-04-06 | Sony Group Corporation | IMAGING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004021388A (ja) * | 2002-06-13 | 2004-01-22 | Nippon Hoso Kyokai <Nhk> | 画像処理装置及びそれを備えた撮影システム |
JP2007066012A (ja) * | 2005-08-31 | 2007-03-15 | Toshiba Corp | 映像描画装置、方法およびプログラム |
JP2009223906A (ja) * | 1996-10-29 | 2009-10-01 | Intel Corp | コンピュータグラフィックス/画像生成装置の照明および陰影シミュレーション |
JP2013009082A (ja) * | 2011-06-23 | 2013-01-10 | Nippon Telegr & Teleph Corp <Ntt> | 映像合成装置、映像合成方法、および映像合成プログラム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI303791B (en) | 2002-03-21 | 2008-12-01 | Microsoft Corp | Graphics image rendering with radiance self-transfer for low-frequency lighting environments |
US20110043881A1 (en) * | 2008-05-07 | 2011-02-24 | Koninklijke Philips Electronics N.V. | Device and process for controlled conveying of different visual impressions of a room while retaining identical room illumination |
CN101425179B (zh) * | 2008-11-18 | 2012-03-28 | 清华大学 | 一种人脸图像重光照的方法及装置 |
JP5106459B2 (ja) * | 2009-03-26 | 2012-12-26 | 株式会社東芝 | 立体物判定装置、立体物判定方法及び立体物判定プログラム |
JP2011209019A (ja) | 2010-03-29 | 2011-10-20 | Sony Corp | ロボット装置及びロボット装置の制御方法 |
CN101872491B (zh) * | 2010-05-21 | 2011-12-28 | 清华大学 | 基于光度立体的自由视角重光照方法和系统 |
CN101916455B (zh) * | 2010-07-01 | 2012-06-27 | 清华大学 | 一种高动态范围纹理三维模型的重构方法及装置 |
WO2012056578A1 (ja) * | 2010-10-29 | 2012-05-03 | オリンパス株式会社 | 画像解析方法および画像解析装置 |
WO2013191689A1 (en) * | 2012-06-20 | 2013-12-27 | Image Masters, Inc. | Presenting realistic designs of spaces and objects |
-
2015
- 2015-02-13 CN CN201580021217.3A patent/CN106256122B/zh not_active Expired - Fee Related
- 2015-02-13 WO PCT/JP2015/053947 patent/WO2015166684A1/ja active Application Filing
- 2015-02-13 EP EP15785695.6A patent/EP3139589B1/en active Active
- 2015-02-13 US US15/127,226 patent/US10229483B2/en not_active Expired - Fee Related
- 2015-02-13 JP JP2016515872A patent/JP6493395B2/ja not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009223906A (ja) * | 1996-10-29 | 2009-10-01 | Intel Corp | コンピュータグラフィックス/画像生成装置の照明および陰影シミュレーション |
JP2004021388A (ja) * | 2002-06-13 | 2004-01-22 | Nippon Hoso Kyokai <Nhk> | 画像処理装置及びそれを備えた撮影システム |
JP2007066012A (ja) * | 2005-08-31 | 2007-03-15 | Toshiba Corp | 映像描画装置、方法およびプログラム |
JP2013009082A (ja) * | 2011-06-23 | 2013-01-10 | Nippon Telegr & Teleph Corp <Ntt> | 映像合成装置、映像合成方法、および映像合成プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3139589A4 * |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105447829B (zh) * | 2015-11-25 | 2018-06-08 | 小米科技有限责任公司 | 图像处理方法及装置 |
CN105447829A (zh) * | 2015-11-25 | 2016-03-30 | 小米科技有限责任公司 | 图像处理方法及装置 |
JP2017097729A (ja) * | 2015-11-26 | 2017-06-01 | 矢崎総業株式会社 | レンダリング計算方法および表示装置 |
US11962889B2 (en) | 2016-06-12 | 2024-04-16 | Apple Inc. | User interface for camera effects |
US11641517B2 (en) | 2016-06-12 | 2023-05-02 | Apple Inc. | User interface for camera effects |
JP2022084635A (ja) * | 2017-06-04 | 2022-06-07 | Apple Inc. | User interface camera effects |
JP2020524430A (ja) * | 2017-06-04 | 2020-08-13 | Apple Inc. | User interface camera effects |
US11687224B2 (en) | 2017-06-04 | 2023-06-27 | Apple Inc. | User interface camera effects |
JP7033152B2 (ja) | 2017-06-04 | 2022-03-09 | Apple Inc. | User interface camera effects |
JP7247390B2 (ja) | 2017-06-04 | 2023-03-28 | Apple Inc. | User interface camera effects |
JP2019083008A (ja) * | 2017-10-31 | 2019-05-30 | Canon Inc. | Image processing apparatus, imaging apparatus, image processing method, and program |
JP7169841B2 (ja) | 2017-10-31 | 2022-11-11 | Canon Inc. | Image processing apparatus, imaging apparatus, image processing method, and program |
JP7059076B2 (ja) | 2018-03-30 | 2022-04-25 | Canon Inc. | Image processing apparatus, control method therefor, program, and recording medium |
JP2019179463A (ja) * | 2018-03-30 | 2019-10-17 | Canon Inc. | Image processing apparatus, control method therefor, program, and recording medium |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11468625B2 (en) | 2018-09-11 | 2022-10-11 | Apple Inc. | User interfaces for simulated depth effects |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11895391B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11669985B2 (en) | 2018-09-28 | 2023-06-06 | Apple Inc. | Displaying and editing images with depth information |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
JP2021018643A (ja) * | 2019-07-22 | 2021-02-15 | Canon Inc. | Information processing apparatus, information processing method, and program |
JP7378997B2 (ja) | 2019-07-22 | 2023-11-14 | Canon Inc. | Information processing apparatus, information processing method, and program |
US11330184B2 (en) | 2020-06-01 | 2022-05-10 | Apple Inc. | User interfaces for managing media |
US11617022B2 (en) | 2020-06-01 | 2023-03-28 | Apple Inc. | User interfaces for managing media |
JP7421607B2 (ja) | 2020-11-09 | 2024-01-24 | Canon Inc. | Image processing apparatus and method |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US11418699B1 (en) | 2021-04-30 | 2022-08-16 | Apple Inc. | User interfaces for altering visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11350026B1 (en) | 2021-04-30 | 2022-05-31 | Apple Inc. | User interfaces for altering visual media |
US11416134B1 (en) | 2021-04-30 | 2022-08-16 | Apple Inc. | User interfaces for altering visual media |
JP7388771B1 (ja) | 2022-11-18 | 2023-11-29 | LY Corp | Information processing apparatus, information processing method, and information processing program |
Also Published As
Publication number | Publication date |
---|---|
US10229483B2 (en) | 2019-03-12 |
JPWO2015166684A1 (ja) | 2017-04-20 |
EP3139589A4 (en) | 2017-11-22 |
CN106256122B (zh) | 2019-09-24 |
JP6493395B2 (ja) | 2019-04-03 |
EP3139589A1 (en) | 2017-03-08 |
EP3139589B1 (en) | 2022-04-06 |
CN106256122A (zh) | 2016-12-21 |
US20170124689A1 (en) | 2017-05-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6493395B2 (ja) | Image processing apparatus and image processing method | |
US11694392B2 (en) | Environment synthesis for lighting an object | |
JP4945642B2 (ja) | Method and system for color correction of 3D images | |
JP6864449B2 (ja) | Method and apparatus for adjusting image brightness | |
US10403045B2 (en) | Photorealistic augmented reality system | |
JP6073858B2 (ja) | 顔の位置検出 | |
US10169891B2 (en) | Producing three-dimensional representation based on images of a person | |
JP2013235537A (ja) | Image creation apparatus, image creation program, and recording medium | |
JP2013127774A (ja) | Image processing apparatus, image processing method, and program | |
JP6869652B2 (ja) | Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium | |
WO2020075252A1 (ja) | Information processing apparatus, program, and information processing method | |
JP2009211513A (ja) | Image processing apparatus and method | |
JP2003216973A (ja) | Three-dimensional image processing method, three-dimensional image processing program, three-dimensional image processing apparatus, and three-dimensional image processing system | |
US20180213196A1 (en) | Method of projection mapping | |
JP2008204318A (ja) | Image processing apparatus, image processing method, and image processing program | |
JP5926626B2 (ja) | Image processing apparatus, control method therefor, and program | |
JP2007272847A (ja) | Lighting simulation method and image compositing method | |
JP5506371B2 (ja) | Image processing apparatus, image processing method, and program | |
JP7476511B2 (ja) | Image processing system, image processing method, and program | |
US9734579B1 (en) | Three-dimensional models visual differential | |
JP2006031595A (ja) | Image processing apparatus | |
TWI603287B (zh) | Image compositing method and apparatus for virtual objects | |
JP2015045958A (ja) | Display processing apparatus, display processing method, and program | |
Plopski et al. | Reflectance and light source estimation for indoor AR Applications | |
Lee et al. | Interactive retexturing from unordered images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15785695; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2016515872; Country of ref document: JP; Kind code of ref document: A |
REEP | Request for entry into the european phase | Ref document number: 2015785695; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2015785695; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 15127226; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |