WO2010079557A1 - Imaging device orientation detection device and moving body including the device - Google Patents
Imaging device orientation detection device and moving body including the device
- Publication number
- WO2010079557A1 (PCT/JP2009/007034)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- polarization
- image
- orientation
- imaging device
- sky
- Prior art date
Classifications
- H04N5/2628 — Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
- H04N23/10 — Cameras or camera modules comprising electronic image sensors; control thereof, for generating image signals from different wavelengths
- H04N23/61 — Control of cameras or camera modules based on recognised objects
- H04N23/843 — Camera processing pipelines; demosaicing, e.g. interpolating colour pixel values
- H04N25/134 — Arrangement of colour filter arrays [CFA] based on three different wavelength filter elements
Definitions
- the present invention relates to an imaging device orientation detection device that can acquire the relative positional relationship between the imaging device and the sun during shooting.
- the present invention also relates to an imaging device or a moving body that includes an imaging device orientation detection device.
- Image processing includes, for example, (i) correction of backlight, which has been a main cause of failed camera shots, (ii) increasing the resolution of an image, known as digital zoom, (iii) human face detection, and (iv) augmented reality, in which an image virtually created by computer graphics is superimposed and displayed on a real image.
- Japanese Patent Application Laid-Open No. 2004-228561 proposes a method of attaching a sensor to the upper part of a camera to detect a light source and instructing a photographer about a shooting direction.
- a photoelectric conversion element provided with a fisheye lens is used as a sensor for knowing the light source direction.
- a light source direction is determined by obtaining a position where the intensity of light collected from the whole sky is maximum on the sensor.
- However, when, for example, the sun reflected by a window is seen at a high position, a strong light source exists in a direction other than the actual sun direction, and solar direction detection may fail.
- Patent Document 2 proposes a method of using the whole sky polarization state in order to acquire information reflecting the sun position more accurately.
- the light source detection unit is installed on the upper part of the camera as in Patent Document 1.
- the light source detection unit performs all-sky polarization imaging using an all-sky observation sensor including a polarization filter.
- the polarization characteristics of the whole sky are acquired from a plurality of images taken by rotating the polarization filter, and the solar direction is determined therefrom.
- Sky polarization is also taken up in Non-Patent Document 1, where the polarization state of the sky is observed using a fish-eye camera capable of photographing a wide sky region, as in Patent Documents 1 and 2 described above.
- Although no concrete method is described, it mentions the possibility that the solar direction can be obtained from the polarization state.
- Patent Document 3 discloses a patterned polarizer for acquiring a plurality of polarized images having different polarization main axes.
- The sky polarization pattern is also described in Non-Patent Document 2, Non-Patent Document 3, and Non-Patent Document 4.
- In these approaches, an all-sky observation sensor is required on the upper part of the camera in addition to the image sensor, which is disadvantageous in terms of size. The camera may also become difficult to hold, inconveniencing the photographer.
- Moreover, the all-sky observation range includes the sun in most cases.
- Because sunlight is very strong, a mechanism for reducing the incident light component is required; it is not easy to let light that includes the sun enter the photoelectric conversion element for camera photography.
- The present invention has been made in consideration of the above problems, and an object thereof is to provide an apparatus capable of detecting the orientation of an imaging device by obtaining the relative positional relationship between the imaging device and the sun from the state of a partial sky in a scene image, without providing an all-sky observation sensor.
- Another object of the present invention is to provide a moving body (including a mobile terminal, a mobile phone, an automobile, and the like) that is provided with such an imaging device and has a function of detecting the orientation of the moving body.
- An imaging device orientation detection device of the present invention detects the orientation of an imaging device including an imaging unit that acquires, by shooting, a polarization image including a polarization phase image together with a luminance image. It comprises an image processing unit that generates, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of the blue sky region included in the luminance image; an orientation estimation unit that estimates the imaging device direction, determined by the orientation of the imaging unit, based on the blue sky polarization phase image; and an output unit that outputs information indicating the imaging device direction estimated by the orientation estimation unit.
- a solar position acquisition unit that acquires information regarding the position of the sun at the time of shooting is provided, and the direction estimation unit estimates the direction of the imaging device using the information.
- In a preferred embodiment, the apparatus includes an all-sky polarization map acquisition unit that acquires, based on information on the position of the sun, an all-sky polarization map indicating the polarization state of the sky at the time of shooting, and the direction estimation unit estimates the orientation of the imaging device based on the blue sky polarization phase image and the all-sky polarization map.
- In a preferred embodiment, the all-sky polarization map acquisition unit acquires the all-sky polarization map indicating the sky polarization state at the time of shooting from a database containing such maps.
- a storage device for storing the database is provided.
- a communication device accesses an external storage device that stores the database.
- the all-sky polarization map acquisition unit generates an all-sky polarization map indicating a sky polarization state at the time of photographing by calculation.
- the direction estimation unit calculates the direction of the blue sky region from the polarization phase of the blue sky region, and estimates the direction of the imaging device.
- In a preferred embodiment, the apparatus includes an all-sky polarization map acquisition unit that acquires an all-sky polarization map indicating the sky polarization state at the time of shooting, and the direction estimation unit operates in at least one of a search mode and a calculation mode: in the search mode, the direction of the blue sky region is searched for based on the blue sky polarization phase image and the all-sky polarization map, and in the calculation mode, the direction of the blue sky region is calculated from the polarization phase of the blue sky region.
- a horizontality correction unit that corrects the inclination of the imaging device is provided.
- the tilt of the imaging device includes a tilt in the roll direction.
- In a preferred embodiment, the imaging device includes a spirit level, acquires the horizontality with the spirit level, and corrects the inclination of the imaging device based on the acquired horizontality.
- the camera has an angle-of-view acquisition unit that acquires an angle of view of an imaging range and determines a range of a blue sky region based on the acquired angle of view.
- the imaging unit includes a plurality of polarizers having different polarization main axis angles, and acquires the polarization image according to light transmitted through the plurality of polarizers.
- the polarization image includes a polarization degree image in addition to the polarization phase image.
- In a preferred embodiment, the image processing unit cuts out the blue sky region using the degree of polarization when the sky polarization degree is equal to or higher than a reference value, cuts out the blue sky region using the hue when the degree of polarization is lower than the reference value, and outputs the blue sky polarization phase image.
- a reliability determination unit that determines the reliability of the estimation result and presents information to the user is provided.
- a solar altitude determination unit that determines whether or not estimation is possible according to the altitude of the sun obtained from information on the position of the sun at the time of shooting is provided.
- coordinate conversion is performed based on the altitude and orientation of the sun and the orientation of the imaging device to obtain the sun position in camera coordinates.
- An imaging device (camera) of the present invention includes an imaging unit that acquires a polarization image including a polarization phase image together with a luminance image, and any one of the imaging device orientation detection devices described above.
- A moving body of the present invention includes any one of the imaging device orientation detection devices described above, an imaging device having an imaging unit that acquires a polarization image including a polarization phase image together with a luminance image, and a moving body direction estimation unit that determines the direction of the moving body from the detected orientation of the imaging device according to the relationship between the orientation of the moving body and the orientation of the imaging device.
- A portable device of the present invention includes any one of the imaging device orientation detection devices described above, an imaging device having an imaging unit that acquires a polarization image including a polarization phase image together with a luminance image, and a portable device orientation estimation unit that determines the orientation of the portable device from the detected orientation of the imaging device according to the relationship between the orientation of the portable device and the orientation of the imaging device.
- An image input device of the present invention includes an imaging unit that acquires, by shooting, a polarization image including a polarization phase image together with a luminance image; an image processing unit that generates, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of the blue sky region included in the luminance image; an orientation estimation unit that estimates the imaging device direction, determined by the orientation of the imaging unit, based on the blue sky polarization phase image; and an output unit that outputs the data of the image captured by the imaging unit and information indicating the imaging device direction estimated by the orientation estimation unit.
- the image format of the present invention holds image data, data indicating the date and time of shooting, data indicating the longitude and latitude of the shooting location, and data indicating the orientation of the imaging device.
- In the imaging device orientation detection method of the present invention, a polarization image and a luminance image are obtained by the imaging device, a blue sky polarization phase image indicating the polarization phase of the blue sky region included in the luminance image is generated based on the polarization image and the luminance image, and the orientation of the imaging device is estimated based on the blue sky polarization phase image.
- A program of the present invention is a program for an imaging device orientation detection device that detects the imaging device orientation at the time of shooting using the sky polarization pattern. It causes a computer to execute the steps of: acquiring a polarization image and a luminance image with the imaging device; generating, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of the blue sky region included in the luminance image; estimating the imaging device direction based on the blue sky polarization phase image; and outputting information indicating the orientation of the imaging device.
- According to the present invention, the orientation of an imaging device or a moving body can be acquired using partial polarization information of the sky, without providing an all-sky observation sensor.
- FIG. 3 is a schematic diagram illustrating the basic configuration of the color polarization acquisition unit 301.
- (a) and (b) are schematic diagrams of part of the imaging surface of the color polarization acquisition unit viewed from directly above.
- (a), (b), and (c) are graphs schematically showing the wavelength characteristics of the B, G, and R polarization pixels, respectively; a graph showing the transmission region and polarization separation region of the G color filter; and a graph showing the observed luminance versus the polarization main axis angle.
- (a), (b), and (c) are a polarization degree image, a polarization phase image, and a color image (luminance image), respectively, photographed by the scene image / scene polarization image acquisition unit, and (d) is a schematic diagram of the color image.
- (a) and (b) are schematic diagrams of the roll horizontality correction.
- (a)–(f) are figures showing the processing results when this method is actually applied to a daytime east-sky scene image.
- (a)–(d) are figures showing the final processing results for the preceding image.
- (a)–(f) are figures showing the processing results (a failure case) when this method is actually applied to an evening east-sky scene image.
- (a)–(d) are figures showing the processing results when this method is applied to an evening east-sky scene image using hue similarity.
- (a) and (b) are block diagrams showing the structure of the camera direction acquisition unit.
- (a)–(c) are conceptual diagrams showing the relationship between the all-sky polarization map and the blue sky polarization phase image.
- (a) and (b) are schematic diagrams showing a celestial coordinate system; a schematic diagram showing the relationship between a captured image and the camera; a schematic diagram showing the relationship between the camera, the image coordinate system, and the celestial sphere coordinate system; a block diagram showing the structure of the image input device in the second embodiment of the present invention; and a schematic diagram showing the blue sky polarization phase image.
- The present invention pays attention to the fact that the blue sky is polarized, and was completed based on the finding that the orientation of an imaging device can be estimated from the polarization information of the blue sky region included in a scene image by using the sky polarization pattern.
- As shown in FIG. 1A, consider a case where an imaging device (referred to as a "camera" for simplicity) 10 takes a landscape photograph outdoors in the line-of-sight direction (z-axis direction).
- As shown in FIG. 1B, it is assumed that a part of the blue sky (a blue sky region) is included in the captured scene image.
- The blue sky region in the scene image of FIG. 1B is shaded with oblique lines, which schematically represent the polarization phase of the blue sky.
- the polarization phase is an angle (phase angle) indicating the polarization main axis direction, and is defined by a rotation angle around the camera viewing direction (z-axis direction).
- The direction (phase angle) of the polarization phase is not directly visible to human vision; it is information that does not appear in a normal image (luminance image).
- When a polarizing filter is arranged in front of the camera 10 and rotated around the camera viewing direction (z-axis direction) shown in FIG. 1A, the light (polarized light) coming from the blue sky is transmitted with the highest transmittance at a specific rotation angle.
- The transmission axis direction (angle) of the polarizing filter at that point corresponds to the polarization direction (phase angle) of the blue sky region located ahead of the camera viewing direction (z axis).
- FIG. 1C and FIG. 1D are diagrams showing examples of the polarization phase pattern of the whole sky, and the polarization phase patterns of the hemispherical whole sky are shown in the respective circles. The center of this circle corresponds to the zenith and the outer periphery corresponds to the horizon.
- In FIG. 1C and FIG. 1D, a large number of curves are drawn; the tangential direction at an arbitrary position on each curve indicates the direction (angle) of the polarization phase at that position.
- In the simplest model, this all-sky polarization pattern has polarization directions concentric about the sun, but the real sky also has four neutral points with unique polarization characteristics. This is described in detail in Non-Patent Document 4.
- The polarization of the blue sky changes according to the position of the sun in the sky. For this reason, when the position of the sun in the sky is obtained from information such as the shooting date and time and the shooting position (longitude, latitude), the polarization phase pattern of the sky at that time is determined.
- The polarization phase at an arbitrary position in the sky can be obtained by calculation if the shooting date/time and position are given; alternatively, a map associating positions in the sky with polarization phases (an all-sky polarization map) can be stored in a storage device.
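As an illustration of the "by calculation" route, the sketch below computes the degree of polarization and the E-vector direction for one sky point from the sun position, using the standard single-scattering Rayleigh model. This is a textbook idealization rather than the patent's own formulation: the function names, the coordinate convention, and the ideal-atmosphere assumption (maximum degree of polarization 1, no neutral points) are all ours.

```python
import numpy as np

def direction(azimuth_deg, elevation_deg):
    """Unit vector for a sky direction (x = east, y = north, z = up)."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return np.array([np.cos(el) * np.sin(az),
                     np.cos(el) * np.cos(az),
                     np.sin(el)])

def rayleigh_sky_point(sun_az, sun_el, view_az, view_el):
    """Degree of polarization and E-vector for one sky point (angles in degrees)."""
    s = direction(sun_az, sun_el)
    v = direction(view_az, view_el)
    cos_gamma = np.clip(np.dot(s, v), -1.0, 1.0)      # scattering angle gamma
    # Ideal Rayleigh degree of polarization (rho_max assumed to be 1).
    rho = (1.0 - cos_gamma**2) / (1.0 + cos_gamma**2)
    # The E-vector is perpendicular to the scattering plane spanned by v and s.
    e_vec = np.cross(v, s)
    e_vec /= np.linalg.norm(e_vec)                    # undefined if v == s
    return rho, e_vec

# Sampling many (view_az, view_el) pairs yields the kind of all-sky
# polarization map that the direction estimation unit compares with the
# blue sky polarization phase image.
rho, e = rayleigh_sky_point(sun_az=180.0, sun_el=30.0, view_az=90.0, view_el=45.0)
```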
- FIG. 1E is a diagram showing an example of the positional relationship between the camera viewing direction (z-axis direction) and the sun.
- a rectangular range (image area) of the captured scene image is schematically shown at the tip of the camera viewing direction (z-axis direction).
- The arrows drawn in this rectangular range indicate the polarization direction (polarization phase direction) of the blue sky region lying in that direction. Since the polarization phase of the blue sky at a given date and time varies with the position in the sky, when the camera direction changes, the polarization phase direction of the captured blue sky region also changes.
- According to one aspect of the present invention, sky polarization information is acquired from a database or by calculation, the polarization state of the blue sky region included in the shooting scene is detected, and the two are compared to obtain the camera direction and the relationship between the sun position and the camera. According to another aspect, the direction of the blue sky region can also be obtained by calculation without using a database, and the camera direction (imaging device direction) can thereby be estimated.
- FIG. 1F shows a configuration of the image input apparatus according to the present embodiment.
- This image input device includes a blue sky polarization phase image acquisition unit 100, a camera orientation estimation unit 101, and an output unit 102.
- the blue sky polarization phase image acquisition unit 100 includes a scene image / scene polarization image acquisition unit 100a, a roll horizontality correction unit 100b, and a blue sky polarization image processing unit 100c, and outputs a blue sky polarization phase image ⁇ sky.
- the “polarized image” means an image in which each of a plurality of pixels constituting the image displays polarization information of the pixel.
- The polarization information includes the degree of polarization and the polarization phase (phase angle). Therefore, unless otherwise specifically limited, "polarization image" is a generic term for the "polarization degree image", in which the degree of polarization of each pixel is displayed two-dimensionally, and the "polarization phase image", in which the polarization phase of each pixel is displayed two-dimensionally.
- the degree of polarization and the magnitude (numerical value) of the polarization phase of each pixel can be expressed by the brightness or hue of the pixel.
- the degree of polarization and the magnitude of the polarization phase are expressed by the level of brightness.
- Rotation 201 around an axis (here, x axis) extending to the side of the camera is yaw.
- the rotation 202 around the axis extending in the vertical direction of the camera (here, the y axis) is the pitch.
- a rotation 203 around an axis (here, z-axis) extending in the front-rear direction of the camera is a roll.
- Only the scene image / scene polarization image acquisition unit 100a and the spirit level that measures the roll-direction inclination need to be provided in the camera of FIG. 2; the roll horizontality correction unit 100b, the blue sky polarization image processing unit 100c, the camera orientation estimation unit 101, and the output unit 102 may be provided outside the camera.
- the camera includes an imaging unit that functions as the scene image / scene polarized image acquisition unit 100a, the content of the scene image / scene polarized image that is captured changes depending on the orientation of the camera.
- the series of processing for estimating the camera direction is preferably executed inside the camera, but it is not necessarily executed inside the camera.
- In this specification, an apparatus that includes an "imaging unit" for acquiring a luminance image and a polarization image, and whose imaging direction can be changed by the user, is referred to as a "camera".
- A device that estimates the camera orientation is referred to as a "camera orientation detection device" or an "imaging device orientation detection device", regardless of whether or not it is built into the camera.
- An apparatus including both the "imaging unit" and the "imaging device orientation detection device (camera orientation detection device)" is referred to as an "image input device". The data of the scene image and the scene polarization image is sent from the "imaging unit" to the "imaging device orientation detection device (camera orientation detection device)" through an information transmission medium.
- An apparatus provided with such an “imaging apparatus orientation detection apparatus” is not limited to an imaging apparatus such as a camera.
- For example, a mobile device such as a notebook computer or a mobile phone may be provided with an "imaging device" and an "imaging device orientation detection device".
- a moving body such as an automobile or a motorcycle may be provided with an “imaging device” and an “imaging device orientation detection device”.
- The orientation of the mobile device or moving body need not match the orientation of the imaging device it carries. Since the orientation of the mobile device or moving body and the orientation of the imaging device are in a predetermined relationship, once the orientation of the imaging device is detected, the orientation of the mobile device or moving body can be obtained from it.
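As a minimal illustration: if the camera is mounted on the body with a known, fixed azimuth offset (the values below are assumptions, purely for illustration), the body orientation follows from the detected camera orientation by simple angle arithmetic.

```python
# Hypothetical sketch: fixed mounting offset between camera and body.
camera_heading_deg = 120.0    # detected by the orientation detection device
mounting_offset_deg = 90.0    # camera assumed to point 90 deg left of the nose
body_heading_deg = (camera_heading_deg - mounting_offset_deg) % 360.0
```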
- the “camera” in the present specification is not limited to a so-called general camera that is usually taken by a person with a hand.
- An imaging device provided in a moving body such as an automobile is also included in the “camera”.
- the configuration of the scene image / scene polarized image acquisition unit 100a of the present embodiment will be described with reference to FIG.
- the scene image and the scene polarization image are preferably acquired at the same time, but may be acquired at intervals up to several seconds.
- Patent Literature 3 discloses a technique for simultaneously acquiring a monochrome image and a polarization image.
- a patterned polarizer having a plurality of different polarization main axes (transmission axes) is spatially arranged on an image sensor in order to simultaneously acquire a luminance image and a partially polarized image of a subject.
- a photonic crystal or a structural birefringent wave plate array is used as the patterned polarizer.
- these techniques cannot obtain a color image and a polarized image at the same time.
- The scene image / scene polarization image acquisition unit 100a in FIG. 3 acquires color image information of the subject in real time and simultaneously acquires polarization image information, outputting two kinds of polarization images (a polarization degree image ρ and a polarization phase image φ).
- the color polarization acquisition unit 301 can acquire both color moving image information and polarization image information in real time.
- Signals indicating the color moving image information and the polarization image information are output from the color polarization acquisition unit 301 and provided to the color information processing unit 302 and the polarization information processing unit 303, respectively.
- the color information processing unit 302 and the polarization information processing unit 303 perform various processes on the signal, and output a color image C, a polarization degree image ⁇ , and a polarization phase image ⁇ .
- FIG. 4 is a schematic diagram showing a basic configuration of the color polarization acquisition unit 301.
- the color filter 401 and the patterned polarizer 402 are disposed so as to overlap the front surface of the image sensor pixel 403.
- Incident light passes through the color filter 401 and the patterned polarizer 402 to reach the image sensor, and the luminance is observed by the image sensor pixel 403.
- FIG. 5A is a diagram of a part of the imaging surface of the color polarization acquisition unit 301 viewed from directly above the optical axis direction.
- FIG. 5A shows only 16 pixels (4 ⁇ 4) on the imaging surface for simplicity.
- the four rectangular areas 501 to 504 shown in the figure indicate the corresponding portions of the Bayer type color mosaic filter installed on the four pixel cells, respectively.
- a rectangular area 501 is a B (blue) filter area and covers the pixel cells B1 to B4.
- B (blue) patterned polarizers having different polarization main axes are in close contact with the pixel cells B1 to B4.
- the “polarization main axis” is an axis parallel to the polarization plane (transmission polarization plane) of light transmitted through the polarizer.
- In each same-color pixel region, polarizer units (micro-polarizers) whose transmission polarization planes have different angles are arranged adjacent to each other. More specifically, four types of polarizer units with different transmission polarization plane directions are arranged within the pixels of each same color of R (red), G (green), and B (blue).
- One polarizer unit corresponds to four finely polarized pixels.
- a code such as G1 is given to each polarization pixel.
- FIG. 5B shows the polarization main axes assigned to the four finely polarized pixels with which the patterned polarizer for B (blue) is in close contact.
- the straight line described in each finely polarized pixel schematically shows the polarization main axis direction of the minute polarizing plate.
- a position indicated by reference numeral “505” indicates a virtual pixel position in which four pixels in the rectangular area 501 in the imaging system are collectively displayed.
- The patterned polarizers of the rectangular regions 502 to 504 are likewise divided into portions having four different polarization main axes.
- each color pixel includes a plurality of finely polarized pixels having different polarization principal axes, and the color mosaic arrangement itself is arbitrary.
- each finely polarized pixel is referred to as a “polarized pixel”.
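For illustration, the following sketch shows how the four polarization samples of one color cell could be indexed out of the 4×4 mosaic of FIG. 5. The assignment of main-axis angles to pixel positions is an assumption of this sketch; the text does not fix the exact layout.

```python
import numpy as np

raw = np.random.randint(0, 256, (4, 4)).astype(float)  # stand-in sensor data

# B (blue) cell assumed to occupy rows 0-1, cols 0-1 (pixels B1..B4),
# with assumed main-axis angles of 0, 45, 90 and 135 degrees.
I_0   = raw[0, 0]   # main axis   0 deg (assumed position)
I_45  = raw[0, 1]   # main axis  45 deg (assumed position)
I_90  = raw[1, 0]   # main axis  90 deg (assumed position)
I_135 = raw[1, 1]   # main axis 135 deg (assumed position)
samples = [(0, I_0), (45, I_45), (90, I_90), (135, I_135)]  # (psi_i, I_i) pairs
```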
- 6A to 6C are graphs schematically showing the wavelength characteristics of B (blue), G (green), and R (red) polarized pixels, respectively.
- the vertical axis of each graph is the intensity of transmitted light, and the horizontal axis is the wavelength.
- The polarization pixels for B, G, and R have polarization characteristics that transmit the TM wave (Transverse Magnetic wave) and reflect (do not transmit) the TE wave (Transverse Electric wave) in the B, G, and R wavelength bands, respectively.
- the TM wave is a wave whose magnetic field component is transverse to the incident surface
- the TE wave is a wave whose electric field component is transverse to the incident surface.
- FIG. 6A shows the polarization characteristics 602 and 603 of the B polarization pixel and the transmission characteristic 601 of the B color filter.
- Polarization characteristics 602 and 603 indicate the transmittances of the TM wave and the TE wave, respectively.
- FIG. 6B shows the polarization characteristics 605 and 606 of the G polarization pixel and the transmission characteristic 604 of the G color filter.
- Polarization characteristics 605 and 606 indicate the transmittances of the TM wave and the TE wave, respectively.
- FIG. 6C shows the polarization characteristics 608 and 609 of the R polarization pixel and the transmission characteristic 607 of the R color filter.
- Polarization characteristics 608 and 609 indicate the transmittances of the TM wave and the TE wave, respectively.
- The characteristics of FIGS. 6A to 6C can be realized by using, for example, a photonic crystal as described in Patent Document 3.
- In the case of a photonic crystal, light whose electric field vector oscillation plane is parallel to the grooves formed on its surface is the TE wave, and light whose electric field vector oscillation plane is perpendicular to the grooves is the TM wave.
- An important point in this embodiment is to use a patterned polarizer that exhibits polarization separation characteristics in each of the B, G, and R transmission wavelength bands, as shown in FIGS. 6A to 6C.
- FIG. 7 shows a case where the transmission range of the G color filter and the polarization separation range determined by the polarization characteristics 6101 and 6102 are shifted in wavelength. A polarizer exhibiting such characteristics cannot perform the operation intended by the present invention.
- It is desirable that the luminance dynamic range and the number of bits of the image sensor be as large as possible (for example, 16 bits), in order to reliably acquire both the polarization component contained in particularly bright specular reflection portions of the subject and the polarization component contained in its shadow regions.
- The luminance information acquired for each polarization pixel with the configuration shown in FIG. 5 is processed by the polarization information processing unit 303 of FIG. 3. This process is described below.
- Let Ii be the observed luminance when the rotation angle ψ of the polarization main axis is ψi.
- Here, i is an integer from 1 to N, where N is the number of samples; in the example described here, N = 4.
- FIG. 8 shows luminances 701 to 704 corresponding to the 4-pixel samples ( ⁇ i, Ii).
- The relationship between the polarization main axis angle ψi and the luminances 701 to 704 is expressed by a sinusoidal curve.
- In FIG. 8, the four luminance points 701 to 704 are drawn so as to lie on one sinusoidal curve.
- When the sinusoid is determined from a larger number of observed luminances, some of them may deviate slightly from the curve, but this causes no problem.
- polarization information in this specification means amplitude modulation degree and phase information in a sinusoidal curve indicating the dependence of luminance on the polarization principal axis angle.
- For each of the same-color regions 501 to 504 shown in FIG. 5, the reflected light luminance I with respect to the main axis angle ψ of the patterned polarizer is approximated, using the four internal pixel luminance values, as

  I(ψ) = A · sin 2(ψ − B) + C  (Equation 1)

- Here, A, B, and C are constants representing the amplitude, phase, and average value of the polarization luminance variation curve, respectively.
- In this formulation, B takes a negative value.
- (Equation 1) can be expanded as

  I(ψ) = a · sin 2ψ + b · cos 2ψ + C  (Equation 2)

  where A and B are recovered from a and b by

  A = √(a² + b²)  (Equation 3)
  cos 2B = a / A, sin 2B = −b / A  (Equation 4)

  The relationship between the luminance I and the polarization main axis angle ψ is approximated by the sinusoid of (Equation 1) by finding the A, B, and C that minimize the squared error

  E = Σi (Ii − a · sin 2ψi − b · cos 2ψi − C)²  (Equation 5)

- The three parameters A, B, and C of the sinusoidal approximation are determined for each color.
- a polarization degree image showing the polarization degree ⁇ and a polarization phase image showing the polarization phase ⁇ are obtained.
- the degree of polarization ⁇ represents the degree to which the light of the corresponding pixel is polarized
- the polarization phase ⁇ represents the principal axis angle of the partial polarization of the light of the corresponding pixel.
- Since the main axis of polarization is the same at angles that differ by 180° (π), the phase φ is defined in the range 0 ≤ φ ≤ π.
- The values ρ and φ (0 ≤ φ ≤ π) are calculated by the following (Equation 6) and (Equation 7), respectively:

  ρ = (Imax − Imin) / (Imax + Imin) = A / C  (Equation 6)
  φ = B + π/4 (mod π)  (Equation 7)
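The fit and the conversion to (ρ, φ) can be condensed into the following sketch for four main-axis angles at 0°, 45°, 90°, and 135°. With this equally spaced sampling the least-squares solution has a closed form; the sampling angles are an assumption of this sketch, and the notation does not reproduce the patent's equation images.

```python
import numpy as np

def fit_polarization(I0, I45, I90, I135):
    """Fit I(psi) = C + a*sin(2*psi) + b*cos(2*psi) to four samples."""
    C = (I0 + I45 + I90 + I135) / 4.0          # average luminance
    a = (I45 - I135) / 2.0                     # sin(2*psi) coefficient
    b = (I0 - I90) / 2.0                       # cos(2*psi) coefficient
    A = np.hypot(a, b)                         # modulation amplitude
    delta = np.arctan2(b, a)                   # I = C + A*sin(2*psi + delta)
    phi = ((np.pi / 2 - delta) / 2.0) % np.pi  # main-axis angle, 0 <= phi < pi
    rho = A / C if C > 0 else 0.0              # degree of polarization (Eq. 6)
    return rho, phi

# Example: samples consistent with a partially polarized pixel.
rho, phi = fit_polarization(120.0, 150.0, 100.0, 70.0)
```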
- the patterned polarizer of this embodiment may be a photonic crystal, a film-type polarizing element, a wire grid type, or a polarizing element based on other principles.
- the color information processing unit 302 illustrated in FIG. 3 calculates color luminance using information output from the color polarization acquisition unit 301.
- the luminance of the light transmitted through the polarizer is different from the original luminance of the light before entering the polarizer.
- A value obtained by averaging the observed luminances over all polarization main axes corresponds to the original luminance of the light before entering the polarizer.
- When the observed luminance at polarization pixel R1 is written as IR1, and likewise for R2 to R4, the color luminance can be calculated by the following (Equation 8):

  IR = (IR1 + IR2 + IR3 + IR4) / 4  (Equation 8)
- a normal color mosaic image can be generated by obtaining the luminance in each polarization pixel.
- a color image C is generated by converting each pixel into a color image having RGB pixel values based on the mosaic image. Such conversion is realized by using a known interpolation technique such as a Bayer mosaic interpolation method.
- The luminance and polarization information of each pixel in the color image C, the polarization degree image ρ, and the polarization phase image φ are all obtained using the four polarization pixels shown in FIG. 5. Each value can therefore be regarded as a representative value at the virtual pixel point 505 located at the center of those four polarization pixels. Accordingly, the resolution of the color image and the polarization image is reduced to 1/2 × 1/2 of the resolution of the original single-plate color image sensor, so it is desirable that the number of pixels of the image sensor be as large as possible.
- FIG. 9 is an image with a scene of a distant building as a subject.
- the polarization degree image ⁇ in FIG. 9A represents the intensity of polarization by the brightness of a pixel. The higher the brightness of a pixel (white), the higher the polarization degree of the pixel.
- the polarization phase image ⁇ in FIG. 9B represents the angle of the polarization phase by brightness.
- the polarization phase is expressed by assigning values from 0 to 180 degrees to the brightness. Note that since the phase angle has periodicity, the phase angles of white and black on the polarization phase image are actually continuous.
- FIG. 9C is a normal RGB color luminance image. In this drawing, however, hue is not expressed; only the luminance of each pixel is shown, as a monochrome luminance image expressed by brightness.
- FIG. 9D is a schematic diagram corresponding to the image of FIG. 9. Although it is difficult to see in the photograph, 801 is the sky, 802 is a cloud, 803 is a building, 804 is vegetation, and 805 is part of the camera stand.
- In this embodiment, the captured scene is corrected so that the horizon is horizontal in the screen.
- This tilt correction of the photographed scene image / scene polarization image is performed by the roll horizontality correction unit 100b (FIG. 1F).
- Specifically, the inclination around the camera optical axis is corrected; that is, the image is rotated by Δr around the roll direction 203 so that the x-axis of FIG. 2 becomes horizontal with respect to the ground.
- When the camera is tilted in the roll direction, a tilted image is obtained, as schematically shown in FIG. 10A.
- A vertical line 9011 with respect to the ground and a horizontal line 902 with respect to the ground are defined as shown in FIG. 10A.
- The camera x-axis of FIG. 2 can be regarded as equivalent to the x-axis of FIG. 10A.
- The inclination of the horizontal line 902, that is, of the camera x-axis with respect to the ground, is Δr. This inclination Δr is therefore first detected by a spirit level installed in the camera.
- the level to be mounted may be any level as long as it can be mounted inside the camera as disclosed in, for example, Japanese Patent Application Laid-Open No. 2007-240832.
- Next, the polarization phase of the polarization phase image is corrected using the inclination Δr obtained from the spirit level. The corrected polarization phase φnew is obtained by correcting the φ of (Equation 7) using (Equation 9) below:

  φnew = (φ + Δr) mod π  (Equation 9)
- FIG. 10B shows an image after tilt correction.
- The vertical line 9011 of FIG. 10A before correction is rotated by Δr into a new vertical line 9012, orthogonal to the horizontal line 902 of the image, which is parallel to the X′ axis of FIG. 10B.
- As a result of rotating the image by the angle Δr, the coordinates (Xr, Yr) of the pixel 903 in FIG. 10A are converted into the coordinates (Xr′, Yr′) given by (Equation 10):

  Xr′ = Xr · cos Δr − Yr · sin Δr, Yr′ = Xr · sin Δr + Yr · cos Δr  (Equation 10)

- The pixel having coordinates (Xr′, Yr′) is shown as pixel 904 in FIG. 10B.
- Since it is only necessary to know the polarization phase angle of the sky region with respect to the horizon, this correction may instead be performed after the cloud region is removed, or the all-sky polarization map may be converted into camera coordinates in the subsequent calculation by the camera direction estimation unit.
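A sketch combining the phase shift of (Equation 9) with the coordinate rotation of (Equation 10). The sign conventions are assumptions chosen for illustration, since this text gives the equations only by reference number.

```python
import numpy as np

def correct_roll(phi, xr, yr, delta_r):
    """Roll correction: phi in radians, (xr, yr) pixel coords, delta_r in radians."""
    # (Equation 9) analogue: shift every polarization phase by the roll angle,
    # keeping it in [0, pi) since the phase has a 180-degree period.
    phi_new = (phi + delta_r) % np.pi

    # (Equation 10) analogue: rotate pixel coordinates by delta_r so that the
    # image horizon becomes parallel to the X' axis.
    c, s = np.cos(delta_r), np.sin(delta_r)
    xr_new = c * xr - s * yr
    yr_new = s * xr + c * yr
    return phi_new, xr_new, yr_new

phi_new, x_new, y_new = correct_roll(phi=1.2, xr=100.0, yr=50.0,
                                     delta_r=np.radians(5))
```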
- The blue sky polarization image processing unit 100c receives the polarization degree image ρ, the polarization phase image φ, and the color image C, and outputs the blue sky polarization phase image φsky.
- The blue sky polarization phase image φsky is used for estimating the camera direction and the sun direction from the scene.
- the polarization degree binarization unit 1001 binarizes the polarization degree image ⁇ with a threshold value T ⁇ .
- the luminance conversion unit 1002 converts the color image C into a luminance image Y.
- the luminance binarization units 1003 and 1004 binarize the converted luminance image Y using threshold values TC1 and TC2.
- The image calculation unit 1005 performs an AND (logical product) operation on the polarization degree image ρ′ binarized by the polarization degree binarization unit 1001 and the luminance image C1′ binarized by the luminance binarization unit 1003, and outputs the mask image A′.
- the hue similarity conversion unit 1006 performs HSV conversion on the color image C, and outputs a hue similarity image h representing the hue similarity with the sky blue hue.
- the hue similarity binarization unit 1007 performs threshold processing on the hue similarity image h with the threshold value TH, and outputs an image h ′ in which only the sky hue region is extracted.
- The image calculation unit 1008 performs a logical AND operation on the luminance image C2′ binarized by the luminance binarization unit 1004 and the hue image h′ binarized by the hue similarity binarization unit 1007.
- The output selection unit 1009 determines, based on the output ρd of the polarization degree determination unit 1010, which of the first blue sky region mask A′, generated from the binarized luminance and polarization degree images C1′ and ρ′, and the second blue sky region mask B′, generated from the binarized luminance and hue similarity images C2′ and h′, is to be used.
- the image calculation unit 1011 performs a logical AND operation on the adopted blue sky region mask Msky and the polarization phase image ⁇ to generate a blue sky polarization phase image ⁇ sky.
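The mask logic of FIG. 11A can be condensed as follows. The threshold values, the inequality directions, and the array representation are illustrative assumptions; ρ, Y, and h are taken to be per-pixel images normalized to [0, 1].

```python
import numpy as np

def blue_sky_phase_image(rho, Y, h, phi, T_rho=0.14, TC1=0.4, TC2=0.2, TH=0.8,
                         use_polarization=True):
    """rho/Y/h/phi: 2-D arrays. Returns phi masked to the blue sky region."""
    if use_polarization:                    # daytime: sky polarization is high
        mask = (rho >= T_rho) & (Y >= TC1)  # A' = rho' AND C1'
    else:                                   # morning west / evening east sky
        mask = (h >= TH) & (Y >= TC2)       # B' = h' AND C2'
    return np.where(mask, phi, np.nan)      # phi_sky: phase kept only in mask

# Tiny synthetic example.
rho = np.random.rand(4, 4); Y = np.random.rand(4, 4)
h = np.random.rand(4, 4);   phi = np.random.rand(4, 4) * np.pi
phi_sky = blue_sky_phase_image(rho, Y, h, phi)
```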
- As a blue sky region detection method, there is a method of searching a color image for a flat region whose hue is similar to blue.
- There is also a method in which the sky, including cloudy sky, is obtained stochastically from color information and texture information.
- When color information is used, however, there is a problem that the sky or clouds cannot be distinguished (i) when the hue gradually changes from blue through magenta to red, as in a sunset sky, or (ii) when a building on the ground is blue or white.
- Alternatively, the sky can be detected using only monochrome luminance information, without explicitly using color information, which varies widely due to physical factors.
- For example, the region with the highest luminance in a scene image may be assumed to be the sky. In experiments, sky detection based on this assumption gave reasonably good results for cloudy or sunset skies. In fine weather, however, the luminance of specular reflection from buildings on the ground was often higher than the luminance of the sky, and good results were not obtained. This is thought to be because, in fine weather, artifacts (buildings) are illuminated by the blue sky from all directions, rather than by direct specular reflection of sunlight, causing stronger-than-expected specular reflection on their smooth surfaces.
- the blue sky region is detected using the degree of polarization of the scene in addition to the luminance.
- This utilizes the fact that the degree of polarization of the sky in fine daylight is very high near the horizon.
- In the literature, the polarization state of the whole sky has been recorded every hour for the 12 hours from morning (sunrise) to evening (sunset). According to those observations, except near the east-west direction in the morning and evening, the degree of polarization of the sky is strong near the horizon for most of the day. In experiments, this sky polarization is often stronger than the polarization of distant views such as mountains and of artifacts such as buildings on the ground, and can therefore be an effective means of sky detection.
- the roofs and glass of buildings on the ground are also very strongly polarized.
- Therefore, a mask using thresholds for both the degree of polarization and the luminance may be generated to detect and remove such regions.
- However, the degree of polarization near the horizon is low in the east-west direction, which is the path of the sun; in addition, the west sky in the morning and the east sky in the evening have low luminance, so this technique often cannot be applied in the morning and evening. In such cases, hue and luminance may be used for detection; details are described later.
- the operation of the blue sky polarization image processing unit 100c having the configuration of FIG. 11A will be described with reference to FIG. 12 showing an actual scene image.
- Note that the actual scene image has a circular imaging range; this is due to lens vignetting in the camera device used for the experiment, and the image may be regarded as rectangular.
- Depending on conditions, the blue sky polarization image processing unit 100c can be realized by the minimum configuration 1012 surrounded by a broken line in FIG. 11A. First, the operation of the minimum configuration 1012 will be described with reference to FIGS. 12 and 13.
- FIG. 12A shows a polarization degree image ⁇ of a scene image.
- FIG. 12B is a diagram schematically showing the content of the polarization degree image ρ of FIG. 12A.
- the scene image includes a sky region 1101, a building region 1102, a cloud region 1103, a ground region 1104, and a camera stand 1105.
- The result of processing the polarization degree image ρ with the polarization degree binarization unit 1001 is the binarized image ρ′.
- Here, the binarization threshold Tρ is 0.14.
- The binarization threshold Tρ is determined from the polarization degree histogram. In this scene, the sky region 1101 and the landscape on the ground, such as the buildings 1102 and the ground 1104, separate into a high polarization region and a low polarization region, forming a bimodal distribution. The intermediate value between the two peaks of the polarization degree histogram is taken as the threshold Tρ.
- Tρ is a threshold for the degree of polarization and satisfies 0 ≤ Tρ ≤ 1.
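A sketch of this threshold selection. The peak picking is deliberately simplified (a practical implementation would smooth the histogram and check that two well-separated peaks actually exist), and the bin count is an arbitrary choice.

```python
import numpy as np

def bimodal_threshold(rho, bins=64):
    """Midpoint between the two highest local maxima of the rho histogram."""
    hist, edges = np.histogram(rho.ravel(), bins=bins, range=(0.0, 1.0))
    centers = (edges[:-1] + edges[1:]) / 2.0
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    top2 = sorted(peaks, key=lambda i: hist[i], reverse=True)[:2]
    return centers[top2].mean()   # intermediate value between the two peaks

# Synthetic bimodal example: low-polarization ground vs. high-polarization sky.
rho = np.clip(np.concatenate([np.random.normal(0.05, 0.02, 1000),
                              np.random.normal(0.30, 0.05, 1000)]), 0, 1)
T_rho = bimodal_threshold(rho)
```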
- The cloud region 1103 to the right of the building is also removed, because its degree of polarization is low. Only the black camera mount 1105 at the bottom, which is strongly polarized, cannot be removed and remains.
- FIG. 12D shows a luminance image Y obtained by processing the color image C of the scene image by the luminance conversion unit 1002.
- In this scene, the luminance of the sky region 1101 and the luminance of the building 1102 are almost equal, and separation by luminance alone is difficult. Even in such a case, the building region is removed by setting the thresholds TC1 and TC2 appropriately.
- The thresholds TC1 and TC2 in this embodiment are normalized to satisfy 0 ≤ TC1 ≤ 1 and 0 ≤ TC2 ≤ 1 in order to evaluate the luminance value.
- For an 8-bit image, luminance values from 0 to 255 are normalized to values from 0 to 1 and compared with the thresholds TC1 and TC2.
- For a 16-bit image, luminance values from 0 to 65535 are normalized to values from 0 to 1 and then compared with the thresholds TC1 and TC2.
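For example, a minimal sketch of this normalization (the bit depths follow the text; the function name is ours):

```python
import numpy as np

def normalize_luminance(y_raw, bit_depth):
    """Map raw luminance to [0, 1] so it can be compared with TC1 / TC2."""
    full_scale = (1 << bit_depth) - 1   # 255 for 8-bit, 65535 for 16-bit
    return y_raw.astype(np.float64) / full_scale

y = normalize_luminance(np.array([0, 12800, 65535], dtype=np.uint16), 16)
```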
- FIG. 13B is a schematic diagram of the same scene image, and FIG. 13C shows the mask image A′.
- the blue sky polarization phase image ⁇ sky of FIG. 13D is obtained by the above processing.
- the polarization phase may not be disturbed even in the cloud region, particularly when the cloud is thin.
- Therefore, the blue sky region may include clouds. Whether or not the polarization phase is disturbed by a cloud can be judged from how much the degree of polarization decreases in the cloud region. The method of this embodiment, which determines the blue sky region based on polarization, has the advantage that only cloud regions with a low degree of polarization are automatically removed.
- FIGS. 14A to 14F show a scene polarization degree image ρ, a schematic diagram of the scene image, a binarized scene polarization degree image ρ′, a scene luminance image Y, and binarized scene luminance images, respectively. As shown in FIG. 14B, this scene image includes a sky region 1201, a building region 1202, a ground region 1203, and a camera stand 1204.
- The polarization degree determination unit 1010 shown in FIG. 11A is used in such a case.
- The average degree of polarization is obtained from the polarization degree histogram of the scene polarization degree image ρ.
- When the average does not reach a threshold Tρ1 (for example, 0.1), the polarization-based mask is not adopted, and the processing switches to the method using hue and luminance.
- the process will be described with reference to FIGS. 11A and 15.
- The hue similarity conversion unit 1006 converts the color image C into a hue similarity image by obtaining the hue angle error, that is, the difference between the hue angle of blue, taken as the sky hue, and the hue angle of the color image C.
- The sky color is taken to be blue because processing using this color image is applied only when the degree of polarization of the sky is low and the luminance is low, that is, in the morning west sky or the evening east sky; under these conditions the sky color can be regarded as blue.
- In this embodiment, the sky hue angle is set to Hsky = 254°.
- Let Htest be the hue angle of an input scene pixel, obtained by the RGB-to-hue conversion of the Hue-Saturation-Lightness color space: Htest = RGB_to_H(C).
- Since the hue angle has a period of 360 degrees, the hue similarity ΔH is expressed by the formula

  ΔH = min(|Htest − Hsky|, 360° − |Htest − Hsky|)
- the hue similarity image h is subjected to threshold processing by the hue similarity binarization unit 1007, whereby a blue sky region candidate mask image h ′ is obtained.
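A sketch of this computation, working directly with the wrap-around hue difference ΔH; the threshold value is an assumption, expressed in degrees.

```python
import numpy as np

HSKY_DEG = 254.0   # sky hue angle used in this embodiment

def sky_hue_mask(Htest_deg, TH_deg=30.0):
    """True where the pixel hue is within TH_deg of the sky hue (mod 360)."""
    d = np.abs(Htest_deg - HSKY_DEG)
    dH = np.minimum(d, 360.0 - d)   # wrap-around hue difference
    return dH <= TH_deg             # candidate blue sky region mask h'

mask = sky_hue_mask(np.array([250.0, 260.0, 10.0]))  # -> [True, True, False]
```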
- FIG. 15A shows the hue error image converted by the hue similarity conversion unit from the same scene image as FIG. 14.
- FIG. 15B shows a schematic diagram of the scene image of FIG. 15A.
- The mask image B′ is the result of a logical product operation of the hue binarization result h′ obtained from FIG. 15A and the luminance binarization result C2′ shown in FIG. 14.
- In this case, the mask image B′ is adopted instead of the mask image A′ derived from the degree of polarization.
- the image operation unit 1011 performs a logical product operation on the mask image B ′ and the polarization phase image ⁇ to obtain a blue sky polarization phase image ⁇ sky.
- The mask may also be switched according to the shooting date and time. For example, the period from 4 p.m. until sunset may be defined as evening; when it is not evening, the blue sky region is determined with only the minimum configuration 1012, and in the evening the entire configuration of FIG. 11A is used.
- the output selection unit 1009 uses the first blue sky region mask generated from the binarized luminance / polarization degree images C1 ′ and ⁇ ′. Based on the output ⁇ d of the polarization degree determination unit 1010, which one of A ′ and the second blue sky region mask B ′ generated from the binarized luminance / hue similarity images C2 ′ and h ′ is to be adopted. To decide.
- the blue sky polarization image processing unit 100c in FIG. 11B selects which mask is to be created before the first blue sky region mask A ′ and the second blue sky region mask B ′.
- the unit 1014 determines based on the output ⁇ d of the polarization degree determination unit 1010. For example, when ⁇ d output from the polarization degree determination unit 1010 does not exceed the threshold value, the selection unit 1014 in FIG. 11B selects creation of the second blue sky region mask B ′ instead of the first blue sky region mask A ′. To do.
- the blue sky polarization image processing unit 100c in FIG. 11B does not create the first blue sky region mask A ', but creates only the second blue sky region mask B'. Then, only the second blue sky region mask B ′ is input to the image calculation unit 1011. For this reason, it is only necessary to create the selected mask, and processing for creating a mask that is not used can be omitted.
- In the blue sky polarization image processing unit 100c of FIG. 11C as well, a selection unit 1014 determines which mask to create before the first blue sky region mask A′ and the second blue sky region mask B′ are created. Here, however, the selection is based not on the output ρd of the polarization degree determination unit 1010 but on the shooting date/time information output by the date/time information acquisition unit 1016. When the time indicates evening (for example, from 4 p.m. until sunset), the selection unit 1014 in FIG. 11C selects creation of the second blue sky region mask B′ instead of the first blue sky region mask A′. As a result, only the second blue sky region mask B′ is input to the image calculation unit 1011.
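- A minimal Python sketch of these two selection variants (the threshold value, the sunset time, and the function name are illustrative assumptions):

```python
from datetime import time

def select_mask(rho_d=None, t_rho=0.1, shot_time=None, sunset=time(18, 0)):
    """Return 'A' (polarization-based mask) or 'B' (hue/luminance mask).

    FIG. 11B variant: choose by the average polarization degree rho_d.
    FIG. 11C variant: choose by the shooting time, where 4 p.m. until
    sunset counts as evening. Threshold and sunset time are assumptions.
    """
    if shot_time is not None:                  # date/time-based selection
        return 'B' if time(16, 0) <= shot_time <= sunset else 'A'
    return 'B' if rho_d is not None and rho_d <= t_rho else 'A'
```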
- FIG. 16 shows a configuration of the camera direction estimation unit 101.
- FIG. 16A shows a configuration for executing the search mode.
- This configuration includes a sun position acquisition unit 1301, an all-sky polarization phase map acquisition unit 1302, a blue sky region direction estimation unit 1303, and an angle of view acquisition unit 1304.
- FIG. 16B shows a configuration for executing the calculation mode, and this configuration includes a sun position acquisition unit 1301, an angle of view acquisition unit 1304, and a blue sky region direction calculation unit 1305.
- In both modes, the input is the same blue sky polarization phase image φsky, and the configuration of the sun position acquisition unit 1301 is also common.
- the camera direction estimation unit 101 may include both of the configurations illustrated in FIGS. 16A and 16B, or may include only one of them. Further, which mode is used can be selected by the photographer or may be automatically determined inside the camera. Details of each mode will be described later.
- the camera direction estimation unit 101 determines in which direction the captured polarization phase pattern is in the whole sky. Therefore, first, the sky polarization state will be described.
- Sunlight arriving from the sky has the properties of an electromagnetic wave. When a propagating electromagnetic wave encounters a change in the medium, a structural change in the propagation path, or a suddenly appearing object, a secondary electromagnetic wave is radiated from the changing region. This is scattering.
- When the scatterer, i.e., the structure that generates this re-radiation, is sufficiently larger than the wavelength, the phenomenon at the scatterer surface can be treated locally as plane-wave reflection and incidence. The scattering that occurs in this case is "geometric optical scattering".
- Conversely, when the scatterer is sufficiently smaller than the wavelength, the electromagnetic field on the scatterer surface can be approximated by a static field. The scattering that occurs in this case is "Rayleigh scattering".
- the scattering characteristic of Rayleigh scattering shows the same characteristic as the directivity of a minute dipole regardless of the shape of the scatterer.
- The daytime sky is blue because short-wavelength blue light is strongly scattered and reaches our eyes.
- the evening sky is red because the blue component dissipates as the distance from the sun, which is the light source, increases, and the red color remaining as transmitted light reaches our eyes.
- When light undergoes Rayleigh scattering, it becomes polarized according to the positional relationship with the sun as the light source. This is why a sky polarization pattern is created and changes from moment to moment.
- The polarization phase characteristics depend on the positional relationship between the sun, the observation point, and the viewpoint; when the polarization of the sky is observed from the ground, the polarization component tangential to circles centered on the sun is observed strongly.
- According to the actual measurements of Non-Patent Document 3, there are three singular points with unique polarization characteristics other than the sun, and Non-Patent Document 4 succeeds in obtaining a sky polarization pattern close to reality using a theoretical model that takes these singular points into consideration.
- A cloud is a collection of droplets such as water droplets. A thin cloud is transparent, but as it thickens it turns white and the far side becomes invisible. This is because multiple scattering occurs between cloud particles. Multiple scattering is repeated scattering in which light scattered by one scatterer is incident on another scatterer and is scattered again. In particular, when many scatterers are densely distributed, the scattered light components overlap one another, and light polarized by scattering likewise overlaps and loses its polarization characteristics. Of course, the polarization does not disappear completely; some polarization may remain depending on the thickness and amount of the cloud. Therefore, in this method, the cloud region is not removed at the initial stage; only regions with a low degree of polarization are removed, and the method is applied only to regions where the sky polarization can be used.
- the sky polarization pattern depends on the position of the sun (the altitude and direction of the sun: hereinafter, sometimes simply referred to as the “sun position”). Therefore, first, it is necessary to acquire information indicating the solar position by the solar position acquisition unit 1301 (hereinafter, it may be simply referred to as “acquire the solar position”).
- The sun position varies with the date, the time, and the position (latitude and longitude) from which the whole sky is viewed; for example, the altitude and azimuth of the sun in the whole sky can be obtained by calculation using a clock, GPS, and the like.
- the solar altitude / azimuth calculation method in this case will be described.
- An angle variable θ0 is defined using the number of days elapsed since New Year's Day. Using θ0, the solar declination δ of the day is obtained. Given the latitude φ, the longitude λ, the equation of time Eq, Japan Standard Time JST, and the longitude of the standard meridian, the solar hour angle t is obtained by the corresponding procedure. From these, the solar azimuth ψs and the solar altitude hs are obtained by the corresponding equations. This calculation is based on the approximate formulas of the Nakagawa Laboratory, Rissho University.
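- As a rough illustration only, the following Python sketch computes the solar altitude and azimuth with commonly used low-precision approximations; it does not reproduce the exact Nakagawa Laboratory formulas, whose equations are given in the original as images.

```python
import math

def solar_position(day_of_year, hour_jst, lat_deg, lon_deg):
    """Approximate solar altitude and azimuth in degrees (illustrative)."""
    n = day_of_year
    # Solar declination delta (radians), Cooper's approximation.
    decl = math.radians(23.45 * math.sin(2.0 * math.pi * (284 + n) / 365.0))
    # Equation of time Eq (minutes), standard harmonic approximation.
    b = 2.0 * math.pi * (n - 81) / 364.0
    eq_t = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)
    # Solar hour angle t (radians); the JST standard meridian is 135 deg E.
    solar_time = hour_jst + (lon_deg - 135.0) / 15.0 + eq_t / 60.0
    t = math.radians(15.0 * (solar_time - 12.0))
    lat = math.radians(lat_deg)
    # Altitude hs: sin(hs) = sin(lat)sin(decl) + cos(lat)cos(decl)cos(t).
    hs = math.asin(math.sin(lat) * math.sin(decl)
                   + math.cos(lat) * math.cos(decl) * math.cos(t))
    # Azimuth, measured from south and positive toward west.
    az = math.atan2(math.cos(decl) * math.sin(t),
                    math.sin(lat) * math.cos(decl) * math.cos(t)
                    - math.cos(lat) * math.sin(decl))
    return math.degrees(hs), math.degrees(az)
```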
- This method of calculation is merely an example; there are various other ways of obtaining the sun position. For example, it can be obtained by the calculation formulas described in documents such as "Calculation of Sunrise/Sunset" by Nagasawa Kou (Jinjinshokan), or by a method using a science chronology.
- Next, the all-sky polarization phase pattern above the shooting point at the shooting time is obtained by the all-sky polarization phase map acquisition unit 1302. For this purpose, data recording the polarization phase pattern at each solar altitude/azimuth by actual observation can be created and accumulated as a database. If an all-sky polarization phase map corresponding to the solar altitude and azimuth at the time of shooting is retrieved from such a database, the all-sky polarization pattern is obtained. If no entry exactly matches the solar altitude and azimuth at the time of shooting, it is sufficient to interpolate using several close entries.
- Alternatively, the all-sky polarization phase pattern can be obtained by calculation based on theoretical models such as those of Non-Patent Documents 1 and 2, or the model of Non-Patent Document 4.
- FIG. 17 shows a schematic diagram of the entire sky polarization phase pattern obtained by calculation based on the above-described theoretical model, and a conceptual diagram of matching with the blue sky polarization phase pattern.
- the large circle in the center of FIG. 17A is an example of an all-sky polarization diagram around 9 am.
- the phase pattern is drawn with a dotted line for easy understanding, but in reality, each point on the map has a phase, and this figure visualizes a part of the phase plane.
- The all-sky polarization pattern is a concentric pattern centered on the sun 1401; more precisely, it is known to be influenced by singular points such as the Babinet point 1402 and the Arago point 1403. The point 1404 just below the zenith is the camera position. A line extending perpendicularly from the zenith to the horizon, that is, a perpendicular to the horizon (Local Meridian), is defined. The dotted lines indicate the inclination of the polarization phase with respect to the Local Meridian to which the light from each point of the whole-sky map, viewed from the camera position, belongs. How to read the phase diagram is described in detail below.
- The direction of the phase is defined so that the direction along the Local Meridian is 0° and the clockwise direction is positive, as shown in the inset indicating the polarization phase φpix at the position 1405.
- the respective polarization phase directions are indicated by dotted arrows.
- The magnitude of this phase is determined using, as a reference line, the Local Meridian passing through the point of the sky ahead of the camera's line of sight; the angle at which the phase lines on the map intersect this reference line gives the direction (phase angle) of the polarization phase.
- The phase angle at the position 1405 near the horizon is about −20° with respect to the Local Meridian, while the phase angle at the position 1406 on the solar path is 90° with respect to the Local Meridian; this is known to hold at all points along the solar path.
- the phase angle at position 1407 is about 40 ° with respect to Local Meridian.
- the phase angle at the position 1408 near the sun is approximately 0 ° with respect to Local Meridian.
- FIGS. 17B and 17C are schematic diagrams 1409 and 1410 of the polarization phase images to be photographed.
- the arrows in the drawing only schematically show the direction of the polarization phase for explanation, and do not actually appear on the photograph.
- each pixel has a value indicating the polarization phase. From the arrangement of the polarization phases of a plurality of pixels, it can be seen which part of the sky was photographed.
- a blue sky polarization phase image 1409 shown in FIG. 17B and a blue sky polarization phase image 1410 shown in FIG. 17C are obtained at a certain time. They appear to be the same scene but have different sky polarization phase patterns.
- In the blue sky polarization phase image 1409, the polarization of the sky region as a whole has its polarization axis in the 11 o'clock direction, whereas in the blue sky polarization phase image 1410 the polarization of the sky region as a whole has its polarization axis in the 2 o'clock direction. Matching these against the all-sky map shows that the blue sky polarization phase image 1409 was taken facing north, where the polarization phase at position 1405 appears, and that the blue sky polarization phase image 1410 was taken facing south, where the polarization phase at position 1407 appears. In this way, even two images that look identical as scene images can be identified as scenes with different line-of-sight directions from the differing polarization phases of the sky regions they contain.
- In FIG. 17A, the polarization phase pattern is shown on two-dimensional coordinates for simplicity, but the search over the actual phase pattern is desirably carried out on three-dimensional earth coordinates.
- FIG. 18 is a conceptual diagram showing the relationship between the sun and the camera orientation in the earth coordinate system.
- The azimuth is denoted θ1 and the elevation angle θ2. The x-axis positive direction is north and the negative direction south; the y-axis positive direction is west and the negative direction east.
- the coordinates do not necessarily have to be taken in this way, and any coordinates may be used so that the correspondence between the direction and the celestial sphere can be understood.
- The azimuth of the camera is an angle measured from 0 degrees at east, rotating through north.
- the elevation angle of the camera is an angle viewed upward from the horizon.
- In FIG. 18A, the sun 1501 is at the coordinates Ps.
- a point 1502 on the celestial sphere 1505 corresponding to a certain point in the blue sky polarization phase image is set at the coordinate Pv.
- Reference numeral 1503 denotes the polarization phase φpix at Pv.
- the position of the camera 1504 is at the coordinates Pc (0, 0, 0).
- The zenith angle of Ps is φ2, and the zenith angle of Pv is ψ2.
- FIG. 18B is a view of the xy plane of FIG. 18A as viewed from above the z axis.
- The azimuth angle of Ps is φ1, and the azimuth angle of Pv is ψ1. The points located at the coordinates Ps and Pv can then be expressed as follows using φ1, φ2, ψ1, and ψ2.
- Next, the pixel position in the photographed image is converted into an angle from the camera center in order to establish the correspondence between pixel positions in the image and positions on the celestial sphere.
- FIG. 19 shows a schematic diagram of the positional relationship between the camera and the captured image. Components common to those in FIG. 18 are given the same reference numerals.
- Let the pixel position 1601 to be processed be Pg(pgx, pgy). The image center 1602 corresponds to Pv 1502 in FIG. 18, and its coordinates are Pgc(cx, cy).
- The x-direction field angle of the camera is ωpx and the y-direction field angle is ωpy. This angle of view is acquired by the angle of view acquisition unit 1304 in FIG. 16. Since the angle of view is determined by the focal length of the lens and the sensor chip size, it may be stored in advance as data in the camera's internal memory; here it is acquired and used to determine the range of the blue sky region.
- The angles between the straight line connecting the camera 1504 and Pg 1601 and the straight line connecting the camera 1504 and Pgc 1602 are ωpx′ in the ωx direction and ωpy′ in the ωy direction. Since the image has already been corrected for horizontality, ωpy′ contributes only to the camera elevation angle θ2. These angles ωpx′ and ωpy′ can be expressed by the following equations, and the celestial sphere position Pgv corresponding to Pg is then expressed by the following equation. Since only the polarization phase needs to be known, the radius of the celestial sphere may hereafter be set to 1. From the above, the position on the celestial sphere of the polarization phase corresponding to each pixel is obtained.
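- Since the equations above are given in the original as images, the following Python sketch substitutes a standard pinhole model for the pixel-to-angle conversion; the axis convention and the additive treatment of the azimuth offset are simplifying assumptions.

```python
import numpy as np

def pixel_to_celestial(pg, pgc, img_size, field_angles, theta1, theta2):
    """Map a pixel Pg to a direction Pgv on the unit celestial sphere.

    pg, pgc:      pixel position (pgx, pgy) and image center (cx, cy).
    img_size:     (width, height) in pixels.
    field_angles: (w_px, w_py), full field angles in radians.
    theta1/2:     camera azimuth and elevation (radians).
    """
    (pgx, pgy), (cx, cy) = pg, pgc
    w, h = img_size
    wpx, wpy = field_angles
    # Angular offsets of the pixel from the optical axis (pinhole model).
    wpx_d = np.arctan(2.0 * (pgx - cx) / w * np.tan(wpx / 2.0))
    wpy_d = np.arctan(2.0 * (pgy - cy) / h * np.tan(wpy / 2.0))
    # The image is roll-corrected, so wpy_d shifts only the elevation;
    # adding wpx_d to the azimuth is a small-angle approximation.
    azim, elev = theta1 + wpx_d, theta2 + wpy_d
    # Unit-radius celestial sphere (x: north, y: west, z: zenith).
    return np.array([np.cos(elev) * np.cos(azim),
                     np.cos(elev) * np.sin(azim),
                     np.sin(elev)])
```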
- FIG. 20 shows an overhead view of the camera 1504, the image area 2000, and the celestial sphere 1505.
- A line drawn from the camera 1504 through the image center 1602 reaches the point 1502 on the celestial sphere 1505.
- Components common to those in FIGS. 18 and 19 are denoted by the same reference numerals.
- The range spanned by the view angles 1603 and 1604 equals the shooting range.
- For camera orientation estimation, a conventional pattern matching method may be applied. Since the polarization phase has one cycle from 0° to 180°, phases from 180° to 360° should have 180 subtracted so that all values fall within the range of 0° to 180°. Here, matching based on the SSD (Sum of Squared Differences) is used.
- When the camera orientation is virtually fixed, the position of the above-mentioned camera center pixel Pgc on the all-sky polarization map is determined.
- a difference between the polarization phase at each pixel position of the blue sky polarization image and the polarization phase at the position corresponding to each pixel position on the all-sky polarization map is obtained, and a square error is calculated.
- the square error is calculated while changing the virtually determined camera direction, and the camera direction that minimizes the error is determined.
- Let φpgv be the polarization phase at the point Pgv 1607 on the all-sky polarization pattern, φpg the polarization phase at the point Pg 1601 in the actual image, and Err the squared error. The squared error Err is then expressed as Err = Σ(φpg − φpgv)², where the sum runs over the pixels of the blue sky region.
- The point Pv 1502 on the all-sky polarization pattern corresponding to the image center Pgc 1606 is moved so that the squared error Err is minimized.
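- A brute-force Python sketch of this search-mode matching; the 180° wraparound handling follows the note above, while the azimuth grid and the map-lookup helper are assumptions.

```python
import numpy as np

def phase_ssd(phi_img, phi_pred):
    """SSD between measured and predicted polarization phases (degrees)."""
    d = np.abs(phi_img - phi_pred) % 180.0
    d = np.minimum(d, 180.0 - d)        # phase has a 180-degree period
    return float(np.sum(d ** 2))

def search_camera_azimuth(phi_sky, predict_phases, az_step=1.0):
    """Scan candidate camera azimuths and keep the one minimizing Err.

    phi_sky:        phases of the blue sky pixels (1-D array, degrees).
    predict_phases: assumed helper, azimuth_deg -> predicted phase per
                    pixel looked up on the all-sky polarization map.
    """
    best_az, best_err = None, np.inf
    for az in np.arange(0.0, 360.0, az_step):
        err = phase_ssd(phi_sky, predict_phases(az))
        if err < best_err:
            best_az, best_err = az, err
    return best_az, best_err
```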
- In the calculation mode, the sun position is acquired by the sun position acquisition unit 1301, a theoretical all-sky polarization pattern is obtained from a mathematical expression, and the azimuth angle of the camera is then obtained by calculation using the obtained pattern and the sun position.
- For example, when the expression of Non-Patent Document 1 is used, the following expression (Equation 22) holds, where φpg is the polarization phase at a certain pixel in the blue sky polarization image. The camera angles θ1 and θ2 can then be calculated from Equation 22 using the polarization phase φpg and the celestial angles corresponding to each pixel in the blue sky polarization phase image.
- In principle, Equation 22 can be solved from as few as three points. However, noise exists in the polarization phase image, so it is preferable to use as many points as possible; for example, a method of applying dynamic programming repeatedly may be used.
- The mathematical formula used in the calculation mode is not limited to Equation 22; the camera orientation can be acquired in the same manner using other sky polarization equations.
- The output unit 102 in FIG. 1F outputs the azimuth and elevation angle of the camera acquired by the above method as data in the format required by the subsequent stage; that is, information indicating θ1 and θ2 in FIG. 19 is output in a form suited to the use case.
- FIG. 21 is a block diagram showing a configuration of a camera orientation detection apparatus according to the second embodiment of the present invention.
- the same components as those shown in FIG. 1F are denoted by the same reference numerals, and detailed description thereof is omitted here.
- The first difference between the first embodiment and the present embodiment is that the apparatus of the first embodiment includes the "blue sky polarization phase image acquisition unit 100" (FIG. 1F), whereas the apparatus of the present embodiment includes a "blue sky polarization image acquisition unit 1700" (FIG. 21).
- Here, "blue sky polarization image" means both a blue sky polarization phase image and a blue sky polarization degree image; that is, in this embodiment, not only a blue sky polarization phase image but also a blue sky polarization degree image is acquired.
- The second difference is that the apparatus of the present embodiment includes a "camera direction estimation unit 1701" that executes processing different from that of the "camera direction estimation unit 101" of the first embodiment.
- FIG. 22A shows a configuration diagram of the blue sky polarization image processing unit 100c in the blue sky polarization image acquisition unit 1700. Components that are the same as those shown in FIG. 11A are given the same reference numerals, and detailed descriptions thereof are omitted.
- the output selection unit 1009 employs either the first mask image A ′ or the second mask image B ′.
- The image calculation unit 1801 computes the AND of the adopted mask image (mask image C′) and the polarization degree image ρ to obtain the blue sky polarization degree image ρsky, which is output together with the blue sky polarization phase image φsky.
- The image of FIG. 23A is the polarization phase image φ of the scene, and the image of FIG. 23B is the polarization degree image ρ of the scene.
- the sky region 1101, the building region 1102, the cloud region 1103, the ground region 1104, and the camera stand 1105 are included in the image. These symbols are the same as those assigned to the corresponding regions in FIG.
- The image in FIG. 23D is the mask A′ generated by the minimum configuration 1012. The result of its AND operation with the polarization phase image φ by the image calculation unit 1011 is the blue sky polarization phase image φsky of FIG. 23E, and the result of its AND operation with the polarization degree image ρ by the image calculation unit 1801 is the blue sky polarization degree image ρsky of FIG. 23F. Both exhibit the characteristic pattern of the sky.
- Like the phase pattern, the sky polarization degree pattern can be obtained by calculation from the sun position and can be used for estimating the camera orientation.
- The blue sky polarization image processing unit 100c in FIG. 22A can be modified as shown in FIGS. 22B and 22C. In the configuration example of FIG. 22B, the selection unit 1014 determines which mask should be created, before the first blue sky region mask A′ and the second blue sky region mask B′ are created, based on the output ρd of the polarization degree determination unit 1010. In the configuration example of FIG. 22C, the selection unit 1014 makes this determination based on the shooting date/time information.
- FIGS. 24 and 27 show configurations of the camera direction estimation unit 1701: FIG. 24 shows the configuration for the search mode, and FIG. 27 shows the configuration for the calculation mode.
- the same reference numerals are given to components common to the components in FIG. 16, and detailed description thereof will be omitted here.
- The difference from the corresponding configuration of the first embodiment is that, in addition to the blue sky polarization phase image φsky, the blue sky polarization degree image ρsky is available as an input.
- the yaw level acquisition unit 1901 acquires the angle in the yaw direction of the camera, that is, the elevation angle.
- the all-sky polarization map candidate area acquisition unit 1903 cuts out only the part that becomes the candidate area from the all-sky polarization map area corresponding to the acquired blue sky polarization image based on the angle of view and the elevation angle of the camera.
- the blue sky region direction estimation unit 1303 functions in the same manner as the blue sky region direction estimation unit 1303 in the first embodiment.
- the reliability determination unit 1904 determines the reliability of the acquired camera orientation. Details of each part will be described below.
- the yaw level acquisition unit 1901 acquires the elevation angle of the camera.
- As a result, the search area of the all-sky polarization map in the subsequent stage is limited. For example, a level of the same type as that of the roll level correction unit 100b of the blue sky polarization image acquisition unit 1700 may be installed in the corresponding plane so that the elevation angle can be acquired; any level may be used as long as it can be mounted inside the camera, as described in the patent literature, for example. The obtained elevation angle corresponds to θ2.
- Since the elevation angle is thus fixed, the blue sky region direction estimation unit 1303 can obtain the orientation of the camera easily by varying only the azimuth angle θ1 of the camera.
- Figure 25 shows a conceptual diagram. Similarly to FIG. 17, the sun 1401 and the camera position 1404 are shown in a diagram showing the polarization phase of the sky. In the schematic diagram 1409 of the captured polarization image, the sky polarization phase is schematically shown by arrows. Note that the procedure for limiting the search area is the same for the all-sky polarization degree map, and thus the illustration is omitted.
- The search range within the all-sky polarization map is narrowed to the candidate region by the all-sky polarization map candidate region acquisition unit 1903. Within this region 2001, a search with the elevation angle and the angle of view fixed and only the camera azimuth as a variable may then be performed in the same manner as described in the first embodiment, so a high-speed search can be expected. Of course, matching may also be performed while varying the field angle and the elevation angle within suitable ranges around the obtained values.
- Next, referring to FIG. 26A, a mode for determining the reliability based on the solar altitude will be described.
- The configuration of FIG. 26A is characterized in that a sun position acquisition unit 1301 and a solar altitude determination unit 1902 are provided. The solar altitude determination unit 1902 determines "no reliability" if the solar altitude obtained from the sun position acquisition unit 1301 is equal to or higher than a predetermined altitude, and performs processing such as stopping the processing and displaying an error. This is because, as the sun approaches the zenith, the polarization maps in the east, west, south, and north directions become almost identical, and the reliability of the determination decreases.
- FIG. 26B shows the polarization phase pattern of the sky when the sun 2101 is located at the zenith.
- In this case, the polarization phase relative to the Local Meridian is 90° in every direction, so the camera orientation cannot be obtained. Accordingly, camera orientation acquisition is stopped, an error is displayed to the user, and the process ends; for example, determination may be deemed impossible when the solar zenith angle is within 5 degrees.
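- A minimal sketch of this zenith-proximity check; the 5-degree margin follows the example above, and the altitude convention is an assumption.

```python
def orientation_reliable(sun_altitude_deg, zenith_margin_deg=5.0):
    """Return False when the sun is too close to the zenith.

    Near the zenith the all-sky polarization pattern becomes nearly
    identical in every azimuth, so orientation estimation is unreliable.
    """
    solar_zenith = 90.0 - sun_altitude_deg
    return solar_zenith > zenith_margin_deg

# Usage: stop processing and show an error when unreliable.
if not orientation_reliable(sun_altitude_deg=87.0):
    print("Error: camera orientation cannot be determined (sun near zenith).")
```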
- The apparatus may include both the yaw direction horizontality acquisition unit 1901 and the all-sky polarization map candidate region acquisition unit 1903 shown in FIG. 24, and the sun position acquisition unit 1301 and solar altitude determination unit 1902 shown in FIG. 26A.
- the reliability determination unit 1904 determines the reliability of the estimation result and presents it to the user.
- As for the reliability: since information on the solar altitude is available, the reliability can be evaluated from it, but information other than the solar altitude can also be used. For example, when a plurality of candidate regions are acquired, the reliability may be lowered if the number of candidate regions is large; alternatively, the reliability may be lowered when a region having a low degree of polarization is selected.
- When the reliability is low, the user needs to take some measure in response. For example, when a plurality of camera orientations are presented, one option is to select from among them the orientation closest to the actual shooting direction, shooting position, and sun position. Alternatively, the shooting direction may be changed using the camera's recommendation function described later, or, if the shooting situation permits, shooting may be postponed until a time suggested on the camera's display unit.
- Next, the calculation mode will be described. A difference from FIG. 16B is that a partial blue sky region direction calculation unit 1304 is provided, and the determination results for the yaw-direction horizontality and the solar altitude are added as information inputs to it. These are used as constraint conditions, and the blue sky region direction is obtained by calculation in the same manner as in the first embodiment. For example, if the output of the solar altitude determination unit 1902 is that calculation is impossible because the altitude is too high, the subsequent calculation is stopped, the reliability determination unit 1904 determines "no reliability", and the user is similarly notified on the display or the like that determination is impossible.
- the reliability determination unit 1904 finally determines the reliability of the estimation result and notifies the user. When the reliability is low, the user is required to take some means as described above.
- FIG. 28 is a diagram showing a configuration of a camera orientation detection apparatus according to the third embodiment of the present invention. 28, the same reference numerals are given to the same components as those in FIG. 21, and the detailed description thereof is omitted here.
- the characteristic point of this embodiment is the configuration and function of the output unit 2201.
- The output unit 2201 includes a part that calculates the sun direction in camera coordinates, and creates and outputs data including information indicating the sun direction and information indicating the camera direction in a prescribed format. Hereinafter, the output unit 2201 will be described.
- the configuration of the output unit 2201 is shown in FIG. Based on the azimuth and elevation angle of the camera, the coordinate conversion unit 2301 calculates the sun position in the camera coordinates. Then, the image format creation unit 2302 creates an image Im having a format including the camera direction and the sun direction. Hereinafter, the flow will be described.
- Suppose the camera orientation CamVect(xcm, ycm, zcm) has been obtained on the celestial sphere, and the sun is at Ps(xs, ys, zs) on the celestial sphere. To express the sun position in camera coordinates, the camera optical axis direction must first be made to coincide with the z-axis; for this purpose, a 3×3 rotation matrix R1 is considered. Next, to return to the actual camera attitude, the roll-direction horizontality acquired by the blue sky polarization image acquisition unit is used; this can be done by preparing a rotation matrix R2 that inverts the rotation of Equation 10. Using R1 and R2, the sun position SUNcm(xscm, yscm, zscm) in camera coordinates is obtained as follows.
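- An illustrative Python sketch of this coordinate conversion; building R1 with Rodrigues' formula and treating R2 as a pure roll rotation about the optical axis are assumptions, not the patent's exact construction.

```python
import numpy as np

def rotation_aligning(a, b):
    """Rodrigues rotation R with R @ a = b for unit vectors a and b
    (degenerate when a is opposite to b; not handled in this sketch)."""
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    v, c = np.cross(a, b), float(np.dot(a, b))
    k = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + k + k @ k / (1.0 + c)

def sun_in_camera_coords(cam_vect, ps, roll_rad):
    """Express the sun direction Ps in camera coordinates.

    R1 aligns the camera optical axis CamVect with the z-axis, and R2
    undoes the roll correction (the inverse of the Equation 10 rotation)
    so the result matches the actual camera attitude.
    """
    r1 = rotation_aligning(np.asarray(cam_vect, float),
                           np.array([0.0, 0.0, 1.0]))
    cr, sr = np.cos(-roll_rad), np.sin(-roll_rad)
    r2 = np.array([[cr, -sr, 0.0], [sr, cr, 0.0], [0.0, 0.0, 1.0]])
    return r2 @ r1 @ np.asarray(ps, float)      # SUNcm = R2 R1 Ps
```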
- The present invention can be used not only for processing inside the camera but also for post-processing of captured images by a computer outside the camera. A dedicated format for exporting the information on the camera direction and the sun direction is therefore required.
- The image format 2401 simultaneously holds, for the captured image Im:
- shooting date/time and latitude/longitude data 2402
- camera orientation (azimuth/elevation angle) information on celestial coordinates 2403
- sun direction information 2404
- sun position information on camera coordinates 2405
- blue sky region division result data 2406
- blue sky region polarization phase image 2407
- digital camera image data 2408
- Other common image formats may likewise hold this information at the same time. The minimum required information differs depending on the application, but it includes at least the longitude and latitude of the shooting location and the camera orientation (azimuth). An image in this format is output from the image input device.
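- A hypothetical rendering of this format as a Python data structure; all field names are illustrative, not part of the specification.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PolarizationImageFormat:
    """Illustrative container mirroring image format 2401."""
    datetime_jst: str                    # 2402: shooting date/time
    latitude: float                      # 2402: shooting latitude (deg)
    longitude: float                     # 2402: shooting longitude (deg)
    camera_azimuth: float                # 2403: azimuth on celestial coords
    camera_elevation: float              # 2403: elevation angle
    sun_direction: tuple = None          # 2404: derivable, hence optional
    sun_camera_coords: tuple = None      # 2405: sun position, camera coords
    sky_mask: np.ndarray = None          # 2406: blue sky division result
    sky_phase_image: np.ndarray = None   # 2407: blue sky polarization phase
    image_data: np.ndarray = None        # 2408: digital camera image data
```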
- Since the camera orientation 2403 is known, it is possible to tell, against a world map, whether, for example, the Arc de Triomphe or the Champs-Élysées was photographed. This can be used to classify images captured by an individual and stored on a PC, or to classify shooting targets in image collections on the web.
- Furthermore, an approximate camera viewpoint direction can be recognized. If the camera optical axis direction is known, it becomes very effective information when, for example, compositing images classified by subject on an individual's PC or on the web, and it can provide useful information to the CG/CV field.
- Since the sun direction coordinates 2404 have been derived, the positional relationship of the sun with respect to the camera is known. For example, if the camera is facing the sun, backlighting is predicted, so backlight correction can be performed or recommended to the user. In this case, the display unit can also make framing proposals that naturally guide the user to point the camera in the recommended direction, for example by bringing the image into focus as the camera approaches that direction and out of focus as it moves away. Note that the sun position can be calculated on a PC from the date/time and latitude/longitude data 2402 and the camera azimuth/elevation angle data 2403 and 2405, so the sun direction 2404 does not necessarily have to be included in the format.
- Since the format holds the blue sky region division result data 2406, it can be used to convert the color of the blue sky region of a daytime photograph into an evening sky color, or to present such a conversion as a candidate. Other conversions, such as slightly darkening the parts other than the blue sky region, also become possible, allowing conversion into a more natural scene image; this is useful information for image processing software.
- Since the image input device (camera, etc.) provided with the camera orientation detection device of the present invention uses only the polarization pattern of the portion of the sky captured in the image, no special lens is required to acquire the polarization pattern of the entire sky.
- As described above, the camera orientation detection method of this embodiment obtains a polarization image and a color image with the camera in step S2500 and, based on the polarization image and the color image, generates a blue sky polarization phase image indicating the polarization phase of the blue sky region included in the color image, estimates the camera orientation from it, and outputs the result. The camera orientation detection method of the present invention that executes these steps is not limited to an apparatus having the above-described configuration, and can also be implemented when applied to an apparatus having another configuration.
- The above embodiments relate to an imaging device including an "imaging device orientation detection device", whereas the present embodiment relates to a moving body (typically, an automobile) including such a device. That is, the moving body in the present embodiment includes an imaging device having an imaging unit that acquires a polarization image including a polarization phase image and a luminance image, and the imaging device orientation detection device described above. The moving body further includes a moving body orientation estimation unit that determines the orientation of the moving body from the detected orientation of the imaging device in accordance with the relationship between the orientation of the moving body and the orientation of the imaging device.
- In a conventional car navigation system, the position of the vehicle is determined by GPS, and on the assumption that the vehicle moves forward or backward, its orientation (azimuth) is found from the position change as the vehicle moves. However, the GPS position data does not change not only when the vehicle stops but also when the vehicle changes direction at approximately the same position, such as at an intersection; in such cases the azimuth of the moving body is estimated from past azimuth data and information such as the number of tire rotations during traveling. Consequently, depending on the stop duration, the road surface condition, and the driving situation before the stop, the current actual orientation may not be indicated correctly. For example, when the car stops after spinning, the amount of rotation of the car cannot be calculated from the tire rotation count, so it is impossible to estimate which direction the car has come to rest facing.
- FIG. 32A is a diagram illustrating a configuration of a moving body orientation detection device included in the moving body according to the present embodiment.
- In addition to the configuration described above, the moving body orientation detection device includes a moving body orientation estimation unit 2600 that calculates the moving body orientation and a moving body orientation output unit 2601 that outputs it. A database 260 that provides information defining the relationship between the camera orientation and the moving body orientation is connected to the moving body orientation estimation unit 2600. The database 260 may be included in the moving body, or a database 260 installed outside the moving body may be connected as appropriate by wire or wirelessly.
- FIG. 33 is a flowchart showing the operation of the moving body orientation detection device provided in the moving body in the present embodiment.
- the camera orientation estimation unit 101 executes camera orientation estimation step S2702 to estimate the orientation of the imaging device (imaging device orientation).
- the moving body direction estimation unit 2600 executes the moving body direction estimation step S2703 to estimate the direction of the moving body.
- The relationship between the orientation of the imaging device and the orientation of the moving body will now be described. As will be explained later with reference to FIG. 35, this relationship changes depending on the mounting position of the camera (imaging unit), so the orientation of the imaging device does not necessarily match that of the moving body. It is therefore necessary to determine the orientation of the moving body from the orientation of the imaging device in accordance with the mounting position of the imaging unit on the moving body (simply called the "camera position").
- data (table) having a structure as shown in FIG. 32B is stored in the database 260.
- Using such a table, the coordinates of the moving body orientation can be calculated; for example, when the imaging device is installed at the rear of an automobile, the moving body orientation is obtained by rotating the camera orientation by 180 degrees, as sketched below. Strictly speaking, the relationship between the camera orientation and the moving body orientation is defined not only by the camera position but also by the relationship between the camera line-of-sight direction and the moving body direction, so the data stored in the database 260 preferably indicate this more precise relationship.
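- A Python sketch of the lookup-and-rotate step; the table entries are the illustrative cases from the text, and a real database would encode the exact mounting relationship.

```python
# Hypothetical mounting-offset table in the spirit of FIG. 32B:
# camera position on the vehicle -> azimuth offset to the vehicle heading.
MOUNT_OFFSET_DEG = {
    "front": 0.0,    # camera looks where the vehicle points
    "rear": 180.0,   # rear-mounted camera: rotate by 180 degrees
    "left": 90.0,
    "right": -90.0,
}

def vehicle_heading(camera_azimuth_deg, mount_position):
    """Derive the moving-body heading from the detected camera azimuth."""
    return (camera_azimuth_deg + MOUNT_OFFSET_DEG[mount_position]) % 360.0
```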
- the moving body direction output unit 2601 executes the moving body direction output step S2704 to perform processing so that the moving body direction information can be presented to the user with a display, voice, or the like.
- a situation is shown in which a car 2801 enters the intersection 2800 and stops.
- With GPS or the like, the orientation cannot be obtained until the vehicle travels a certain distance. If the selected road then turns out to differ from the intended one, the driver must return to the intersection and take the correct road, which is troublesome.
- According to the present invention, the orientation of the vehicle can be presented to the user without traveling any distance, simply by acquiring the blue sky polarization image outside the vehicle shown in the image region 2802.
- the presentation method may use a display 2803 or an audio warning 2804.
- Displaying the orientation of the car on the map in the form of an arrow 2805 is effective because the user can easily grasp the orientation of the own vehicle.
- the scene image / scene polarization image acquisition unit 100a, the blue sky polarization image processing unit 100c, and the camera orientation estimation unit 101 illustrated in FIG. 32A perform the same operations as the processing units with the same numbers illustrated in FIG. Description is omitted.
- the roll level correction unit 100b may be operated in the same manner as in FIG. 1F, but may be another operation described below.
- While driving, the imaging unit is fixed on or in the car. Therefore, if the roll horizontality with respect to the ground is stored once at installation time, correction can thereafter be performed using the stored value. Since the horizon need not be extracted every time, processing can be performed at higher speed.
- FIG. 35 shows a typical installation example of a polarization imaging device (imaging device) 2900 on a moving body.
- FIG. 35 shows mobile bodies (automobiles) 2901 to 2904 according to this embodiment.
- In the moving body 2901, the polarization imaging element 2905 is installed at a position on the front hood or on the rear hood; in this way, a blue sky polarization phase image can be acquired from a relatively high position without interfering with driving. Alternatively, like the polarization imaging element 2906 or 2907, the element may be installed at a position lower than the hood, for example near the lower part of the vehicle body, which reduces the influence on the appearance; it may also be installed obliquely, like the polarization imaging element 2907. Further, by installing the element at the highest possible position, like the polarization imaging element 2908, a blue sky polarization image can be acquired stably from a higher position.
- The polarization imaging elements 2905 to 2908 described above are collectively referred to as the polarization imaging element 2900, since they are the same element differing only in installation location.
- The user may also determine the installation position of the polarization imaging element 2900 at any time before boarding. For example, if the imaging device and the device including the blue sky polarization image processing unit can be connected by a cable or the like, as shown by the moving body 2904, the user can select the position of the imaging unit from a wide range, which is convenient because it suits each user's circumstances. The imaging unit may be installed anywhere that allows a blue sky polarization phase image outside the vehicle to be acquired and that is not directed straight above the vehicle body (where the possibility of including the sun in the image is very high). Positions other than the examples shown here may therefore be used as long as these installation conditions are satisfied.
- When the reliability is low, a message may be issued from the apparatus of this embodiment to the moving body 3000. For example, as shown on the display 3001, "cannot be used because of low reliability" is displayed, or the user is informed by an audio warning 3002 that the apparatus of the present invention cannot be used. In this way, the possibility of the user receiving wrong information can be reduced.
- A navigation system of a portable device obtains the position of a person by GPS and, assuming that the person moves forward or backward, finds the person's orientation from the position change during movement. In contrast, the orientation detection device in the present embodiment can estimate the orientation of the portable device held by the person even when the person is not walking.
- FIG. 37 is a diagram illustrating a configuration example of the portable device orientation detection device according to the present embodiment.
- The configuration of FIG. 37 differs from those of the other embodiments in that it includes a portable device orientation estimation unit 3100 that determines the portable device orientation and a portable device orientation output unit 3101 that outputs it.
- FIG. 38 is a flowchart illustrating the operation of the portable device orientation detection device according to the present embodiment.
- the image processing step S3201 is executed in the blue sky polarized image processing unit 100c. In this way, a blue sky polarization image is acquired.
- the camera orientation estimation unit 101 executes the camera orientation estimation step S3202 to estimate the orientation of the image sensor.
- the mobile device orientation estimation unit 3100 executes the mobile device orientation estimation step S3203 to estimate the orientation of the mobile device.
- the mobile device orientation can be determined from the camera orientation based on the relationship between the camera orientation and the mobile device orientation.
- the portable device orientation output step S 3204 performs processing so that the orientation information of the portable device can be presented to the user with a display, voice, or the like.
- FIG. 39 illustrates an example of usage status.
- a situation is shown in which a person 3301 has reached the branch 3300 and stopped.
- With GPS or the like, the orientation cannot be obtained unless the user moves a certain distance, whereas here the orientation of the portable device can be presented to the user simply by acquiring the blue sky polarization image shown in the image region 3302 of FIG. 39.
- the presentation method may use a display 3303 or an audio warning 3304.
- Further, the orientation of the portable device can be displayed on the map with an arrow 3305 or the like, so that the user can easily grasp his or her own orientation.
- the scene image / scene polarization image acquisition unit 100a, the blue sky polarization image processing unit 100c, and the camera orientation estimation unit 101 illustrated in FIG. 37 perform the same operations as the processing units of the same number illustrated in FIG. Description is omitted.
- The roll horizontality of the image sensor with respect to the ground is an indicator of the gripping state of the portable device. Since both the image sensor and the display are fixed to the portable device, and a user looking at the display generally stands or sits upright with respect to the ground, the horizontality of the image sensor relative to the ground equals the horizontality of the display relative to the user. Therefore, if the roll horizontality is input not only to the blue sky polarization processing but also to the output unit, and the display is corrected according to it, the user can recognize the presented orientation more easily.
- FIG. 40 shows a representative example of installation examples of the polarization imaging device 3400 on a portable device.
- FIG. 40 shows mobile devices (mobile phones) 3401 to 3403 according to this embodiment.
- If a camera-equipped mobile phone such as the portable device 3401 uses, as the polarization imaging element 3400, an element capable of capturing color images and polarization images at the same time, a blue sky polarization phase image can be acquired without inconveniencing the user.
- The device of the present embodiment may also output an image converted to show the polarization phase. Furthermore, an image extracting only the sky may be created, or the image may be used for conversions such as replacing only the sky region with another texture.
- The polarization imaging element 3404 may be installed near the highest point of the device, so that a blue sky polarization image can be acquired stably from a higher position.
- The user may also be able to determine the installation position of the element at any time. As with the portable device 3403, the user may install the element anywhere it can acquire a blue sky polarization phase image and is not directed straight upward (where the possibility of including the sun in the image is very high); this is convenient because each user can easily adjust the acquisition position of the blue sky region. The present embodiment encompasses any installation that satisfies these installation conditions.
- When the reliability is low, a message may be issued from the apparatus of the present invention to the portable device 3500. For example, the message "cannot be used due to low reliability" is displayed, or an audio warning 3502 informs the user that the apparatus of the present invention cannot be used. In this way, the possibility of the user receiving wrong information can be reduced.
- FIG. 42 is a diagram illustrating a configuration of an imaging device orientation detection device according to the sixth embodiment of the present invention. 42 differs from the configuration in the other embodiments in that it includes a scene image / scene polarization image acquisition unit 3600a that acquires a scene image and a blue sky polarization image processing unit 3600c that determines the polarization state of the blue sky region. It is in. Since the configuration other than this is the same as the configuration of the first embodiment, the description thereof is omitted here.
- The image acquisition unit 3600 of FIG. 42 includes the scene image/scene polarization image acquisition unit 3600a and a level for measuring the tilt in the roll direction; the roll level correction unit 100b, the blue sky polarization image processing unit 3600c, the camera orientation estimation unit 101, and the output unit 102 may be provided outside the camera.
- FIG. 43 shows the configuration of the scene image / scene polarized image acquisition unit 3600a.
- Since the camera includes an imaging unit that functions as the scene image/scene polarization image acquisition unit 3600a, the content of the captured scene image/scene polarization image varies depending on the orientation of the camera.
- the series of processing for estimating the camera direction is preferably executed inside the camera, but it is not necessarily executed inside the camera.
- the scene image and the scene polarization image are preferably acquired at the same time, but may be acquired at intervals up to several seconds.
- The scene image/scene polarization image acquisition unit 3600a in FIG. 43 acquires luminance image information of the subject in real time, simultaneously acquires polarization image information, and outputs two types of polarization image information (the polarization degree image ρ and the polarization phase image φ).
- the polarization acquisition unit 3701 can acquire both luminance moving image information and polarization image information in real time. Signals indicating luminance moving image information and polarization information image information are output from the polarization acquisition unit 3701 and provided to the luminance information processing unit 3702 and the polarization information processing unit 3703, respectively.
- The luminance information processing unit 3702 and the polarization information processing unit 3703 perform various processes on these signals and output a luminance image C, a polarization degree image ρ, and a polarization phase image φ.
- the polarization acquisition unit 3701 acquires a monochrome image and a polarization image at the same time.
- the technique disclosed in Patent Document 3 can be used.
- a patterned polarizer having a plurality of different polarization main axes (transmission axes) is spatially arranged on an image sensor for the purpose of simultaneously obtaining a luminance image and a partially polarized image of a subject.
- a photonic crystal or a structural birefringent wave plate array is used as the patterned polarizer.
- FIG. 44 shows an example of such a polarization luminance imaging device.
- a narrow band color filter 3800 and a patterned polarizer 3801 are placed on the front surface of the image sensor pixel 3802.
- Incident light passes through the narrowband color filter 3800 and the patterned polarizer 3801 and reaches the image sensor, and monochrome luminance is observed by the image sensor pixel 3802. In this way, both luminance image information and polarization image information can be acquired simultaneously.
- For the narrow band color filter 3800, it is desirable to use a filter having a transmission band of, for example, 500 to 550 nm so as to select a wavelength band in which the patterned polarizer operates.
- FIG. 45 is a diagram of part of the imaging surface of the polarization acquisition unit 3701 viewed from directly above along the optical axis direction. For simplicity, only four finely polarized pixels (2 × 2) in close contact with each other on the imaging surface are shown.
- the stripe written on each finely polarized pixel schematically shows the polarization main axis direction of the minute polarizing plate.
- It is desirable that the luminance dynamic range and the number of bits of the image sensor be as large as possible (for example, 16 bits) in order to reliably acquire the polarization components contained in specular reflection portions of the subject and in its shadow regions.
- the luminance information acquired for each polarization pixel by the configuration shown in FIG. 45 is processed by the polarization information processing unit 3703 in FIG. This process is the same as the process described with reference to FIG.
- Through this processing, the three parameters A, B, and C of the sinusoidal approximation are determined, and a polarization degree image showing the polarization degree ρ and a polarization phase image showing the polarization phase φ are obtained. The degree of polarization ρ represents how strongly the light at the corresponding pixel is polarized, and the polarization phase φ represents the principal axis angle of the partial polarization of that light; note that the principal axis of polarization is the same at 0 and 180° (π). The values ρ and φ (0 ≤ φ ≤ π) are calculated by (Expression 6) and (Expression 7), respectively, as in the first embodiment.
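- As an illustration, with the four polarizer orientations of 0°, 45°, 90°, and 135°, the sinusoid parameters and then ρ and φ can be recovered as in the following Python sketch; it assumes the common model I(θ) = A·sin 2θ + B·cos 2θ + C, which may differ in form from Expressions 6 and 7.

```python
import numpy as np

def fit_polarization(i0, i45, i90, i135):
    """Recover rho and phi (radians) from four sub-pixel intensities
    observed behind polarizers at 0, 45, 90, and 135 degrees.

    Model: I(theta) = A*sin(2*theta) + B*cos(2*theta) + C.
    """
    a = (i45 - i135) / 2.0               # sin(2*theta) amplitude
    b = (i0 - i90) / 2.0                 # cos(2*theta) amplitude
    c = (i0 + i45 + i90 + i135) / 4.0    # mean: also the luminance of the
                                         # light before the polarizer
    rho = np.sqrt(a * a + b * b) / c     # (Imax - Imin) / (Imax + Imin)
    phi = 0.5 * np.arctan2(a, b) % np.pi # principal axis, 0 <= phi < pi
    return rho, phi, c
```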
- the patterned polarizer of this embodiment may be a photonic crystal, a film-type polarizing element, a wire grid type, or a polarizing element based on other principles.
- The luminance of light transmitted through a polarizer differs from the original luminance of the light before it enters the polarizer. Theoretically, the value obtained by averaging the observed luminances over all polarization main axes corresponds to the original luminance before the polarizer. A normal luminance image can therefore be generated by computing this average luminance at each polarization pixel.
- The luminance and polarization information of each pixel of the luminance image C, the polarization degree image ρ, and the polarization phase image φ are obtained using the four polarization pixels shown in FIG. 45. Each value can therefore be regarded as a representative value at a virtual pixel point 3900 located at the center of those four polarization pixels, and the resolution of the luminance and polarization images is reduced to 1/2 × 1/2 of the resolution of the original image sensor. For this reason, it is desirable that the image sensor have as many pixels as possible.
- The blue sky polarization image processing unit 3600c receives the polarization degree image ρ, the polarization phase image φ, and the luminance image Y, and outputs a blue sky polarization phase image φsky, which is used for estimating the camera direction and the sun direction from the scene.
- The polarization degree binarization unit 1001 binarizes the polarization degree image ρ with a threshold Tρ, and the luminance binarization unit 1003 binarizes the luminance image Y with a threshold TC1. The image calculation unit 1005 performs an AND (logical product) operation on the binarized polarization degree image ρ′ and the binarized luminance image C1′, and outputs the mask image A′.
- The image calculation unit 1011 performs a logical AND operation on the adopted blue sky region mask Msky and the polarization phase image φ to generate the blue sky polarization phase image φsky.
- The binarization threshold Tρ is determined from a histogram created from the polarization degree of each pixel in the image; for example, the midpoint between the two peaks of the polarization degree histogram may be used as Tρ. Tρ is a threshold on the degree of polarization and satisfies 0 < Tρ < 1.
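- A sketch of this two-peak threshold selection in Python; the smoothing window and peak picking are assumed implementation details, and a bimodal histogram is presupposed.

```python
import numpy as np

def threshold_between_peaks(rho_img, bins=64):
    """Pick T_rho as the midpoint between the two main peaks of the
    polarization degree histogram (so that 0 < T_rho < 1)."""
    hist, edges = np.histogram(rho_img.ravel(), bins=bins, range=(0.0, 1.0))
    smooth = np.convolve(hist, np.ones(5) / 5.0, mode="same")
    # Local maxima of the smoothed histogram.
    peaks = [i for i in range(1, bins - 1)
             if smooth[i] >= smooth[i - 1] and smooth[i] >= smooth[i + 1]]
    # Keep the two tallest peaks (assumes a bimodal distribution).
    p1, p2 = sorted(sorted(peaks, key=lambda i: smooth[i])[-2:])
    centers = (edges[:-1] + edges[1:]) / 2.0
    return float((centers[p1] + centers[p2]) / 2.0)
```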
- The blue sky polarization phase image φsky is obtained by the above processing.
- the polarization phase may not be disturbed even in the cloud region, particularly when the cloud is thin.
- In that case, the blue sky region may include clouds. Whether or not the polarization phase is disturbed by a cloud depends on how much the degree of polarization decreases in the cloud region. The method of the present embodiment, which determines the blue sky region based on polarization, has the advantage that only cloud regions with a low degree of polarization are removed automatically.
- an output selection unit 4001 and a polarization degree determination unit 1010 shown in FIG. 46B may be used.
- the output selection unit 4001 decides, based on the output ρd of the polarization degree determination unit 1010, whether to adopt the first blue sky region mask A′ generated from the binarized luminance image C1′ and the binarized polarization degree image ρ′.
- the output selection unit 4001 and the polarization degree determination unit 1010 may also be used to switch according to the shooting date and time, rather than switching on whether the blue sky region can be extracted. For example, the period from 4 p.m. until sunset may be defined as evening: when it is not evening, the blue sky region is determined with the configuration of FIG. 46A, and in the evening the configuration of FIG. 46B is used to decide whether extraction is possible.
- in either case, the output selection unit 4001 makes the determination.
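A sketch of this date-and-time switching, assuming the sunset time is supplied externally (e.g., from an almanac). The 4 p.m. boundary follows the text; the function names are illustrative:

```python
from datetime import datetime, time

def is_evening(now: datetime, sunset: time) -> bool:
    """Evening = from 4 p.m. until sunset, as defined in the text.
    The sunset time must be supplied from an almanac or table (assumption)."""
    return time(16, 0) <= now.time() <= sunset

def choose_configuration(now: datetime, sunset: time) -> str:
    """Dispatch between the two processing paths described above."""
    # Evening: the FIG. 46B path, which first checks extractability.
    # Otherwise: the FIG. 46A path, which determines the region directly.
    return "fig_46b" if is_evening(now, sunset) else "fig_46a"
```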
- the selection unit 4101 determines, based on the output ρd of the polarization degree determination unit 1010, whether the mask should be created at all, before the blue sky region mask A′ is created. For example, when ρd output from the polarization degree determination unit 1010 does not exceed the threshold value, the selection unit 4101 in FIG. 46C chooses to stop processing without creating the blue sky region mask A′. As a result, the blue sky polarization image processing unit 3600c in FIG. 46C does not create the first blue sky region mask A′ and stops processing; an out-of-service indication or the like may be presented to the user. In this way a mask is created only when the degree of polarization is sufficiently high, and the work of creating a mask that would never be used is avoided.
- the selection unit 4201 likewise determines whether a mask should be created before the first blue sky region mask A′ is created. In the blue sky polarization image processing unit 3600c in FIG. 46D, however, this selection is based on the shooting date and time information output by the date and time information acquisition unit 1016, not on the output ρd of the polarization degree determination unit 1010. When the time indicates evening (for example, after 4 p.m. and until sunset), the selection unit 4201 in FIG. 46D determines that the first blue sky region mask A′ cannot be created. As a result, the mask A′ is not created and processing stops. As in FIG. 46C, no unused mask is created, which makes the processing more efficient.
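The two gating variants (FIG. 46C by degree of polarization, FIG. 46D by shooting time) can be sketched together as follows; the ρd threshold value is an assumption, and the evening window follows the text:

```python
from datetime import datetime, time

def should_create_mask(rho_d=None, rho_threshold=0.1,
                       now=None, sunset=None):
    """Gate mask creation before building the first blue sky region mask A'.

    FIG. 46C variant: skip when the polarization-degree output rho_d does
    not exceed a threshold (the threshold value here is an assumption).
    FIG. 46D variant: skip when the shooting time indicates evening
    (4 p.m. until sunset, per the text)."""
    if rho_d is not None and rho_d <= rho_threshold:
        return False  # FIG. 46C: degree of polarization too low
    if now is not None and sunset is not None:
        if time(16, 0) <= now.time() <= sunset:
            return False  # FIG. 46D: evening, mask would not be used
    return True
```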
- the orientation of the imaging device is estimated. Note that the operations of the camera direction estimation unit 101 and the output unit 102 illustrated in FIG. 42 are the same as those in the other embodiments, and thus detailed description thereof is omitted here.
- the camera orientation detection method includes step S4300 of acquiring a polarization image and a luminance image with the camera, step S4301 of generating, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of the blue sky region included in the luminance image, step S4302 of estimating the camera direction based on the blue sky polarization phase image, and step S4303 of outputting information indicating the camera direction.
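A structural sketch of steps S4300 through S4303; all callables are hypothetical stand-ins injected as parameters, not names from the patent:

```python
import numpy as np

def detect_camera_orientation(capture, make_sky_mask, estimate_direction):
    """Pipeline sketch of steps S4300-S4303."""
    # S4300: acquire the polarization images (rho, phi) and luminance image y
    rho, phi, y = capture()

    # S4301: generate the blue sky polarization phase image phi_sky
    mask = make_sky_mask(rho, y)
    if not np.any(mask):
        return None  # processing cannot continue (cf. FIGS. 46B-46D)
    phi_sky = np.where(mask, phi, np.nan)

    # S4302: estimate the camera direction from phi_sky
    direction = estimate_direction(phi_sky)

    # S4303: output (here: return) information indicating the camera direction
    return direction
```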
- the camera orientation detection method of the present invention that executes these steps is not limited to the apparatus configuration described above, and can be practiced with apparatuses of other configurations as well.
- in the camera orientation detection method of this embodiment, particularly in the cases shown in FIGS. 46B, 46C, and 46D, that is, when the output selection unit 4001 determines from the output of the polarization degree determination unit 1010 whether processing can continue, the camera direction estimation unit 101 need not operate when processing cannot continue.
- the apparatus may be provided with a path for sending, for example, a "processing cannot continue" notification from the blue sky polarization image processing unit 3600c to the output unit 102 so that it can be displayed to the user.
- in a method realizing this apparatus, the processing-continuation permission information may be output directly from step S4301, which generates the blue sky polarization phase image, to the output step.
- the image input device of the present invention can acquire, in a completely passive manner, light source information relating the camera and the sun in ordinary environmental scenes by exploiting the sky polarization phenomenon, and is therefore applicable to various digital still cameras, digital movie cameras, and surveillance cameras. Furthermore, the invention is expected to serve as a practical input device when computer graphics processing is used to augment image luminance information, which is likely to become insufficient as cameras are further miniaturized.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
Abstract
Description
FIG. 1F shows the configuration of the image input device according to this embodiment. This image input device includes a blue sky polarization phase image acquisition unit 100, a camera orientation estimation unit 101, and an output unit 102. The blue sky polarization phase image acquisition unit 100 has a scene image/scene polarization image acquisition unit 100a, a roll horizontality correction unit 100b, and a blue sky polarization image processing unit 100c, and outputs a blue sky polarization phase image φsky.
FIG. 21 is a block diagram showing the configuration of a camera orientation detection device according to the second embodiment of the present invention. In FIG. 21, components common to those shown in FIG. 1F are given the same reference numerals, and their detailed description is omitted here.
FIG. 28 is a diagram showing the configuration of a camera orientation detection device according to the third embodiment of the present invention. In FIG. 28, components common to those in FIG. 21 are given the same reference numerals, and their detailed description is omitted here.
- time and latitude/longitude data 2402
- camera orientation (azimuth and elevation) information on celestial coordinates 2403, 2404
- sun position information in camera coordinates 2405
- blue sky region segmentation result data 2406
- blue sky region polarization phase image 2407
- digital camera image data 2408
The image format is characterized by simultaneously holding data such as the above. Of course, it may also hold the kinds of information found in other common image formats.
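As a hedged sketch, the format just listed could be represented as a record like the following; the field names and types are assumptions, keyed to the reference numerals above:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Tuple

@dataclass
class PolarizationImageRecord:
    """Illustrative container for the image format 2401."""
    shot_datetime: datetime                    # time data (2402)
    latitude: float                            # latitude/longitude data (2402)
    longitude: float
    camera_azimuth_deg: float                  # camera orientation, celestial coords (2403)
    camera_elevation_deg: float                # (2404)
    sun_position_camera: Tuple[float, float]   # sun position in camera coords (2405)
    sky_region_mask: Any                       # blue sky region segmentation result (2406)
    sky_phase_image: Any                       # blue sky region polarization phase image (2407)
    image_data: bytes                          # digital camera image data (2408)
```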
i) recognition and labeling of the photographed subject or object from the shooting position and camera orientation
ii) image corrections such as backlight correction and color correction from the camera position and sun position
iii) color conversion of the blue sky region
iv) authenticity judgment of the image
Processing such as the above can be performed. These four application examples are described below.
The fourth embodiment of the present invention is described below.
The fifth embodiment of the present invention is described below.
The sixth embodiment of the present invention is described below.
100 Blue sky polarization phase image acquisition unit
100a Scene image/scene polarization image acquisition unit
100b Roll horizontality correction unit
100c Blue sky polarization image processing unit
101 Camera orientation estimation unit
102 Output unit
1301 Sun position acquisition unit
1302 Whole-sky polarization phase map acquisition unit
1303 Blue sky region direction estimation unit
1304 Angle-of-view acquisition unit
1305 Blue sky region direction calculation unit
1901 Pitch-direction horizontality acquisition unit
1902 Sun altitude determination unit
1904 Reliability determination unit
2301 Coordinate conversion unit
2401 Image format
Claims (26)
- An imaging device orientation detection device that detects the orientation of an imaging device comprising an imaging unit that acquires, by shooting, a polarization image including a polarization phase image, and a luminance image, the detection device comprising: an image processing unit that generates, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of a blue sky region included in the luminance image; an orientation estimation unit that estimates, based on the blue sky polarization phase image, the imaging device orientation determined by the orientation of the imaging unit; and an output unit that outputs information indicating the imaging device orientation estimated by the orientation estimation unit.
- The imaging device orientation detection device according to claim 1, comprising a sun position acquisition unit that acquires information on the position of the sun at the time of shooting, wherein the orientation estimation unit estimates the imaging device orientation using the information.
- The imaging device orientation detection device according to claim 2, comprising a whole-sky polarization map acquisition unit that acquires, based on the information on the position of the sun, a whole-sky polarization map indicating the polarization state of the sky at the time of shooting, wherein the orientation estimation unit estimates the imaging device orientation based on the blue sky polarization phase image and the whole-sky polarization map.
- The imaging device orientation detection device according to claim 3, wherein the whole-sky polarization map acquisition unit acquires the whole-sky polarization map indicating the polarization state of the sky at the time of shooting from a database containing whole-sky polarization maps.
- The imaging device orientation detection device according to claim 4, comprising a storage device that stores the database.
- The imaging device orientation detection device according to claim 4, comprising a communication device that accesses an external storage device storing the database.
- The imaging device orientation detection device according to claim 3, wherein the whole-sky polarization map acquisition unit generates, by calculation, a whole-sky polarization map indicating the polarization state of the sky at the time of shooting.
- The imaging device orientation detection device according to claim 1, wherein the orientation estimation unit calculates the direction of the blue sky region from the polarization phase of the blue sky region and estimates the orientation of the imaging device.
- The imaging device orientation detection device according to claim 1, comprising a whole-sky polarization map acquisition unit that acquires a whole-sky polarization map indicating the polarization state of the sky at the time of shooting, wherein the orientation estimation unit operates in at least one of a search mode and a calculation mode; in the search mode, the direction of the blue sky region is searched for based on the blue sky polarization phase image and the whole-sky polarization map, and in the calculation mode, the direction of the blue sky region is calculated from the polarization phase of the blue sky region.
- The imaging device orientation detection device according to claim 1, comprising a horizontality correction unit that corrects the tilt of the imaging device.
- The imaging device orientation detection device according to claim 10, wherein the tilt of the imaging device includes a tilt in the roll direction.
- The imaging device orientation detection device according to claim 11, wherein the imaging device comprises a level, acquires horizontality with the level, and corrects the tilt of the imaging device based on the acquired horizontality.
- The imaging device orientation detection device according to claim 1, comprising an angle-of-view acquisition unit that acquires the angle of view of the imaging range and determines the extent of the blue sky region based on the acquired angle of view.
- The imaging device orientation detection device according to claim 1, wherein the imaging unit comprises a plurality of polarizers having different polarization main-axis angles and acquires the polarization image from light transmitted through the plurality of polarizers.
- The imaging device orientation detection device according to claim 1, wherein the polarization image includes a polarization degree image in addition to the polarization phase image.
- The imaging device orientation detection device according to claim 1, wherein the image processing unit cuts out the blue sky region using the degree of polarization when the degree of polarization of the sky is equal to or higher than a reference value, cuts out the blue sky region using hue when the degree of polarization is lower than the reference value, and outputs the blue sky polarization phase image.
- The imaging device orientation detection device according to claim 2, comprising a reliability determination unit that determines the reliability of the estimation result and presents information to the user.
- The imaging device orientation detection device according to claim 17, comprising a sun altitude determination unit that determines whether estimation is possible according to the altitude of the sun obtained from the information on the position of the sun at the time of shooting.
- The imaging device orientation detection device according to claim 2, wherein a sun position in camera coordinates is acquired by performing coordinate conversion based on the altitude and azimuth of the sun and the imaging device orientation.
- An imaging device comprising: an imaging unit that acquires a polarization image including a polarization phase image, and a luminance image; and the imaging device orientation detection device according to claim 1.
- A moving body comprising the imaging device orientation detection device according to claim 1, the moving body comprising: an imaging device having an imaging unit that acquires a polarization image including a polarization phase image, and a luminance image; and a moving body orientation estimation unit that determines the orientation of the moving body from the detected imaging device orientation according to the relationship between the orientation of the moving body and the imaging device orientation.
- A portable device comprising the imaging device orientation detection device according to claim 1, the portable device comprising: an imaging device having an imaging unit that acquires a polarization image including a polarization phase image, and a luminance image; and a portable device orientation estimation unit that determines the orientation of the portable device from the detected imaging device orientation according to the relationship between the orientation of the portable device and the imaging device orientation.
- An image input device comprising: an imaging unit that acquires, by shooting, a polarization image including a polarization phase image, and a luminance image; an image processing unit that generates, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of a blue sky region included in the luminance image; an orientation estimation unit that estimates, based on the blue sky polarization phase image, the imaging device orientation determined by the orientation of the imaging unit; and an output unit that outputs data of the image captured by the imaging unit and information indicating the imaging device orientation estimated by the orientation estimation unit.
- An image format that holds image data, data indicating the date and time of shooting, data indicating the longitude and latitude of the shooting location, and data indicating the imaging device orientation.
- An imaging device orientation detection method comprising: a step of acquiring a polarization image and a luminance image with an imaging device; a step of generating, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of a blue sky region included in the luminance image; a step of estimating the imaging device orientation based on the blue sky polarization phase image; and a step of outputting information indicating the imaging device orientation.
- A program for an imaging device orientation detection device that detects the imaging device orientation at the time of shooting using the polarization pattern of the sky, the program causing a computer to execute: a step of acquiring a polarization image and a luminance image with an imaging device; a step of generating, based on the polarization image and the luminance image, a blue sky polarization phase image indicating the polarization phase of a blue sky region included in the luminance image; a step of estimating the imaging device orientation based on the blue sky polarization phase image; and a step of outputting information indicating the imaging device orientation.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/935,321 US8390696B2 (en) | 2009-01-06 | 2009-12-18 | Apparatus for detecting direction of image pickup device and moving body comprising same |
CN2009801306543A CN102177719B (zh) | 2009-01-06 | 2009-12-18 | 摄像装置朝向检测装置和具备该装置的移动体 |
EP09837447.3A EP2375755B8 (en) | 2009-01-06 | 2009-12-18 | Apparatus for detecting direction of image pickup device and moving body comprising same |
JP2010545634A JP5357902B2 (ja) | 2009-01-06 | 2009-12-18 | 撮像装置向き検出装置および当該装置を備える移動体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-001074 | 2009-01-06 | ||
JP2009001074 | 2009-01-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010079557A1 true WO2010079557A1 (ja) | 2010-07-15 |
Family
ID=42316336
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/007034 WO2010079557A1 (ja) | 2009-01-06 | 2009-12-18 | 撮像装置向き検出装置および当該装置を備える移動体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8390696B2 (ja) |
EP (1) | EP2375755B8 (ja) |
JP (1) | JP5357902B2 (ja) |
CN (1) | CN102177719B (ja) |
WO (1) | WO2010079557A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012191267A (ja) * | 2011-03-08 | 2012-10-04 | Ricoh Co Ltd | 物体の像を取得する装置、物体の像を取得する方法、プログラム、及び記録媒体 |
JP2012189826A (ja) * | 2011-03-10 | 2012-10-04 | Ricoh Co Ltd | 物体の像を取得する装置、物体の像を取得する方法、プログラム、及び記録媒体 |
WO2016008203A1 (zh) * | 2014-07-15 | 2016-01-21 | 中兴通讯股份有限公司 | 自动获取拍摄参数的方法及装置 |
JP2018061124A (ja) * | 2016-10-04 | 2018-04-12 | 株式会社ソニー・インタラクティブエンタテインメント | 撮影装置、情報処理システム、情報処理装置、および偏光画像処理方法 |
WO2020241130A1 (ja) | 2019-05-29 | 2020-12-03 | 古野電気株式会社 | 情報処理システム、方法、及びプログラム |
CN115014311A (zh) * | 2022-08-08 | 2022-09-06 | 中国人民解放军国防科技大学 | 一种基于大气偏振信息剔除天空遮挡的光罗盘定向方法 |
WO2023171470A1 (ja) | 2022-03-11 | 2023-09-14 | パナソニックIpマネジメント株式会社 | 光検出装置、光検出システム、およびフィルタアレイ |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2447672B (en) | 2007-03-21 | 2011-12-14 | Ford Global Tech Llc | Vehicle manoeuvring aids |
US9124804B2 (en) * | 2010-03-22 | 2015-09-01 | Microsoft Technology Licensing, Llc | Using accelerometer information for determining orientation of pictures and video images |
JP5761601B2 (ja) * | 2010-07-01 | 2015-08-12 | 株式会社リコー | 物体識別装置 |
JP5063749B2 (ja) * | 2010-07-12 | 2012-10-31 | キヤノン株式会社 | 撮影制御システム、撮像装置の制御装置及びその制御方法、並びにプログラム |
EP2498583B1 (fr) * | 2011-03-07 | 2017-05-03 | Zedel | Lampe LED dotée d' un dispositif de sécurité |
US9854209B2 (en) | 2011-04-19 | 2017-12-26 | Ford Global Technologies, Llc | Display system utilizing vehicle and trailer dynamics |
US9683848B2 (en) | 2011-04-19 | 2017-06-20 | Ford Global Technologies, Llc | System for determining hitch angle |
US9926008B2 (en) | 2011-04-19 | 2018-03-27 | Ford Global Technologies, Llc | Trailer backup assist system with waypoint selection |
US9723274B2 (en) | 2011-04-19 | 2017-08-01 | Ford Global Technologies, Llc | System and method for adjusting an image capture setting |
US9555832B2 (en) | 2011-04-19 | 2017-01-31 | Ford Global Technologies, Llc | Display system utilizing vehicle and trailer dynamics |
US9374562B2 (en) | 2011-04-19 | 2016-06-21 | Ford Global Technologies, Llc | System and method for calculating a horizontal camera to target distance |
US10196088B2 (en) | 2011-04-19 | 2019-02-05 | Ford Global Technologies, Llc | Target monitoring system and method |
US8970691B2 (en) * | 2011-08-26 | 2015-03-03 | Microsoft Technology Licensing, Llc | Removal of rayleigh scattering from images |
HUP1100482A2 (en) * | 2011-09-05 | 2013-04-29 | Eotvos Lorand Tudomanyegyetem | Method for cloud base height measuring and device for polarization measuring |
US8923567B2 (en) | 2011-12-19 | 2014-12-30 | General Electric Company | Apparatus and method for predicting solar irradiance variation |
US8750566B2 (en) | 2012-02-23 | 2014-06-10 | General Electric Company | Apparatus and method for spatially relating views of sky images acquired at spaced apart locations |
US9562764B2 (en) | 2012-07-23 | 2017-02-07 | Trimble Inc. | Use of a sky polarization sensor for absolute orientation determination in position determining systems |
US9445011B2 (en) * | 2012-10-22 | 2016-09-13 | GM Global Technology Operations LLC | Dynamic rearview mirror adaptive dimming overlay through scene brightness estimation |
CN103942523B (zh) * | 2013-01-18 | 2017-11-03 | 华为终端有限公司 | 一种日照场景识别方法及装置 |
CN103197685A (zh) * | 2013-04-11 | 2013-07-10 | 南京信息工程大学 | 一种太阳能自动跟踪系统 |
US20150042793A1 (en) * | 2013-08-10 | 2015-02-12 | Trex Enterprises Corporation | Celestial Compass with sky polarization |
EP3060880A4 (en) * | 2013-10-22 | 2017-07-05 | Polaris Sensor Technologies, Inc. | Sky polarization and sun sensor system and method |
US9464887B2 (en) | 2013-11-21 | 2016-10-11 | Ford Global Technologies, Llc | Illuminated hitch angle detection component |
US9464886B2 (en) | 2013-11-21 | 2016-10-11 | Ford Global Technologies, Llc | Luminescent hitch angle detection component |
JP6332281B2 (ja) * | 2013-12-17 | 2018-05-30 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
US9296421B2 (en) | 2014-03-06 | 2016-03-29 | Ford Global Technologies, Llc | Vehicle target identification using human gesture recognition |
KR20150106719A (ko) * | 2014-03-12 | 2015-09-22 | 삼성전자주식회사 | 전자 장치의 촬영 위치 안내 방법 및 이를 이용한 전자 장치 |
CN104125400B (zh) * | 2014-07-15 | 2018-03-30 | 中兴通讯股份有限公司 | 一种提示用户的方法及电子设备 |
US10175358B2 (en) | 2014-08-04 | 2019-01-08 | Elbit Systems Of America, Llc | Systems and methods for northfinding |
CN106537409B (zh) * | 2014-08-18 | 2020-02-14 | 谷歌有限责任公司 | 确定影像的罗盘定位 |
US10112537B2 (en) | 2014-09-03 | 2018-10-30 | Ford Global Technologies, Llc | Trailer angle detection target fade warning |
US9607242B2 (en) | 2015-01-16 | 2017-03-28 | Ford Global Technologies, Llc | Target monitoring system with lens cleaning device |
CN104994271B (zh) * | 2015-05-28 | 2016-06-29 | 北京航天控制仪器研究所 | 一种索道摄像机系统及其控制及视频信号传输方法 |
US9836060B2 (en) | 2015-10-28 | 2017-12-05 | Ford Global Technologies, Llc | Trailer backup assist system with target management |
US9792522B2 (en) * | 2015-12-01 | 2017-10-17 | Bloomsky, Inc. | Weather information extraction using sequential images |
CN106441310B (zh) * | 2016-11-30 | 2019-06-04 | 北京航空航天大学 | 一种基于cmos的太阳方位角计算方法 |
US10976239B1 (en) * | 2017-03-14 | 2021-04-13 | Hart Scientific Consulting International Llc | Systems and methods for determining polarization properties with high temporal bandwidth |
CN106973236B (zh) * | 2017-05-24 | 2020-09-15 | 湖南盘子女人坊文化科技股份有限公司 | 一种拍摄控制方法及装置 |
US10710585B2 (en) | 2017-09-01 | 2020-07-14 | Ford Global Technologies, Llc | Trailer backup assist system with predictive hitch angle functionality |
CN108387206B (zh) * | 2018-01-23 | 2020-03-17 | 北京航空航天大学 | 一种基于地平线与偏振光的载体三维姿态获取方法 |
CN108225335B (zh) * | 2018-01-23 | 2020-06-19 | 中国人民解放军国防科技大学 | 一种用于多目偏振视觉的航向角求解方法 |
US11543485B2 (en) | 2018-05-22 | 2023-01-03 | Samsung Electronics Co., Ltd. | Determining location or orientation based on environment information |
DE102018132590A1 (de) | 2018-12-18 | 2020-06-18 | Valeo Schalter Und Sensoren Gmbh | Determination of an attitude of a vehicle based on sky polarization by a celestial light source |
CN110887475B (zh) * | 2019-12-09 | 2021-12-10 | 北京航空航天大学 | 一种基于偏振北极点及偏振太阳矢量的静基座粗对准方法 |
CN113362392B (zh) * | 2020-03-05 | 2024-04-23 | 杭州海康威视数字技术股份有限公司 | 可视域生成方法、装置、计算设备及存储介质 |
CN111724440B (zh) * | 2020-05-27 | 2024-02-02 | 杭州数梦工场科技有限公司 | 监控设备的方位信息确定方法、装置及电子设备 |
US11995836B2 (en) | 2020-07-07 | 2024-05-28 | Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company | System and method for performing sky-segmentation |
DE102021201987A1 (de) * | 2021-03-02 | 2022-09-08 | Carl Zeiss Microscopy Gmbh | Verfahren, Computerprogrammprodukt und Mikroskopiesystem zur Darstellung von polarisierenden Proben |
US11935263B1 (en) * | 2021-04-20 | 2024-03-19 | Apple Inc. | Global reference direction detection for orienting three-dimensional models |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08160507A (ja) | 1994-12-07 | 1996-06-21 | Canon Inc | カメラ |
JPH1188820A (ja) * | 1998-04-16 | 1999-03-30 | Toshiba Corp | 方位情報を記録可能な画像記録装置 |
JP2004048427A (ja) * | 2002-07-12 | 2004-02-12 | Koncheruto:Kk | 撮影位置および方位付のデジタルカメラ又は画像付携帯電話システム |
JP2004117478A (ja) | 2002-09-24 | 2004-04-15 | Fuji Photo Film Co Ltd | カメラ |
JP2007086720A (ja) | 2005-08-23 | 2007-04-05 | Photonic Lattice Inc | 偏光イメージング装置 |
JP2007240832A (ja) | 2006-03-08 | 2007-09-20 | Citizen Holdings Co Ltd | 自動合焦点装置 |
JP2008016918A (ja) * | 2006-07-03 | 2008-01-24 | Matsushita Electric Ind Co Ltd | 画像処理装置、画像処理システムおよび画像処理方法 |
WO2008149489A1 (ja) * | 2007-05-31 | 2008-12-11 | Panasonic Corporation | 画像処理装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5028138A (en) * | 1989-05-23 | 1991-07-02 | Wolff Lawrence B | Method of and apparatus for obtaining object data by machine vision form polarization information |
US5052799A (en) * | 1989-07-17 | 1991-10-01 | Thurman Sasser | Object orienting systems and systems and processes relating thereto |
US5424535A (en) * | 1993-04-29 | 1995-06-13 | The Boeing Company | Optical angle sensor using polarization techniques |
JP3935499B2 (ja) * | 2004-07-26 | 2007-06-20 | 松下電器産業株式会社 | 画像処理方法、画像処理装置および画像処理プログラム |
CN1910623B (zh) * | 2005-01-19 | 2011-04-20 | 松下电器产业株式会社 | 图像变换方法、纹理映射方法、图像变换装置和服务器客户机系统 |
CN101044507B (zh) * | 2005-09-01 | 2010-08-18 | 松下电器产业株式会社 | 图像处理方法以及图像处理装置 |
JP2009544228A (ja) * | 2006-07-18 | 2009-12-10 | ザ・トラスティーズ・オブ・ザ・ユニバーシティ・オブ・ペンシルバニア | 偏光を使用した、重複キャストシャドウ成分の分離およびコントラスト強調、ならびに陰影内の標的検出 |
JP2008026353A (ja) | 2006-07-18 | 2008-02-07 | Nikon Corp | 偏光方向検出装置とこれを有する撮像装置 |
WO2008026518A1 (fr) * | 2006-08-31 | 2008-03-06 | Panasonic Corporation | Dispositif, procédé et programme de traitement d'image |
CN101542232B (zh) * | 2007-08-07 | 2011-10-19 | 松下电器产业株式会社 | 法线信息生成装置以及法线信息生成方法 |
CN101542233B (zh) * | 2007-08-07 | 2011-12-28 | 松下电器产业株式会社 | 图像处理装置以及图像处理方法 |
CN100523820C (zh) * | 2007-11-01 | 2009-08-05 | 大连理工大学 | 一种运动方向角度偏振敏感检测方法和传感器装置 |
-
2009
- 2009-12-18 US US12/935,321 patent/US8390696B2/en active Active
- 2009-12-18 CN CN2009801306543A patent/CN102177719B/zh active Active
- 2009-12-18 JP JP2010545634A patent/JP5357902B2/ja active Active
- 2009-12-18 WO PCT/JP2009/007034 patent/WO2010079557A1/ja active Application Filing
- 2009-12-18 EP EP09837447.3A patent/EP2375755B8/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08160507A (ja) | 1994-12-07 | 1996-06-21 | Canon Inc | カメラ |
JPH1188820A (ja) * | 1998-04-16 | 1999-03-30 | Toshiba Corp | 方位情報を記録可能な画像記録装置 |
JP2004048427A (ja) * | 2002-07-12 | 2004-02-12 | Koncheruto:Kk | 撮影位置および方位付のデジタルカメラ又は画像付携帯電話システム |
JP2004117478A (ja) | 2002-09-24 | 2004-04-15 | Fuji Photo Film Co Ltd | カメラ |
JP2007086720A (ja) | 2005-08-23 | 2007-04-05 | Photonic Lattice Inc | 偏光イメージング装置 |
JP2007240832A (ja) | 2006-03-08 | 2007-09-20 | Citizen Holdings Co Ltd | 自動合焦点装置 |
JP2008016918A (ja) * | 2006-07-03 | 2008-01-24 | Matsushita Electric Ind Co Ltd | 画像処理装置、画像処理システムおよび画像処理方法 |
WO2008149489A1 (ja) * | 2007-05-31 | 2008-12-11 | Panasonic Corporation | 画像処理装置 |
Non-Patent Citations (6)
Title |
---|
DAISUKE MIYAZAKI ET AL.: "Polarization Analysis of the Skylight Caused by Rayleigh Scattering and Sun Orientation Estimation Using Fisheye-Lens Camera", PATTERN RECOGNITION AND MEDIA UNDERSTANDING SOCIETY OF THE INSTITUTE OF ELECTRONICS, INFORMATION, AND COMMUNICATION ENGINEERS IN JAPAN, vol. 108, no. 198, 2008, pages 25 - 32 |
HITOSHI TOKUMARU: "Light and Radio Waves", MORIKITA PUBLISHING CO., LTD., 21 March 2000 |
ISTVAN POMOZI ET AL.: "How the Clear-Sky Angle of Polarization Pattern Continues Underneath Clouds: Full-Sky Measurements and Implications for Animal Orientation", THE JOURNAL OF EXPERIMENTAL BIOLOGY, vol. 204, 2001, pages 2933 - 2942 |
M. V. BERRY ET AL.: "Polarization Singularities in the Clear Sky", NEW JOURNAL OF PHYSICS, vol. 6, 2004, pages 162 |
MASAO SHIMIZU; MASATOSHI OKUTOMI: "Two-Dimensional Simultaneous Sub-pixel Estimation for Area-Based Matching", TRANSACTIONS OF THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS (OF JAPAN) D-II, vol. 2, February 2004 (2004-02-01), pages 554 - 564 |
See also references of EP2375755A4 |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012191267A (ja) * | 2011-03-08 | 2012-10-04 | Ricoh Co Ltd | 物体の像を取得する装置、物体の像を取得する方法、プログラム、及び記録媒体 |
JP2012189826A (ja) * | 2011-03-10 | 2012-10-04 | Ricoh Co Ltd | 物体の像を取得する装置、物体の像を取得する方法、プログラム、及び記録媒体 |
WO2016008203A1 (zh) * | 2014-07-15 | 2016-01-21 | 中兴通讯股份有限公司 | 自动获取拍摄参数的方法及装置 |
JP2018061124A (ja) * | 2016-10-04 | 2018-04-12 | 株式会社ソニー・インタラクティブエンタテインメント | 撮影装置、情報処理システム、情報処理装置、および偏光画像処理方法 |
US10805528B2 (en) | 2016-10-04 | 2020-10-13 | Sony Interactive Entertainment Inc. | Image capturing apparatus, information processing system, information processing apparatus, and polarized-image processing method |
WO2020241130A1 (ja) | 2019-05-29 | 2020-12-03 | 古野電気株式会社 | 情報処理システム、方法、及びプログラム |
JP7419364B2 (ja) | 2019-05-29 | 2024-01-22 | 古野電気株式会社 | 情報処理システム、方法、及びプログラム |
US12100179B2 (en) | 2019-05-29 | 2024-09-24 | Furuno Electric Co., Ltd. | Information processing system, method, and program |
WO2023171470A1 (ja) | 2022-03-11 | 2023-09-14 | パナソニックIpマネジメント株式会社 | 光検出装置、光検出システム、およびフィルタアレイ |
CN115014311A (zh) * | 2022-08-08 | 2022-09-06 | 中国人民解放军国防科技大学 | 一种基于大气偏振信息剔除天空遮挡的光罗盘定向方法 |
CN115014311B (zh) * | 2022-08-08 | 2022-11-01 | 中国人民解放军国防科技大学 | 一种基于大气偏振信息剔除天空遮挡的光罗盘定向方法 |
Also Published As
Publication number | Publication date |
---|---|
CN102177719B (zh) | 2013-08-28 |
EP2375755B1 (en) | 2013-08-07 |
US20110018990A1 (en) | 2011-01-27 |
US8390696B2 (en) | 2013-03-05 |
JPWO2010079557A1 (ja) | 2012-06-21 |
CN102177719A (zh) | 2011-09-07 |
JP5357902B2 (ja) | 2013-12-04 |
EP2375755A1 (en) | 2011-10-12 |
EP2375755A4 (en) | 2012-10-24 |
EP2375755B8 (en) | 2013-10-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5357902B2 (ja) | 撮像装置向き検出装置および当該装置を備える移動体 | |
CA3157194C (en) | Systems and methods for augmentation of sensor systems and imaging systems with polarization | |
US8654179B2 (en) | Image processing device and pseudo-3D image creation device | |
TWI287402B (en) | Panoramic vision system and method | |
US9858639B2 (en) | Imaging surface modeling for camera modeling and virtual view synthesis | |
US9019341B2 (en) | Method for obtaining a composite image using rotationally symmetrical wide-angle lenses, imaging system for same, and CMOS image sensor for image-processing using hardware | |
CN101809993B (zh) | 利用旋转对称广角透镜获得全景图像的方法及其设备 | |
CN104243959B (zh) | 基于偏振定向和组合定位的智能复合眼镜 | |
US9488471B2 (en) | Methods and systems for navigation and terrain change detection | |
EA008402B1 (ru) | Размещаемая на транспортном средстве система сбора и обработки данных | |
JP2008016918A (ja) | 画像処理装置、画像処理システムおよび画像処理方法 | |
CN112857356A (zh) | 无人机水体环境调查和航线生成方法 | |
CN113805829B (zh) | 导航界面的显示方法、装置、终端、存储介质及程序产品 | |
JP2008230561A (ja) | 撮像制御装置および測光領域調整方法 | |
JP2005331320A (ja) | 天空率および日照時間算出システム、および、算出プログラム | |
CN117146780B (zh) | 成像方法、终端设备及介质 | |
Serres et al. | Passive Polarized Vision for Autonomous Vehicles: A Review | |
JP2011160345A (ja) | 日照確認装置 | |
CN115222807A (zh) | 数据处理方法、设备及存储介质 | |
KR20090062015A (ko) | 실시간 멀티밴드 카메라 시스템 | |
Mazurek et al. | Utilisation of the light polarization to increase the working range of the video vehicle tracking systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980130654.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09837447 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12935321 Country of ref document: US |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2010545634 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009837447 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |