WO2002065786A1 - Method and apparatus for acquisition of omni-directional images and three-dimensional data with data annotation, and dynamic range extension method - Google Patents


Info

Publication number
WO2002065786A1
WO2002065786A1 PCT/KR2002/000223
Authority
WO
WIPO (PCT)
Prior art keywords
cameras
camera
exposure
images
camera module
Prior art date
Application number
PCT/KR2002/000223
Other languages
English (en)
Inventor
Kujin Lee
Inso Kweon
Howon Kim
Junsik Kim
Original Assignee
Kujin Lee
Inso Kweon
Howon Kim
Junsik Kim
Priority date
Filing date
Publication date
Application filed by Kujin Lee, Inso Kweon, Howon Kim, Junsik Kim filed Critical Kujin Lee
Priority to JP2002565367A priority Critical patent/JP2004531113A/ja
Publication of WO2002065786A1 publication Critical patent/WO2002065786A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation

Definitions

  • the present invention relates to photography, and more particularly to a method and apparatus to acquire omni-directional images and 3-dimensional data with data annotation, and to a dynamic range extension method.
  • Omni-directional photography is one way of effectively re-creating the visual experience of a location at a specific moment.
  • An omni-directional photograph may cover up to 360 degrees horizontally and 180 degrees vertically, encompassing the whole spherical view from the photographing viewpoint.
  • A panoramic image, which omits the upper and lower parts of the spherical view, is also widely used.
  • Methods (c) and (d) capture omni-directional images over the camera rotating time, so that a moving object can be captured at different locations in the multiple images of method (c), or distorted in method (d).
  • Method (e) utilizes a plurality of cameras to cover a wider viewing angle. In method (e), as the number of cameras increases, each camera covers a smaller viewing angle, and the overall image resolution can be desirably enhanced.
  • each camera of the image-capturing device cannot be precisely controlled.
  • Among the precise controls desired for image capturing while in fast motion are (1) precise synchronization of the triggering of all cameras at the desired photographing location and (2) adjustment of the optimal exposure amount of each camera facing a different direction without creating motion blur.
  • the present invention is directed to eliminating the problems that appear in such applications, especially in method (e), by precisely sending a triggering signal to each camera at a desired photographing location, and by calculating the optimal exposure of each camera and sending an exposure signal to each camera in terms of gain, offset and exposure time.
  • the apparatus according to the present invention measures current location with GPS(Global Positioning System) sensor, distance sensor and direction sensor while in motion.
  • the present invention is also capable of acquiring omni-directional 3- dimensional data and dynamic range extension of the image acquisition apparatus.
  • the first object of the present invention is to provide a method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation and a dynamic range extension method, with multiple layers of camera set modules that omni-directionally photograph surrounding objects of the apparatus with different photographing angles of the same object to acquire 3-dimensional data.
  • the second object of the present invention is to provide a method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation and a dynamic range extension method, with multiple layers of camera set modules that are also capable of acquiring images of the same object with different exposure amounts to extend the dynamic range of the image data acquisition apparatus.
  • the third object of the present invention is to provide a method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation and a dynamic range extension method that annotates the photographing location and photographing time of each photographed image, so that the image data can be used as geographical information and can be used in connection with other geographical information in other GIS (Geographical Information System) databases.
  • the fourth object of the present invention is to provide a method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation and a dynamic range extension method, with multiple layers of camera set modules that enable related image data processing algorithms to enhance the calculation speed of 3-dimensional data acquisition.
  • A 3-dimensional data acquisition apparatus comprising: a multi-camera module constructed in a manner that a plurality of cameras are symmetrically arranged about a specific point in a plane, each of the cameras taking charge of one of the divided angles such that the camera module can take an omni-directional continuous panoramic photograph of surrounding objects about the specific point; first frame grabbers, each of which is electrically connected to one of the cameras of the multi-camera module, to grab photographed images by frames; an exposure calculator electrically connected to the frame grabbers, to calculate the exposure of each camera based on the grabbed images by frames; an exposure signal generator electrically connected to each camera, to transmit information about the exposure as a signal on the basis of the exposure calculated by the exposure calculator; storage means electrically connected to each first frame grabber, to store images photographed by the cameras according to photographing location and photographing time; a GPS sensor to sense the photographing location and photographing time; an annotation entering unit electrically connected to the GPS sensor to calculate the location and time corresponding to each frame based on sensed data of the GPS sensor, the annotation entering unit being
  • the apparatus further comprising a plurality of light intensity sensors electrically connected to the exposure calculator to allow the exposure calculator to be able to calculate the exposure amount of each camera based on external light intensity.
  • the storage means is one of digital storage devices including a hard disk, compact disk, magnetic tape and memory.
  • the apparatus further comprising an audio digital converter electrically connected to the storage means, the audio digital converter converting an audio signal sensed by an audio sensor into a digital signal as an audio clip to correspondingly attach to each image or image group to be stored in the storage means.
  • the apparatus further comprising a video camera electrically connected to the storage means via a second frame grabber that grabs photographed moving pictures by frames, to provide to the storage means a unique video clip corresponding to each image or image group stored in the storage means.
  • the multi-camera module further has at least one camera placed at the top thereof so that the camera can photograph an object upward.
  • the apparatus further comprising mobile means, on which the multi-camera module is mounted, to enable continuous panoramic photographing of the camera module while moving.
  • the apparatus further comprising a distance sensor and a direction sensor for respectively sensing the distance and direction of the image photography by each camera;
  • a method for acquiring 3-dimensional data comprising the steps of: acquiring multiple images of an object photographed, in the direction of height, by the cameras of multi-camera modules stacked in the direction of height, each multi-camera module including a plurality of cameras which are symmetrically arranged about a specific point in a plane, and each of which takes charge of an allocated viewing angle calculated as 360° divided by the number of the cameras; searching for corresponding points in each image; extracting distance information using trigonometry; and acquiring 3-dimensional data based on the distance information.
  • a method for extending the dynamic range of images comprising the steps of: acquiring multiple images of an object, the multiple images being photographed by cameras having different exposure amounts from one another, the cameras belonging to multi-camera modules stacked in the direction of height, each multi-camera module including a plurality of cameras which are symmetrically arranged about a specific point in a plane, and each of which takes charge of an allocated viewing angle calculated as 360° divided by the number of the cameras; selectively extracting regions in the multiple images, wherein the regions fall within the dynamic range of the camera; and acquiring dynamic-range-extended images generated by composing the extracted regions.
  • FIG. 1 illustrates the system configuration of an omni-directional 3-dimension image data acquisition apparatus according to the first embodiment of the present invention
  • FIG.2A illustrates the first exemplary embodiment of a multi-camera module according to the present invention
  • FIG.2B illustrates the second exemplary embodiment of the multi-camera module according to the present invention
  • FIG.2C illustrates the third exemplary embodiment of the multi-camera module according to the present invention
  • FIG.2D illustrates the fourth exemplary embodiment of the multi-camera module according to the present invention
  • FIG.2E illustrates the fifth exemplary embodiment of the multi-camera module according to the present invention
  • FIG.2F illustrates the sixth exemplary embodiment of the multi-camera module according to the present invention
  • FIG.3 illustrates the system configuration of an omni-directional 3-dimension image data acquisition apparatus according to the second embodiment of the present invention
  • FIG.4 is a perspective view illustrating that the multi-camera module of the invention is set in a housing
  • FIG. 5 illustrates the first exemplary embodiment in which the omni-directional 3-dimension image data acquisition apparatus of the invention is mounted on a mobile means
  • FIG. 6 illustrates the second exemplary embodiment in which the omni-directional 3-dimension image data acquisition apparatus of the invention is mounted on a mobile means;
  • FIG. 7 is a flow diagram illustrating a method of acquiring 3-dimensional data according to the present invention.
  • FIG. 8 illustrates a method of extending dynamic range of camera system with images photographed according to the present invention.
  • FIG. 9 is a flow diagram illustrating a method of dynamic range extension according to the present invention.
  • FIG. 10 is a perspective view illustrating spherical coordinates set to the multi-camera module to apply image processing algorithm according to the present invention
  • FIG. 11 is a plan view illustrating the upper part of the coordinates of FIG. 10;
  • FIG. 12 is a perspective view illustrating two layers of spherical coordinates to apply image processing algorithm according to the present invention;
  • FIG. 13 illustrates a panorama stitching principle by cylindrical projection according to the invention
  • FIG. 14 illustrates a panorama stitching principle by spherical projection according to the invention
  • FIG. 15 is a flow diagram illustrating a process of acquiring 3-dimension depth data using epipolar geometry by the omni-directional 3-dimension image data acquisition apparatus according to the invention
  • FIG. 16 illustrates the first configuration for showing the principle of obtaining 3-dimensional depth information using spherical coordinate according to the present invention
  • FIG. 17 illustrates the second configuration for showing the principle of obtaining 3-dimensional depth information using rotated and aligned spherical coordinates according to the present invention
  • FIG. 18 illustrates a spherical image representation in the form of a 2-dimensional data structure in latitude and longitude according to the present invention;
  • FIG. 19 illustrates the first configuration for showing the principle of obtaining 3-dimensional data acquisition with two sets of a multiple camera module of four cameras in horizontal arrangement or horizontal displacement according to the present invention
  • FIG. 20 illustrates the second configuration for showing the principle of obtaining 3-dimensional data acquisition with 2 layers of multiple camera module set in vertical arrangement or vertical displacement according to the present invention
  • FIG. 21 illustrates the third configuration for showing the principle of obtaining 3-dimensional data acquisition with 3 layers of multiple camera module set in vertical arrangement or vertical displacement according to the present invention
  • FIG. 22 illustrates the fourth configuration for showing the principle of obtaining 3-dimensional data acquisition with 6 layers of multiple cameras set in vertical arrangement or vertical displacement according to the present invention
  • FIG. 23 illustrates exemplary setting of the multi-camera module to explain 3- dimensional data acquisition using optical flow according to the present invention
  • FIG. 24 illustrates specific points to which an object corresponds when the multiple camera module photograph the object according to the present invention
  • FIG. 25 illustrates optical flow by images acquired by the multiple camera module according to the present invention
  • FIG. 26 illustrates epipolar planes between photographed objects and set of the multiple camera module according to the present invention
  • FIG. 27 illustrates the depth effect on the slope of optical flow by the multiple camera module with respect to an epipolar plane according to the present invention
  • FIG. 28 is a perspective view illustrating a principle of acquiring 3-dimensional data for a specific feature point by displacement of the omni-directional 3-dimension image data acquisition apparatus of the present invention.
  • FIG. 29 illustrates a principle of feature tracking of an object when the omni-directional 3-dimension image data acquisition apparatus of the invention photographs the object while moving.
  • FIG. 1 illustrates the system configuration of an omni-directional 3-dimension image data acquisition apparatus according to the first embodiment of the present invention
  • FIGS. 2A to 2F illustrate various exemplary embodiments of a multi-camera module according to the present invention.
  • the omni-directional 3-dimension image data acquisition apparatus 100 comprises a multi-camera module 10 and a computer vision system 30 electrically connected to the multi-camera module 10.
  • the multi-camera module 10 is constructed in such a manner that a plurality of cameras 11 are arranged symmetrically with a specific point in the center such that the camera module 10 can take a photograph of an object 200 360° omni-directionally.
  • each multi-camera module 10 with a plurality of cameras 11 is vertically stacked or displaced to form multiple layers.
  • Each multi-camera module 10 forming the multiple layers can photograph an object 200 from its corresponding height, changing the viewing angle of the object 200 and generating images of the object 200.
  • the image acquisition apparatus can acquire 3-dimensional data by combining the images from a plurality of cameras 11 of each multi-camera module 10. Also, if each camera 11 of the stacked multi-camera module 10 photographs an object 200 with different exposure amount, then the apparatus can extend dynamic range.
  • the computer vision system 30 stores images photographed by each camera 11 as digital data. Images photographed by the multi-camera module 10 are grabbed by frames by first frame grabbers 31 of the computer vision system 30.
  • An annotation entering unit 35 enters annotations corresponding to the images by corresponding photographing locations and time, and a storage means 32 stores the images with the annotations.
  • the multi-camera module 10 is constructed in a manner that the optical axis of the cameras 11 are placed on the same plane and the cameras 11 are arranged symmetrically with a specific point in the center.
  • Each camera 11 takes charge of each of divided angles of omni-direction so as to be able to carry out 360° photographing with a panoramic photographing technique.
  • each camera 11 is in charge of an angle of approximately 90° or 72°.
  • a plurality of multi-camera modules 10 are stacked or displaced to form multi layers to allow the cameras 11 of the multi-camera module 10 to be capable of taking a picture of an object 200 with different photographing angles in the direction of height.
  • the cameras 11 of the multi-camera modules 10 are lined up in the direction of height.
  • the uppermost multi-camera module 10 further includes one camera 11 as shown in FIGS.
  • This camera 11 set on the top of the multi-camera modules 10 has upward photographing direction so that the multi-camera modules 10 can carry out upward photographing as well as omni-directional photographing.
  • the images photographed by the cameras 11 of the multi-camera module 10 are grabbed by frames by the first frame grabbers 31.
  • the first frame grabbers 31 are respectively connected to the cameras 11 constructing each layer.
  • the images by frames grabbed by the first frame grabbers 31 are stored in the storage means 32 and, simultaneously, transmitted to an exposure calculator 33.
  • the storage means 32 that stores photographed images as digital data may be selected from digital storage devices including a hard disk, compact disk, magnetic tapes and memory.
  • the images sent to the exposure calculator 33 from the first frame grabbers 31 are analyzed by the exposure calculator 33, to thereby calculate proper exposure of each camera 11.
  • the calculated exposure is transmitted to an exposure signal generator 34 electrically connected to the exposure calculator 33.
  • the exposure signal generator 34 sends to each camera 11 a signal indicating the degree to which that camera 11 should expose, based on the calculated exposure.
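As an illustrative sketch of this exposure loop (not the disclosed algorithm itself; the target mean brightness, the time bounds, and the name `calculate_exposure` are assumptions of this example), a new exposure time and gain can be derived from the mean brightness of a grabbed frame:

```python
import numpy as np

# Assumed setpoints for this sketch: an 8-bit target mean brightness and
# exposure-time bounds (ms) chosen to limit motion blur while in motion.
TARGET_MEAN = 128.0
MIN_TIME_MS, MAX_TIME_MS = 0.1, 8.0

def calculate_exposure(frame, current_time_ms, current_gain=1.0):
    """Derive new (exposure_time_ms, gain) from a grabbed frame.

    Scales exposure time toward the target mean brightness; when a time
    bound is hit, the remainder is made up with analog gain, since the
    exposure signal described in the text carries gain, offset and
    exposure time.
    """
    mean = float(np.mean(frame))
    ratio = TARGET_MEAN / max(mean, 1.0)      # guard against a black frame
    new_time = current_time_ms * ratio
    gain = current_gain
    if new_time > MAX_TIME_MS:                # scene too dark: cap time, raise gain
        gain *= new_time / MAX_TIME_MS
        new_time = MAX_TIME_MS
    elif new_time < MIN_TIME_MS:              # scene too bright: floor time, cut gain
        gain *= new_time / MIN_TIME_MS
        new_time = MIN_TIME_MS
    return new_time, max(gain, 1.0)

# A uniformly dark frame (mean 32) asks for a 4x longer exposure.
dark = np.full((480, 640), 32, dtype=np.uint8)
new_time, new_gain = calculate_exposure(dark, current_time_ms=2.0)
```

Because each camera faces a different direction, this calculation would run per camera on its own grabbed frames.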
  • geographical information such as a photographing location and time, distance, and direction of each camera 11 can be obtained using a GPS sensor 20 capable of collecting location information from a satellite in real time.
  • a distance sensor 37a and direction sensor 37b can be further included to obtain the photographing distance and direction.
  • the GPS sensor 20 can receive location data from a satellite in real time to confirm the location information in real time.
  • a GPS signal from the satellite may be cut off when photographing is carried out in an area where the GPS signal can be blocked by obstacles such as high buildings, tunnels, forests and so on.
  • the annotation-entering unit 35 is electrically connected with the GPS sensor 20, distance sensor 37a and direction sensor 37b to receive geographical information data sensed by the sensors 20, 37a and 37b.
  • the annotation-entering unit 35 enters annotation corresponding to each frame to be stored in the storage means 32.
  • the annotation is photographing location, direction and photographing time of each frame of photographed images.
  • the images in which annotations are entered by frames are stored in the storage means 32.
  • the storage means 32 stores the images transmitted from the cameras 11 either after the cameras 11 photograph, or at the same time as the cameras 11 photograph and transmit them.
  • sensing operations of the sensors 20, 37a and 37b related with the storing and photographing operations, and the operations such as calculation and interchange of exposure information with respect to the camera 11 are carried out in relation to each other.
  • the photographing and storing operations and the operations related thereto start when the trigger signal generator 36 transmits the trigger signal to the exposure signal generator 34. Also, the trigger signal generator 36 is electrically connected to the distance sensor 37a and the annotation-entering unit 35.
  • the annotation entering unit 35 uses signals sensed by the distance sensor 37a and the direction sensor 37b to calculate location information.
  • the trigger signal of the trigger signal generator 36 can be temporarily blocked from the exposure signal generator 34 so that the image storing operation keeps pace with the image acquiring operation.
  • the storage means 32 further connects with an audio digital converter 38 or a video camera 39, so that a corresponding audio clip or video clip as an accessory data attaches to each image or group of images to be stored in the storage means 32.
  • the audio digital converter 38 converts an analog audio signal sensed by an audio sensor 38a into a digital signal to store it in the storage means 32 as digital data.
  • the video camera 39 takes a motion picture of the objects 200 at a photographing location or a photographing interval of a location, corresponding to photographed image or image groups. The motion pictures are grabbed by frames by a second frame grabber 39a to be stored in the storage means 32.
  • FIG.3 illustrates the system configuration of an omni-directional 3-dimension image data acquisition apparatus according to the second embodiment of the present invention.
  • the exposure calculator 33 calculates the exposure of each camera 11.
  • the calculated exposure information is transmitted to each camera 11 by the exposure signal generator 34.
  • light intensity sensors 33a are electrically connected to the exposure calculator 33 to sense light intensity around the photographing location or in front of the object 200 to be photographed.
  • a light intensity sensing signal transmitted from the light intensity sensor 33a is delivered to the exposure calculator 33 that calculates the proper exposure of each camera 11.
  • the calculated exposure is transmitted as a signal to each camera 11 through the exposure signal generator 34.
  • Each camera 11 controls exposure amount thereof based on the exposure signal.
  • FIG. 4 is a perspective view illustrating that the multi-camera module of the invention is set in a housing
  • FIG. 5 illustrates the first exemplary embodiment in which the omni-directional 3-dimension image data acquisition apparatus of the invention is mounted on a mobile means
  • FIG. 6 illustrates the second exemplary embodiment in which the omni-directional 3-dimension image data acquisition apparatus of the invention is mounted on a mobile means.
  • the multi-camera module 10 and computer vision system 30 are mounted on a mobile means 60 to be given a mobile function to photograph the object 200 while moving.
  • the multi-camera module 10 is set inside a specific housing 40 to protect its body and expose only the lens part to the outside.
  • the bottom of the housing 40 is supported by a jig 50 to be raised to a specific height, and the housing 40 is moved up and down by an elevator 70 set in the mobile means 60.
  • the mobile means 60 is preferably an automobile having a driving engine or a cart capable of being moved by the human power or self-propelled by its own power supply.
  • FIG. 7 is a flow diagram illustrating a method of acquiring 3-dimensional data according to the present invention.
  • a multi-camera module 10 includes a plurality of cameras 11 which are symmetrically arranged with respect to a specific point and whose optical centers are in the same horizontal plane.
  • When multi-camera modules 10 are stacked or displaced to construct multiple layers in the direction of height, the optical axes of the cameras 11 in each layer are coplanar, and the optical centers of the cameras 11 facing the same direction are aligned in the same line in the direction of height.
  • All the cameras 11 included in every multi-camera module 10 forming the multiple layers photograph or acquire multiple images of an object 200 according to the direction of height in Step S1100. Due to the height difference among the cameras 11, the images taken from each camera 11 also differ with regard to the same object 200.
  • the apparatus matches feature points of the same object 200 among the images in Step S1200, and extracts distance information of the images by measuring the distance of the feature points using geometric trigonometry in Step S1300. Then, the apparatus acquires 3-dimensional data based on the extracted distance information in Step S1400.
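The trigonometric step can be illustrated for a single vertically displaced camera pair. This is a hedged sketch under an assumed geometry (elevation angles measured at each optical center); the function and example values are not taken from the disclosure:

```python
import math

def depth_from_vertical_pair(baseline_m, elev_lower_rad, elev_upper_rad):
    """Horizontal distance to a feature point seen by two vertically
    displaced cameras, from the elevation angle measured at each
    optical center.  With the optical centers separated by baseline_m,
    the same point at horizontal distance d satisfies
        d * tan(elev_lower) - d * tan(elev_upper) = baseline_m,
    so d = baseline_m / (tan(elev_lower) - tan(elev_upper)).
    """
    denom = math.tan(elev_lower_rad) - math.tan(elev_upper_rad)
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; the point is effectively at infinity")
    return baseline_m / denom

# A feature 4 m away and 2 m above the lower camera, with a 0.5 m baseline:
d = depth_from_vertical_pair(
    0.5,
    math.atan2(2.0, 4.0),        # elevation seen from the lower camera
    math.atan2(2.0 - 0.5, 4.0),  # elevation seen from the upper camera
)
```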
  • FIG. 8 illustrates a method of extending dynamic range of camera system with images photographed according to the present invention.
  • each camera 11 has a limited dynamic range, such that if an area of the object 200 is brighter than the upper limit of the dynamic range, the area is recorded as white, and if an area of the object 200 is darker than the lower limit of the dynamic range, the area is recorded as black. Details of the image in that area are lost in both cases. If the cameras 11 record the same object 200 with different exposure amounts, then even though the brightness of an area of the object 200 exceeds the dynamic range of one camera 11, other cameras 11 with different exposure amounts may sense the area within their dynamic range. Each camera 11 can photograph different portions of the object 200 in detail, and a better image can be composed by selecting the better regions from the images.
  • FIG. 9 is a flow diagram illustrating a method of dynamic range extension according to the present invention.
  • each set or layer of the multi-camera module 10 has a plurality of cameras 11 which are symmetrically installed about a specific point in a plane so that the optical centers of the cameras 11 are in the same plane. Further, the viewing angle of each camera 11 is allocated by dividing 360° by the number of the cameras 11.
  • each camera 11, with a differently set exposure amount, photographs the object 200, and thereby multiple images of the same object 200 from the multiple cameras 11 are acquired in Step S2100.
  • A properly exposed region is selectively extracted from each of the photographed multiple images. Namely, the properly exposed region is selected from the clearly photographed image according to the white balance and the details recorded in the region in Step S2200.
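A simplified sketch of Steps S2100-S2200 might select, per pixel rather than per region, the best-exposed shot and normalize by exposure time; the function name and the mid-gray selection criterion are assumptions of this example, not the disclosed method:

```python
import numpy as np

def compose_extended_range(images, exposure_times):
    """Compose a dynamic-range-extended image from differently exposed
    shots of the same scene.  For each pixel, the shot whose value lies
    farthest from the clipped limits (nearest mid-gray) is selected,
    then values are normalized by exposure time to a common scale.
    """
    stack = np.stack([img.astype(np.float64) for img in images])
    badness = np.abs(stack - 128.0)           # large near 0 (black) or 255 (white)
    best = np.argmin(badness, axis=0)         # index of the best-exposed shot
    times = np.asarray(exposure_times, dtype=np.float64)
    radiance = stack / times[:, None, None]   # exposure-time normalization
    return np.take_along_axis(radiance, best[None], axis=0)[0]

# Two exposures: the short one keeps the bright pixel that the long one
# clips; the dark pixel is better recorded in the long exposure.
short = np.array([[250, 40]], dtype=np.uint8)
long_ = np.array([[255, 160]], dtype=np.uint8)
out = compose_extended_range([short, long_], exposure_times=[1.0, 4.0])
```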
  • FIG. 10 is a perspective view illustrating spherical coordinates set to the multi-camera module to apply image processing algorithm according to the present invention.
  • A wide-angle or omni-directional multi-camera module 10 with multiple cameras may have multiple optical centers.
  • Each camera 11 has its own optical center and effective viewing angle to cover a portion of the spherical coordinate.
  • In contrast, an omni-directional image sensor has a single optical center.
  • four distinct spherical coordinates are initially assumed and calibrated. Even though this example shows four cameras 11, it should be understood that any number of cameras 11 might be used without modifying the disclosed method. These spherical coordinates can be rotated in any direction to align with paired cameras 11 to apply epipolar geometry more effectively.
  • FIG. 11 is a plan view illustrating the upper part of the coordinates of FIG. 10
  • four cameras 11, each with an effective viewing angle of 120°, cover 360° of panorama with some overlapping viewing angles.
  • FIG. 12 is a perspective view illustrating two layers of spherical coordinates to apply image processing algorithm according to the present invention.
  • This vertical arrangement has advantages in capturing a horizontal 360° panorama image, which is the natural viewing direction for humans, with depth information.
  • the spherical coordinates are set so that their axes and the connecting line between the optical centers of the paired two cameras 11 in the direction of height become collinear.
  • This spherical coordinate arrangement has an advantage in applying epipolar geometry for searching corresponding points between two images. The longitudinal lines of said spherical coordinates become the epipolar lines for the two images taken from the paired two cameras 11.
  • FIG. 13 illustrates a panorama stitching principle by cylindrical projection according to the invention
  • FIG. 14 illustrates a panorama stitching principle by spherical projection according to the invention.
  • FIGS. 13 and 14 show exemplary panorama image generation by cylindrical projection or spherical projection from a hexagonal collection of images. Cylindrical or spherical images are mapped onto the surface of a cylinder or sphere, and the cylinder or sphere is then presented to the user as if observed from its center through the window of the viewer software on a computer monitor. Dotted lines in FIG. 13 show the coverage of the projection from the optical center of each camera 11.
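The cylindrical mapping can be sketched with the standard pinhole-to-cylinder warp; this is an assumed formulation for illustration (the function `to_cylinder` and its parameters are not from the disclosure):

```python
import math

def to_cylinder(x, y, f):
    """Map a pixel (x, y) of an ideal pinhole camera, measured from the
    principal point with focal length f (in pixels), onto cylindrical
    panorama coordinates (theta, h): theta is the azimuth around the
    cylinder axis through the optical center, and h is the height on a
    unit-radius cylinder.
    """
    theta = math.atan2(x, f)          # azimuth of the viewing ray
    h = y / math.hypot(x, f)          # height projected to unit radius
    return theta, h

# The principal point maps to the cylinder origin, and a pixel at x = f
# lands 45 degrees around the cylinder.
t0, h0 = to_cylinder(0.0, 0.0, f=500.0)
t45, _ = to_cylinder(500.0, 0.0, f=500.0)
```

Warping each camera's image this way and blending the overlapping azimuth ranges yields the stitched panorama of FIG. 13.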
  • FIG. 15 is a flow diagram illustrating a process of acquiring 3-dimensional depth data using epipolar geometry by the omni-directional 3-dimension image data acquisition apparatus according to the invention.
  • FIG. 15 shows steps of image data acquisition to 3-dimensional data extraction using the spherical coordinate arrangement and applying epipolar geometry.
  • This method comprises the steps of: a spherical coordinate is assumed whose center is set to the projection center of the camera 11 and whose axis is collinear with the optical axis of the camera 11, in Step S100.
  • A mapping table or function between points on the spherical coordinate and points on the image plane of the camera 11 is found by either empirical or mathematical calibration in Step S200.
  • Rotate said spherical coordinates of those spheres so that their axes and the connecting line between the optical projection centers of the paired cameras 11 become collinear, and longitude 0° of the two spherical coordinates points in the same direction, in Step S500. Apply epipolar geometry to correlate points between said two images in Step S600.
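Step S500 rests on rotating a spherical coordinate so that its axis coincides with the camera baseline. A minimal sketch of such a rotation, using Rodrigues' formula (an assumed implementation, not the patent's own):

```python
import numpy as np

def rotation_to_align(axis_from, axis_to):
    """Rotation matrix taking unit vector axis_from onto axis_to
    (Rodrigues' formula).  Applying it to every viewing ray re-poles a
    spherical coordinate so that its axis lies along the camera baseline.
    """
    a = np.asarray(axis_from, float); a = a / np.linalg.norm(a)
    b = np.asarray(axis_to, float);   b = b / np.linalg.norm(b)
    v = np.cross(a, b)                       # rotation axis (unnormalized)
    c = float(np.dot(a, b))                  # cosine of the rotation angle
    if np.allclose(v, 0.0):
        # aligned; the antiparallel case is handled crudely in this sketch
        return np.eye(3) if c > 0 else -np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / float(np.dot(v, v)))

# Rotate the optical axis (z) onto a vertical baseline (y): points keep
# their positions, but their latitude/longitude are re-expressed so that
# the paired images share longitude (epipolar) lines.
R = rotation_to_align([0.0, 0.0, 1.0], [0.0, 1.0, 0.0])
p = R @ np.array([0.0, 0.0, 1.0])
```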
  • FIG. 16 illustrates the first configuration for showing the principle of obtaining 3-dimensional depth information using spherical coordinates according to the present invention.
  • FIG. 17 illustrates the second configuration for showing the principle of obtaining 3-dimensional depth information using rotated and aligned spherical coordinates according to the present invention. Namely, FIGS. 16 and 17 illustrate the principle of extracting 3-dimensional depth information from two images taken from two cameras 11.
  • The spherical coordinates S1, S2 are set for the two cameras 11: the center of each spherical coordinate is the optical center of its camera 11, the axis of the spherical coordinate is collinear with the optical axis of the camera 11, and a mapping look-up table of each camera 11 is calibrated for that spherical coordinate.
  • In general, the epipolar planes between the pair of cameras 11 are not coplanar with the longitudinal lines of those spherical coordinates.
  • Therefore, coordinate rotations of the spherical coordinates are performed. A spherical coordinate can be rotated by applying a rotation function to each point; each point on the old coordinate is mapped one-to-one to a point on the new coordinate.
  • FIG. 17 illustrates new spherical coordinates set to each camera 11 by coordinate rotation: their centers of coordinates are the optical centers C1 and C2 respectively, the epipolar planes and longitudinal lines are coplanar, and the 0° longitudes of the two spherical coordinates are aligned in the same direction.
  • FIG. 18 illustrates a spherical image representation in the form of a 2-dimensional data structure in latitude and longitude according to the present invention. Namely, FIG. 18 illustrates an exemplary 2-D representation of two spherical images.
  • X-axis represents longitude and Y-axis represents latitude.
  • Longitudinal lines appear as straight lines parallel to the Y-axis. This representation can be directly mapped to a two-dimensional array for fast computation.
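As a sketch, the mapping from (longitude, latitude) to indices of such an array is the usual equirectangular one; the array shape and rounding convention below are my own illustrative choices:

```python
import math

def sphere_to_index(lon, lat, width, height):
    """Map (longitude, latitude) in radians onto (row, col) of a
    height x width equirectangular array: X (columns) is longitude,
    Y (rows) is latitude.  A whole longitude line lands in a single
    column, so an epipolar search scans just one column."""
    col = int((lon + math.pi) / (2.0 * math.pi) * (width - 1) + 0.5)
    row = int((math.pi / 2.0 - lat) / math.pi * (height - 1) + 0.5)
    return row, col
```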
  • FIG. 19 illustrates the first configuration for showing the principle of obtaining 3-dimensional data acquisition with two sets of multiple camera module 10 of four cameras 11 in horizontal arrangement or horizontal displacement according to the present invention.
  • To determine 3-dimensional data of object 200 A, cameras 11 S1 and S5 are used. The spherical coordinates of cameras 11 S1 and S5 are properly rotated to align with the baseline that contains the centers of projection of cameras 11 S1 and S5.
  • Cameras 11 S3 and S7 are used to determine 3-dimensional data of object 200 B. The spherical coordinates of cameras 11 S3 and S7 are properly rotated to align with the baseline that contains the centers of projection of cameras 11 S3 and S7. If the object 200 A moves in relation to the multi camera module 10, so that position A moves to another position A', it falls out of the coverage of camera 11 S1. In this case, a new pair of cameras 11 S2 and S5 can be used to trace the movement of the object 200 A, so that object 200 tracking can be seamless. This is another novel advantage of this omni-directional camera system with multi camera module 10.
  • FIG. 20 illustrates the second configuration for showing the principle of obtaining 3-dimensional data acquisition with 2 layers of multi camera module 10 set in vertical arrangement or vertical displacement according to the present invention. Here the omni-directional camera system comprises two sets or layers of multi camera module 10, each of which includes four cameras 11, in vertical arrangement or displacement. The baseline and the distance between two cameras 11 are fixed. Spherical coordinates are set so that their centers are the optical projection centers of each camera 11 and the epipolar planes and longitudinal lines are coplanar. This coordinate arrangement has an advantage in epipolar line searching because the longitudinal lines become the epipolar lines. Depth information to object 200 A can be calculated from the difference in photographing angle of object 200 A between the two spherical coordinates.
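The depth computation from the photographing-angle difference reduces to triangulation along the vertical baseline. A hedged sketch; the elevation-angle convention and function name are my assumptions:

```python
import math

def depth_from_elevation(baseline, alpha_top, alpha_bottom):
    """Horizontal distance to a point seen by two vertically separated
    cameras a fixed baseline apart, given the elevation angles (radians,
    measured up from horizontal) at the top and bottom cameras.  The
    point's height above each camera differs by exactly the baseline:
        d * tan(alpha_bottom) - d * tan(alpha_top) = baseline."""
    return baseline / (math.tan(alpha_bottom) - math.tan(alpha_top))
```

For example, with a 1 m baseline and elevation angles atan(0.4) and atan(0.5), the point lies 10 m away horizontally.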
  • FIG. 21 illustrates the third configuration for showing the principle of obtaining 3-dimensional data acquisition with 3 layers of multi camera module 10 set in vertical arrangement or vertical displacement according to the present invention.
  • the omni-directional camera system comprises three sets or layers of multi camera module 10 each of which includes four cameras 11 in vertical arrangement or displacement.
  • The method of calculating distance is the same as in the two-camera 11 case, but three cameras 11 provide more accurate distance information because the disparity between two adjacent images can be smaller while the longest baseline distance can be the same as or bigger than in a two-camera system.
  • FIG. 22 illustrates the fourth configuration for showing the principle of obtaining 3-dimensional data acquisition with 6 layers of multi camera module set in vertical arrangement or vertical displacement according to the present invention.
  • Here the omni-directional camera system comprises six sets of multi camera module 10.
  • The system can thus increase the maximum baseline distance without increasing the disparity between adjacent images. Also, images from the multiple cameras 11 produce an optical flow of an object 200 that can be used to determine the depth of the object 200.
  • FIGS. 23 to 27 illustrate 3-dimensional sensing principle from optical flow acquired by array of cameras 11.
  • FIG. 23 illustrates exemplary setting of the multi-camera module to explain 3- dimensional data acquisition using optical flow according to the present invention.
  • FIG. 23 shows exemplary setting of objects 200 and cameras 11 to explain the principle.
  • The cameras 11 are arranged linearly, their poses are the same, and their optical centers reside on the same baseline.
  • FIG. 24 illustrates specific points to which an object corresponds when the multiple camera module photographs the object according to the present invention.
  • FIG. 24 illustrates viewing direction of several feature points on the objects 200 from each camera 11. For example, viewing direction to point a from cameras 11 A to F changes gradually from right to left.
  • FIG. 25 illustrates optical flow by images acquired by the multiple camera module according to the present invention.
  • FIG. 25 illustrates optical flow of objects 200 on images taken from cameras 11 A to F. Images taken from cameras 11 A and F are quite different, but intermediate images taken from cameras 11 B to E show optical flow of objects from cameras 11 A to F.
  • FIG. 26 illustrates epipolar planes between photographed objects and set of the multiple camera modules according to the present invention. Namely, FIG. 26 illustrates several epipolar planes for the array of cameras 11.
  • FIG. 27 illustrates the depth effect on the slope of optical flow by the multiple camera module with respect to an epipolar plane according to the present invention. Namely, FIG. 27 illustrates an exemplary optical flow for an epipolar plane M from images taken from cameras 11 A to F to show the effect of distance of objects 200 to the optical flow.
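The depth effect on flow slope can be sketched with a toy bearing model: cameras on one baseline see a fixed point under bearings whose rate of change per unit of camera displacement shrinks with depth, so distant objects trace shallower lines in the epipolar-plane image. The function and its conventions are illustrative, not from the patent:

```python
import math

def flow_slope(cam_positions, point_x, point_z):
    """Bearing (radians) of a feature at lateral offset point_x and
    depth point_z, seen from each camera position along a common
    baseline.  The bearing change between adjacent cameras -- the slope
    of the feature's track in the epipolar-plane image -- is roughly
    spacing/depth, so near objects produce steeper optical flow."""
    return [math.atan2(point_x - c, point_z) for c in cam_positions]
```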
  • FIG. 28 is a perspective view illustrating a principle of acquiring 3-dimensional data for a specific feature point by displacement of the omni-directional 3-dimension image data acquisition apparatus of the present invention. Namely, FIG. 28 illustrates the 3-dimensional data acquisition principle in a mobile omni-directional camera system. For example, this mobile system can be used to acquire spatial information for urban 3-dimensional modeling. The displacement of the omni-directional camera system can be determined by counting wheel rotations and direction changes of the vehicle.
  • The change in the viewing direction of the tracked feature point, together with the displacement of the camera system between two locations, determines the location of the feature point by trigonometry. The spatial location of the tracked feature point can also be determined from the disparity in the images of the vertically arranged cameras 11. These redundant sources of depth information increase the accuracy of 3-dimensional data acquisition.
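The trigonometric step can be sketched as a 2-D ray intersection between the two vehicle locations given by wheel odometry; the function name and the planar simplification are mine:

```python
import math

def triangulate(p1, b1, p2, b2):
    """Locate a tracked feature from two camera-system positions p1, p2
    (e.g. obtained by counting wheel rotations) and the bearings b1, b2
    (radians from the x-axis) under which the feature was observed."""
    d1 = (math.cos(b1), math.sin(b1))
    d2 = (math.cos(b2), math.sin(b2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 by Cramer's rule.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        raise ValueError("parallel bearings: feature effectively at infinity")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return p1[0] + t1 * d1[0], p1[1] + t1 * d1[1]
```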
  • FIG. 29 illustrates a principle of feature tracking of an object when the omni-directional 3-dimension image data acquisition apparatus of the invention photographs the object while moving. Namely, FIG. 29 illustrates the principle of 3-dimensional data acquisition in an urban environment.
  • Urban structures mostly consist of vertical rectangular planes, and their lines are mostly vertical or horizontal.
  • Once a rectangular plane is identified at one location, it can be traced along with the displacement of the mobile omni-directional camera system. This is a faster method than tracing each feature point.
  • Once the location and pose of a rectangular plane are determined, this information can also be used to determine the camera 11 locations and poses for other images that include the rectangular plane.
  • The number of cameras 11 installed on one multi camera module 10 is variable according to their volume, even though the number of cameras 11 in the preferred embodiments is 4 to 8.
  • The number of sets or layers of the multi camera module 10 is variable as the user desires. Namely, the multi camera module 10 can be vertically stacked or displaced to form one or more layers. Therefore, the method and apparatus for omni-directional image and 3-dimensional data acquisition can omni-directionally photograph one or more objects and acquire 3-dimensional image data thereof.
  • Moreover, the method and apparatus can extend the dynamic range of the image by composing the image data acquired from a plurality of cameras 11 of one or more multi camera modules 10 that have different exposure amounts from each other.
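A minimal sketch of this exposure composition for one co-registered pixel, assuming a linear camera response; the clipping thresholds and the averaging rule are illustrative choices, not the patent's procedure:

```python
def fuse_exposures(pixels, exposures, lo=5, hi=250):
    """Fuse samples of one scene point taken at different exposure
    amounts into a single relative-radiance value: divide each sample
    by its exposure and average, discarding samples near the noise
    floor (lo) or saturation (hi).  Well-exposed samples from any of
    the differently exposed cameras thus survive into the result."""
    usable = [p / e for p, e in zip(pixels, exposures) if lo <= p <= hi]
    if not usable:  # every sample clipped: fall back to the smallest estimate
        return min(p / e for p, e in zip(pixels, exposures))
    return sum(usable) / len(usable)
```

A sample saturated in the longer exposure is simply replaced by the scaled value from the shorter one, which is how composing differently exposed cameras widens the usable range.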
  • In addition, the method and apparatus associate the acquired image data with annotation data, such as a photographing location and a photographing time, and generate geographical information which may be connected with geographical information generated in another geographical information system database and which may be used in other geographical information systems.
  • Furthermore, the apparatus can be installed on a mobile means so that it can continuously photograph an object while moving.
  • The foregoing embodiments are merely exemplary and are not to be construed as limiting the present invention.
  • The present teachings can be readily applied to other types of apparatuses.
  • The description of the present invention is intended to be illustrative, and not to limit the scope of the claims. Many alternatives, modifications, and variations will be apparent to those skilled in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

The invention relates to a method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation, and to a dynamic range extension method, for photographing an object, acquiring 3-dimensional images photographed by cameras each having a different exposure amount in connection with the height of the object, extending the dynamic range, and generating geographical information by entering annotations such as the photographing location and time into the photographed images, which information can be connected to another geographical information system database. The apparatus comprises one or more multi-camera modules (10) stacked to form multiple layers in height for acquiring 3-dimensional images and extending the dynamic range of the 3-dimensional images, each multi-camera module (10) comprising several cameras (11) arranged symmetrically about a specific point on a plane. In addition, the multi-camera modules of the apparatus are connected to a computer vision system (30) that controls photographing and stores (32) the photographed images, and they can be mounted on mobile means.
PCT/KR2002/000223 2001-02-09 2002-02-09 Procede et appareil d'acquisition d'images omnidirectionnelles et de donnees tridimensionnelles avec annotation de donnees et procede d'extension de gamme dynamique WO2002065786A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002565367A JP2004531113A (ja) 2001-02-09 2002-02-09 注釈記入による全方位性3次元イメージデータ獲得装置、その方法及び感光範囲の拡大方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26776101P 2001-02-09 2001-02-09
US60/267,761 2001-02-09

Publications (1)

Publication Number Publication Date
WO2002065786A1 true WO2002065786A1 (fr) 2002-08-22

Family

ID=23020026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2002/000223 WO2002065786A1 (fr) 2001-02-09 2002-02-09 Procede et appareil d'acquisition d'images omnidirectionnelles et de donnees tridimensionnelles avec annotation de donnees et procede d'extension de gamme dynamique

Country Status (4)

Country Link
JP (1) JP2004531113A (fr)
KR (1) KR100591144B1 (fr)
CN (1) CN1531826A (fr)
WO (1) WO2002065786A1 (fr)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003083773A2 (fr) * 2002-03-27 2003-10-09 The Trustees Of Columbia University In The City Of New York Procede et systeme d'imagerie
WO2004015374A1 (fr) * 2002-08-09 2004-02-19 Surveylab Group Limited Instrument mobile, dispositif de visionnement et procedes de traitement et de stockage de donnees
EP1422660A2 (fr) * 2002-11-22 2004-05-26 Eastman Kodak Company Procédé et dispositif de production d'images utilisant une plage étendue pour la composition d'une image panoramique
WO2004109385A2 (fr) 2003-06-03 2004-12-16 Steuart Leonard P Iii Systeme de cameras 3d/360° numeriques
CN100574379C (zh) * 2003-10-28 2009-12-23 皇家飞利浦电子股份有限公司 具有全景摄影或镶嵌功能的数字照相机
WO2011143622A3 (fr) * 2010-05-13 2012-01-05 Google Inc. Acquisition sous-marine d'imagerie pour cartographier des environnements en 3d
WO2012056437A1 (fr) 2010-10-29 2012-05-03 École Polytechnique Fédérale De Lausanne (Epfl) Système de réseau de capteurs omnidirectionnels
CN102510474A (zh) * 2011-10-19 2012-06-20 中国科学院宁波材料技术与工程研究所 一种360度全景监控系统
EP2569951A1 (fr) * 2010-05-14 2013-03-20 Hewlett-Packard Development Company, L.P. Système et procédé de capture vidéo à points de vue multiples
EP2741138A1 (fr) * 2012-12-07 2014-06-11 Kabushiki Kaisha Topcon Caméra omnidirectionnelle avec GPS
CN104052966A (zh) * 2013-03-11 2014-09-17 玛珂系统分析和开发有限公司 用于确定位置的方法和装置
US9733080B2 (en) 2009-10-02 2017-08-15 Kabushiki Kaisha Topcon Wide-angle image pickup unit and measuring device
US11703820B2 (en) 2020-06-08 2023-07-18 Xiamen University Of Technology Monitoring management and control system based on panoramic big data
EP4138042A4 (fr) * 2021-05-31 2024-06-05 3I Inc. Procédé de fourniture de contenu d'espace intérieur virtuel et son serveur

Families Citing this family (20)

Publication number Priority date Publication date Assignee Title
JP4551990B2 (ja) * 2005-02-03 2010-09-29 名古屋市 パノラマ映像作成方法と作成装置
WO2007124664A1 (fr) * 2006-04-29 2007-11-08 Shanghai Jietu Software Co., Ltd. Appareil et procédé permettant d'obtenir une représentation panoramique contenant des informations de position et procédé de création, d'annotation et d'affichage d'un service de cartographie électrique panoramique
US8341112B2 (en) * 2006-05-19 2012-12-25 Microsoft Corporation Annotation by search
JP4942177B2 (ja) * 2006-11-20 2012-05-30 キヤノン株式会社 情報処理装置及びその制御方法、プログラム
KR100955483B1 (ko) * 2008-08-12 2010-04-30 삼성전자주식회사 3차원 격자 지도 작성 방법 및 이를 이용한 자동 주행 장치의 제어 방법
KR101018233B1 (ko) * 2009-03-31 2011-02-28 강희민 에이치디알 영상 생성 장치 및 방법
JP6511221B2 (ja) 2010-08-26 2019-05-15 グーグル エルエルシー 入力テキスト文字列の変換
CN102663624A (zh) * 2012-04-19 2012-09-12 太仓欧卡网络服务有限公司 一种应用于电子商务的服装展示方法
US9843789B2 (en) * 2014-09-08 2017-12-12 Panasonic Intellectual Property Management Co., Ltd. Still-image extracting method and image processing device for implementing the same
WO2017149875A1 (fr) * 2016-02-29 2017-09-08 ソニー株式会社 Dispositif de commande de capture d'image, dispositif de capture d'image et procédé de commande de capture d'image
CN105787988B (zh) * 2016-03-21 2021-04-13 联想(北京)有限公司 一种信息处理方法、服务器及终端设备
US10200624B2 (en) * 2016-04-06 2019-02-05 Facebook, Inc. Three-dimensional, 360-degree virtual reality exposure control
DE102017118714A1 (de) * 2016-08-17 2018-02-22 Google Inc. Mehrstufiges Kameraträgersystem für die stereoskope Bildaufnahme
KR101843025B1 (ko) 2016-12-30 2018-03-28 (주)잼투고 카메라워크 기반 영상합성 시스템 및 영상합성방법
CN109708655A (zh) * 2018-12-29 2019-05-03 百度在线网络技术(北京)有限公司 导航方法、装置、车辆及计算机可读存储介质
KR102617222B1 (ko) * 2019-12-24 2023-12-26 주식회사 쓰리아이 전방위 화상정보 기반의 자동위상 매핑 처리 방법 및 그 시스템
KR102483388B1 (ko) * 2020-07-31 2022-12-30 주식회사 쓰리아이 전방위 이미지 처리 방법 및 이를 수행하는 서버
WO2022255546A1 (fr) * 2021-05-31 2022-12-08 주식회사 쓰리아이 Procédé de fourniture de contenu d'espace intérieur virtuel et son serveur
KR102600421B1 (ko) * 2021-05-31 2023-11-09 주식회사 쓰리아이 가상 실내공간 컨텐츠 제공 방법 및 그를 위한 서버
WO2024031245A1 (fr) * 2022-08-08 2024-02-15 北京原创力科技有限公司 Procédé et système de prise de vue vidéo synchrone basée sur un réseau de caméras

Citations (2)

Publication number Priority date Publication date Assignee Title
JPH0818858A (ja) * 1994-07-01 1996-01-19 Matsushita Electric Ind Co Ltd 画像合成装置
US6141034A (en) * 1995-12-15 2000-10-31 Immersive Media Co. Immersive imaging method and apparatus


Cited By (36)

Publication number Priority date Publication date Assignee Title
US7495699B2 (en) 2002-03-27 2009-02-24 The Trustees Of Columbia University In The City Of New York Imaging method and system
WO2003083773A2 (fr) * 2002-03-27 2003-10-09 The Trustees Of Columbia University In The City Of New York Procede et systeme d'imagerie
WO2003083773A3 (fr) * 2002-03-27 2004-03-11 Univ Columbia Procede et systeme d'imagerie
US7647197B2 (en) 2002-08-09 2010-01-12 Surveylab Group Limited Mobile instrument, viewing device, and methods of processing and storing information
WO2004015374A1 (fr) * 2002-08-09 2004-02-19 Surveylab Group Limited Instrument mobile, dispositif de visionnement et procedes de traitement et de stockage de donnees
EP1422660A2 (fr) * 2002-11-22 2004-05-26 Eastman Kodak Company Procédé et dispositif de production d'images utilisant une plage étendue pour la composition d'une image panoramique
EP1422660A3 (fr) * 2002-11-22 2005-01-05 Eastman Kodak Company Procédé et dispositif de production d'images utilisant une plage étendue pour la composition d'une image panoramique
US10218903B2 (en) 2003-06-03 2019-02-26 Leonard P. Steuart, III Digital 3D/360 degree camera system
US10574888B2 (en) 2003-06-03 2020-02-25 Leonard P. Steuart, III Digital 3D/360 degree camera system
US7463280B2 (en) 2003-06-03 2008-12-09 Steuart Iii Leonard P Digital 3D/360 degree camera system
EP1639405A4 (fr) * 2003-06-03 2007-07-04 Leonard P Steuart Iii SYSTEME DE CAMERAS 3D/360o NUMERIQUES
US9124802B2 (en) 2003-06-03 2015-09-01 Leonard P. Steuart, III Digital 3D/360 degree camera system
US8937640B2 (en) 2003-06-03 2015-01-20 Leonard P. Steuart, III Digital 3D/360 degree camera system
JP2011160442A (ja) * 2003-06-03 2011-08-18 Leonard P Steuart Iii ディジタル3d/360度カメラシステム
US9706119B2 (en) 2003-06-03 2017-07-11 Leonard P. Steuart, III Digital 3D/360 degree camera system
US11012622B2 (en) 2003-06-03 2021-05-18 Leonard P. Steuart, III Digital 3D/360 degree camera system
JP2007525054A (ja) * 2003-06-03 2007-08-30 スチュアート,レオナード,ピー.,Iii ディジタル3d/360度カメラシステム
US8274550B2 (en) 2003-06-03 2012-09-25 Steuart Iii Leonard P Skip Digital 3D/360 degree camera system
EP1639405A2 (fr) * 2003-06-03 2006-03-29 Steuart III, Leonard P. SYSTEME DE CAMERAS 3D/360o NUMERIQUES
WO2004109385A2 (fr) 2003-06-03 2004-12-16 Steuart Leonard P Iii Systeme de cameras 3d/360° numeriques
US8659640B2 (en) 2003-06-03 2014-02-25 Leonard P. Steuart, III Digital 3D/360 ° camera system
CN100574379C (zh) * 2003-10-28 2009-12-23 皇家飞利浦电子股份有限公司 具有全景摄影或镶嵌功能的数字照相机
US9733080B2 (en) 2009-10-02 2017-08-15 Kabushiki Kaisha Topcon Wide-angle image pickup unit and measuring device
WO2011143622A3 (fr) * 2010-05-13 2012-01-05 Google Inc. Acquisition sous-marine d'imagerie pour cartographier des environnements en 3d
EP2569951A1 (fr) * 2010-05-14 2013-03-20 Hewlett-Packard Development Company, L.P. Système et procédé de capture vidéo à points de vue multiples
US9264695B2 (en) 2010-05-14 2016-02-16 Hewlett-Packard Development Company, L.P. System and method for multi-viewpoint video capture
EP2569951A4 (fr) * 2010-05-14 2014-08-27 Hewlett Packard Development Co Système et procédé de capture vidéo à points de vue multiples
US10362225B2 (en) 2010-10-29 2019-07-23 Ecole Polytechnique Federale De Lausanne (Epfl) Omnidirectional sensor array system
WO2012056437A1 (fr) 2010-10-29 2012-05-03 École Polytechnique Fédérale De Lausanne (Epfl) Système de réseau de capteurs omnidirectionnels
CN102510474B (zh) * 2011-10-19 2013-12-25 中国科学院宁波材料技术与工程研究所 一种360度全景监控系统
CN102510474A (zh) * 2011-10-19 2012-06-20 中国科学院宁波材料技术与工程研究所 一种360度全景监控系统
US9544476B2 (en) 2012-12-07 2017-01-10 Kabushiki Kaisha Topcon Omnidirectional camera
EP2741138A1 (fr) * 2012-12-07 2014-06-11 Kabushiki Kaisha Topcon Caméra omnidirectionnelle avec GPS
CN104052966A (zh) * 2013-03-11 2014-09-17 玛珂系统分析和开发有限公司 用于确定位置的方法和装置
US11703820B2 (en) 2020-06-08 2023-07-18 Xiamen University Of Technology Monitoring management and control system based on panoramic big data
EP4138042A4 (fr) * 2021-05-31 2024-06-05 3I Inc. Procédé de fourniture de contenu d'espace intérieur virtuel et son serveur

Also Published As

Publication number Publication date
KR20030078903A (ko) 2003-10-08
CN1531826A (zh) 2004-09-22
JP2004531113A (ja) 2004-10-07
KR100591144B1 (ko) 2006-06-19

Similar Documents

Publication Publication Date Title
US7126630B1 (en) Method and apparatus for omni-directional image and 3-dimensional data acquisition with data annotation and dynamic range extension method
WO2002065786A1 (fr) Procede et appareil d'acquisition d'images omnidirectionnelles et de donnees tridimensionnelles avec annotation de donnees et procede d'extension de gamme dynamique
CN110462686B (zh) 用于从场景获得深度信息的设备和方法
US8937640B2 (en) Digital 3D/360 degree camera system
Nayar Omnidirectional video camera
Tan et al. Multiview panoramic cameras using mirror pyramids
KR101800905B1 (ko) 복수의 검출기 어레이를 구비한 다해상도 디지털 대형 카메라
CN104506761B (zh) 一种360度全景立体摄像机
JP2004514951A (ja) 球面立体視撮影システム及びその方法
KR20060120052A (ko) 파노라마 또는 모자이크 기능을 구비한 디지털 카메라
CN1896684A (zh) 地理数据收集装置
US6839081B1 (en) Virtual image sensing and generating method and apparatus
CN104079916A (zh) 一种全景三维视觉传感器及使用方法
CN102831816B (zh) 一种提供实时场景地图的装置
JP2019533875A (ja) 道路の合成トップビュー画像を生成するための方法およびシステム
JP3328478B2 (ja) カメラシステム
US7839490B2 (en) Single-aperture passive rangefinder and method of determining a range
US20050030392A1 (en) Method for eliminating blooming streak of acquired image
CN211047088U (zh) 一种可定位的全景三维成像系统
JPH1023465A (ja) 撮像方法及び装置
US20080252746A1 (en) Method and apparatus for a hybrid wide area tracking system
TWI626603B (zh) 圖像獲取方法及圖像獲取裝置
JP2001177850A (ja) 画像信号記録装置および方法、画像信号再生方法並びに記録媒体
CN110726407A (zh) 一种定位监控方法及装置
KR100591167B1 (ko) 획득된 이미지의 번짐자국 제거방법

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1020037010262

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2002565367

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 028071050

Country of ref document: CN

WWP Wipo information: published in national office

Ref document number: 1020037010262

Country of ref document: KR

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
WWG Wipo information: grant in national office

Ref document number: 1020037010262

Country of ref document: KR