WO2008116288A1 - Divergence ratio distance mapping camera - Google Patents

Divergence ratio distance mapping camera

Info

Publication number
WO2008116288A1
PCT/CA2008/000502
Authority
WO
WIPO (PCT)
Prior art keywords
target objects
distance
dimensional information
obtaining
light sources
Prior art date
Application number
PCT/CA2008/000502
Other languages
English (en)
Inventor
Keigo Iizuka
Original Assignee
Keigo Iizuka
Priority date
Filing date
Publication date
Application filed by Keigo Iizuka filed Critical Keigo Iizuka
Priority to US12/532,644 (US8982191B2)
Publication of WO2008116288A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/254 Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/586 Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Definitions

  • the present invention relates to a method and system for detecting and mapping three- dimensional information pertaining to an object.
  • the invention relates to a method and system that makes use of the divergence of light over distance as a means of determining distance.
  • Distance mapping or depth mapping cameras have become ubiquitous in numerous fields such as robotics, machine vision for acquiring three-dimensional (3D) information about objects, intelligent transport systems for assisting driver safety and navigation, bioscience for detecting 3D laparoscopic images of internal organs, non-contact fingerprinting, and image manipulation in movie or television studios.
  • the triangulation method uses two or more images taken by strategically placed cameras to calculate the position of the target object.
  • the 3D information is obtained by synchronizing the movement of the light projection spot with the direction of the return path of the light scattered to the detector. This triangulation method is limited in that it is too slow and generally cannot support the real-time operation of a television camera.
  • the time of flight method makes use of the time required for a round trip of a laser beam using a phase or frequency modulated probe light.
  • heterodyne detection converts the phase or frequency information into the distance to the target object. While depth resolution can be within micrometers, time of flight methods can require on the order of minutes to provide a depth map of a target object.
  • Projection methods determine depth information from the patterns of configured light projected onto a target object.
  • the best known projection method is the moiré technique.
  • the moiré technique incorporates two grid patterns, projected before and after the surface distortion, to generate a moiré pattern of the deformed surface. While a moiré pattern can be readily generated, the corresponding distance calculations are not so readily performed. The distance is calculated in a manner similar to applying triangulation at every intersection of the pattern.
  • the AXI-VISION CAMERA™ method as described in US Patent 7,0165,519 B1 is based on a hybrid of the projection and time of flight methods.
  • the projecting light is temporally rather than spatially modulated.
  • an instantaneous time of flight pattern is captured using an ultra-fast shutter.
  • Distance is then calculated at every pixel, providing a picture quality comparable to that of High Definition Television (HDTV).
  • the AXI-VISION CAMERA™ method requires a large number of fast-response-time LEDs and a photomultiplier-based shutter, all of which are secured to the AXI-VISION CAMERA™.
  • the object of the present invention is to provide a method and device for detecting and mapping three-dimensional information pertaining to one or more target objects while further addressing the limitations of the prior art.
  • a method of obtaining three-dimensional information for one or more target objects comprises the steps of: (a) selecting one or more target objects; (b) illuminating the one or more target objects using a first light source at a distance x₁ from the one or more target objects, and capturing an image I₁ of the one or more target objects using at least one camera device; (c) illuminating the one or more target objects using a second light source at a distance x₂ from the one or more target objects, and capturing an image I₂ of the one or more target objects using the at least one camera device; and (d) calculating the distance x between the first and second light sources and the one or more target objects, based on the decay of the intensities of the light sources over the distances x₁ and x₂, using the ratio of the image intensities between the images I₁ and I₂.
  • a system for obtaining three-dimensional information for one or more target objects comprises: (a) at least two light sources, including a first light source at a distance x₁ from one or more target objects, and a second light source at a distance x₂ from the one or more target objects; and (b) at least one camera device linked to, or incorporating, at least one computer device, the camera device, or the camera device and computer device together, being operable to: (i) capture and store digital frame information, including capturing an image I₁ of the one or more target objects, illuminated by the first light source, and an image I₂ of the same one or more target objects, illuminated by the second light source; and (ii) calculate the distance x between the at least two light sources and the one or more target objects, based on the decay of the image intensities of the light sources over the distances x₁ and x₂, using the ratio of the image intensities between the images I₁ and I₂.
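For ideal point sources with inverse-square decay, step (d) reduces to inverting the intensity ratio at every pixel. The following Python sketch illustrates that calculation under those assumptions (equal source powers, 1/x² decay, the near source at r − d and the far source at r + d from the object); the function and array names are illustrative, not from the patent.

```python
import numpy as np

def distance_map(img1, img2, d, eps=1e-6):
    """Distance at every pixel from the divergence ratio of two images.

    img1: frame lit by the near source S1 (at r - d from the object)
    img2: frame lit by the far source S2 (at r + d from the object)
    d:    half the separation between the two point light sources

    Assumes equal source powers and 1/x**2 decay, so that
    R = I1/I2 = ((r + d)/(r - d))**2 and r = d*(sqrt(R) + 1)/(sqrt(R) - 1).
    """
    i1 = np.asarray(img1, dtype=np.float64)
    i2 = np.asarray(img2, dtype=np.float64)
    ratio = i1 / np.maximum(i2, eps)                 # R = I1/I2, pixel by pixel
    root = np.sqrt(np.clip(ratio, 1.0 + eps, None))  # sqrt(R) > 1: S1 is nearer
    return d * (root + 1.0) / (root - 1.0)           # distance to source midpoint
```

For example, with d = 0.1 m and a measured per-pixel ratio R = 4, the recovered distance is 0.1 × (√4 + 1)/(√4 − 1) = 0.3 m.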
  • Fig. 1a illustrates the distance mapping apparatus capturing an image I₁ of a target object using the first illuminating device as a light source.
  • Fig. 1b illustrates the distance mapping apparatus capturing an image I₂ of a target object using the second illuminating device as a light source.
  • Fig. 1c illustrates the amplitude ratio between I₁ and I₂.
  • Fig. 2 further illustrates the geometry of the divergence ratio distance mapping camera.
  • Fig. 3 is a graph that illustrates the relationship between the amplitude ratio R and the distance r/d.
  • Fig. 4 illustrates the ability to move the divergence ratio distance mapping camera to an arbitrary location.
  • Fig. 5 illustrates the double illuminator sets for eliminating shadows.
  • Fig. 6 illustrates a more detailed apparatus for the divergence ratio distance mapping camera.
  • Fig. 7a is an image taken with front illumination.
  • Fig. 7b is an image taken with back illumination.
  • Fig. 7c is a photograph of the object.
  • Fig. 7d is the measured depth profile of the object.
  • Fig. 1a illustrates the distance mapping apparatus 1 capturing an image I₁ of a target object 3 using a first illuminating device 5 as a light source.
  • the first illuminating device 5 illuminates the target object 3 and the camera device 7 captures an image I₁ that is stored by the system on a storage medium (see Fig. 6).
  • Fig. 1b illustrates the distance mapping apparatus 1 capturing an image I₂ of a target object 3 using a second illuminating device 9 as a light source.
  • the second illuminating device 9 illuminates the target object 3 and the camera device 7 captures an image I₂ that is stored by the system on a storage medium (see Fig. 6).
  • Fig. 1c illustrates the amplitude ratio between I₁ and I₂.
  • the present invention functions by comparing the relative image intensities between I₁ and I₂ on a pixel-by-pixel basis.
  • Fig. 1c shows a graph in which the relative image intensities of I₁ and I₂ have been plotted, providing the amplitude ratio.
  • Fig. 2 further illustrates the geometry of a divergence ratio distance mapping camera apparatus, in accordance with one aspect of the present invention.
  • the apparatus is set up in the following manner: a camera device 7 is at a distance r from the target object 3, a first illuminating device 5, labelled LED S₁, is at a distance x₁ = r − d from the target object 3, and a second illuminating device 9, labelled LED S₂, is at a distance x₂ = r + d from the target object 3.
  • the camera device 7 is also linked to, or incorporates, a computer device, such as a processor 11, which is operable to compute the distance to the target object 3 from the relative image intensities of I₁ and I₂.
  • the camera device 7 first captures an image I₁ of the target object 3 using the first illuminating device 5, LED S₁, as a light source.
  • this image I₁ is stored on a storage medium, such as a frame grabber 21 (see Fig. 6) of processor 11 (the frame grabber 21 being hardwired to processor 11 or incorporated into computer programming made accessible to processor 11).
  • the camera device 7 then captures an image I₂ of the target object 3 using the second illuminating device 9, LED S₂, as a light source.
  • this image I₂ is also stored on the storage medium, such as the frame grabber 21 of the processor 11.
  • the processor 11 is operable to compare the relative image intensities of I₁ and I₂ on a pixel-by-pixel basis. Before this comparison can be performed, the processor 11 calculates the image intensity of I₁, captured under illumination by the first illuminating device 5, LED S₁, as well as the image intensity of I₂, captured under illumination by the second illuminating device 9, LED S₂. The calculated pixel distance information is stored to the storage medium.
  • the image intensity calculation is further utilized to calculate the distance x between the first and second light sources and the one or more target objects.
  • x is calculated as the distance between the one or more target objects and the midpoint of the first and second light sources.
  • x may be specifically calculated based on the decay of light intensity over the distances x₁ and x₂, using the ratio of the image intensities between the images I₁ and I₂.
  • the decay of light intensity over a distance x follows 1/xⁿ, where n can be any positive or negative real number, including a non-integer.
  • the first image I₁ and the second image I₂ are stored on a storage medium known to individuals skilled in the art; the distance x between the midpoint of the two light sources and the one or more target objects is calculated by analyzing the images I₁ and I₂ on a pixel-by-pixel basis; and the calculated pixel distance information is stored using a known coordinate storage medium.
  • the pair of light sources 5, 9 that are used are infrared point light sources. It is commonly known to those skilled in the art that the intensity of a point light source decays with the square of the distance, due to the divergence property of light. Therefore the intensity of the light from the illuminating device 5, LED S₁, arriving at the target object 3, which is located at a distance r from the camera device 7, is proportional to P₀/(r − d)²,
  • where P₀ is the power of the point source of the first light source 5, LED S₁.
  • the target object 3 reflects light back towards the camera device 7.
  • the amount of the reflection is characterized by the radar term 'backscattering cross-section', σ.
  • the light power associated with the backscattering toward the camera device 7 is therefore proportional to σP₀/(r − d)², spread again over the return distance r to the camera.
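Assuming ideal point sources of equal power and the inverse-square decay just described, the relations implied by the preceding paragraphs can be reconstructed as follows; constant factors are omitted, and the exact form and numbering of Eq. (7) are assumptions rather than a quotation of the patent:

```latex
% Illumination arriving at the object from each source (inverse-square decay):
I_{S_1} \propto \frac{P_0}{(r-d)^2}, \qquad I_{S_2} \propto \frac{P_0}{(r+d)^2}

% Backscattered power reaching the camera, with common factor \sigma / r^2:
P_1 \propto \frac{\sigma P_0}{(r-d)^2\, r^2}, \qquad
P_2 \propto \frac{\sigma P_0}{(r+d)^2\, r^2}

% The ratio cancels P_0, \sigma and the return path r^2, leaving geometry only:
R = \frac{I_1}{I_2} = \left(\frac{r+d}{r-d}\right)^{2}
\quad\Longrightarrow\quad
\frac{r}{d} = \frac{\sqrt{R}+1}{\sqrt{R}-1}

% For a general decay law 1/x^n, the same inversion gives
% r/d = \left(R^{1/n} + 1\right) / \left(R^{1/n} - 1\right).
```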
  • the processor 11 determines the distance at each pixel and is operable to store the information in a coordinate storage medium as a distance map for the target object 3, in a manner that is known to those skilled in the art.
  • This distance map for the target object 3 contains all of the pixel positional information of the target object 3.
  • Fig. 3 is a plot that illustrates the relationship between amplitude ratio R and the distance r/d.
  • the sensitivity of the measurement is best near the origin and decreases as the asymptote is approached. It is also interesting to note that Eq. (7) can be rewritten to give r/d directly in terms of the amplitude ratio R, as in the inversion sketched above.
  • Fig. 4 illustrates the ability to move the divergence ratio distance mapping camera or camera device 7 to an arbitrary location.
  • the ratio distance mapping camera apparatus was previously described with the camera device 7 in line with the two illuminating devices: first light source 5 LED S₁ and second light source 9 LED S₂.
  • the camera device 7 may be placed in an arbitrary location; the actual distance being measured is that between the target object 3 and the center of the two illuminating devices (first light source 5 LED S₁ and second light source 9 LED S₂).
  • as depicted in Fig. 4, the position of the camera device 7 has been relocated from the center of the two illuminating devices (first light source 5 LED S₁ and second light source 9 LED S₂) to an arbitrary location (x₁, z₁) in the x-z plane.
  • the target object 3 is located along the z-axis at coordinate (0,z).
  • the separation between the two point light sources is kept constant at 2d, as before.
  • the distance measured is always along this z-axis, between the target object 3 and the center of the two illuminating devices (first light source 5 LED S₁ and second light source 9 LED S₂).
  • This ability to position the camera independently of the orientation of the light sources provides a considerable operational advantage that could be readily incorporated into different embodiments and arrangements of the present invention. For example, if the LEDs are installed on the studio ceiling or wall, the hand-held camera does not have to bear any additional weight or attachment.
  • Fig. 5 illustrates the double illuminator sets for eliminating shadows.
  • if the camera device 7 is positioned too far away from the line connecting the two point sources of light (first light source 5 LED S₁ and second light source 9 LED S₂), shadows may be incorporated into the distance map.
  • the shadow is an undesirable image product and may corrupt the accuracy of the distance map.
  • Fig. 5 demonstrates an embodiment of the present invention wherein two sets of LEDs are used to illuminate the target object 3, in this case an overturned cup: illuminator set 1 (13) at distances x₁ and x₃ from the target object, and illuminator set 2 (15) at distances x₂ and x₄ from the target object.
  • each of the illuminator sets 13, 15 casts its own specific shadow (see shadow 17 of set 1 and shadow 19 of set 2).
  • the pair of shadows 17, 19 can be reduced and the corresponding distance map of the overturned cup target object 3 improved.
  • All of the light sources applied may be of the same type.
  • Fig. 5 further demonstrates an embodiment of the invention permitting the capture of additional images, corresponding to the additional light sources.
  • a calculation of the set of distances x′ between the third and fourth light sources and the one or more target objects, based on the decay of the intensities of the light sources over the distances x₃ and x₄, may be performed using the ratio of the image intensities between the images I₃ and I₄ on a pixel-by-pixel basis.
  • the final distance map for the overturned-cup target object 3 is actually a merging of the distance map developed by the first illuminator set 13 with the distance map developed by the second illuminator set 15.
  • the two derived distance maps are compared on a pixel-by-pixel basis and an appropriate pixel is selected by comparison. The comparison is possible because the relative position of the camera device 7 and the target object 3 has not changed between the two distance maps, so a simple merging step, familiar to individuals skilled in the art, is sufficient to combine the two distance maps into a final distance map.
  • This final distance map generally minimizes the effect of shadows on the pixel positioning to provide a more exact result.
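The patent leaves the pixel-selection rule of this merging step to the skilled reader. One plausible criterion, sketched below purely as an illustration, is to keep at each pixel the distance estimate whose underlying images are better lit, on the assumption that the dimmer image pair is the shadowed one; the function and its signature are hypothetical.

```python
import numpy as np

def merge_distance_maps(map_a, map_b, pair_a, pair_b):
    """Merge distance maps taken with two different illuminator sets.

    map_a, map_b:   distance maps computed from illuminator sets 1 and 2
    pair_a, pair_b: the (I1, I2) and (I3, I4) image pairs behind each map

    Keeps, at each pixel, the estimate whose images carry more light,
    assuming the dimmer pair is the one whose view lies in shadow.
    """
    lit_a = np.minimum(pair_a[0], pair_a[1])  # a shadowed pixel is dark in its pair
    lit_b = np.minimum(pair_b[0], pair_b[1])
    return np.where(lit_a >= lit_b, map_a, map_b)
```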
  • Fig. 6 illustrates a more detailed apparatus for the divergence ratio distance mapping camera.
  • the more detailed apparatus comprises the camera device 7 connected to a computer device including a frame grabber 21 (part of the processing unit 11), which is also connected to a video sync separator 23, which in turn is connected to a video microcontroller 25 that controls the front 27 and back 29 LED drivers driving the pair of illuminating devices, i.e. the front light source 5 LED S₁ and the back light source 9 LED S₂.
  • the video microcontroller 25 may be connected to a monitor display 31.
  • a distance mapping system includes: one or more target objects; at least one camera device; and at least two light sources.
  • the at least one computer device is linked to the camera and is operable to capture digital frame information and to calculate the distance x between the center of the light sources and the one or more target objects according to the method of the present invention.
  • the composite video signal out of an infrared camera device 7 was used to synchronize the timing of the front and back infrared illuminating devices 5, 9.
  • the composite video signal is fed into a video sync separator 23, which extracts the vertical sync pulse and also provides the odd/even field information. This output from the sync separator is provided to the video microcontroller 25.
  • the video microcontroller 25 is operable to signal the front LED 5 to illuminate when the camera device 7 is in the even field, and an image I₁ is captured and stored in the frame grabber 21 (see Fig. 7a).
  • the video microcontroller 25 is operable to signal the back LED 9 to illuminate when the camera device 7 is in the odd field, and an image I₂ is captured and stored in the frame grabber 21 (see Fig. 7b).
  • the video microcontroller is operable to signal the first light source and the second light source to illuminate the one or more target objects in a sequential manner.
  • the video microcontroller and the camera device are linked to enable the camera device to capture the images of the one or more target objects sequentially, while illuminated by the first and second light sources in sequence.
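In the described apparatus this sequencing is done in hardware by the sync separator 23 and video microcontroller 25. Purely as an illustration, the same even/odd-field alternation can be expressed as host-side pseudocode, with the `camera` and LED driver objects as hypothetical stand-ins:

```python
def capture_pair(camera, front_led, back_led):
    """One (I1, I2) pair, one image per interlaced video field.

    `camera`, `front_led` and `back_led` are hypothetical driver objects;
    the patent performs this sequencing in hardware via the sync
    separator 23 and video microcontroller 25, not in host code.
    """
    camera.wait_for_field("even")   # even field -> front LED S1 lit
    front_led.on()
    back_led.off()
    img1 = camera.grab_field()      # image I1 into the frame grabber

    camera.wait_for_field("odd")    # odd field -> back LED S2 lit
    front_led.off()
    back_led.on()
    img2 = camera.grab_field()      # image I2 into the frame grabber

    back_led.off()
    return img1, img2
```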
  • the frame grabber 21 then applies the derived distance equation, Eq. (7), to the two images I₁ and I₂ on a pixel-by-pixel basis, and the distance map of the target object 3 can be displayed on a monitor display 31 (see Fig. 7d).
  • the depth of an image, or the distance map, can be displayed using a colour code, with red being the shortest distance and purple being the longest distance.
  • alternatively, the distance map can be displayed in black and white, wherein dark represents the shortest distance and white represents the longest distance.
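Such a colour-coded display can be produced with standard plotting tools. Below is a minimal matplotlib sketch; the reversed 'rainbow' colormap is an assumption chosen so that red maps to the shortest distances and violet to the longest, as described above, while cmap="gray" would give the dark-to-white rendering.

```python
import matplotlib.pyplot as plt

def show_depth(depth_map):
    """Display a distance map with red = nearest, violet = farthest."""
    # matplotlib's 'rainbow' runs violet -> red with increasing value,
    # so the reversed colormap puts red at the shortest distances.
    plt.imshow(depth_map, cmap="rainbow_r")
    plt.colorbar(label="distance to midpoint of the light sources")
    plt.show()
```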
  • Fig. 7a illustrates an image taken with front illumination. This is an image of the face of a statue taken by an IR camera device 7 using only front illumination 5, and stored in the frame grabber 21.
  • Fig. 7b illustrates an image taken with back illumination. This is an image of the face of the statue taken by an IR camera device 7 using only back illumination 9, and stored in the frame grabber 21.
  • Fig. 7c illustrates a photograph of the object. This is a normal photograph of the face of the statue for comparison with the generated depth profile (see Fig. 7d).
  • Fig. 7d illustrates a measured depth profile of the object. This is the result of the frame grabber applying the distance equation, Eq. (7), to the images of Figs. 7a and 7b on a pixel-by-pixel basis. As previously explained, dark represents the shortest distance between the target object 3 and the midpoint between the front 5 and back 9 LED devices, while white depicts longer distances.
  • there exist practical limits on the range of the camera of the present invention.
  • the measurement depends upon the divergence of light. The range limit may be extended by unbalancing the intensities of the two illuminating light sources, thereby avoiding saturation of the CCD camera device 7 when the front LED 5 is too close to the target object 3.
  • the light intensities Int A and Int B are normally more or less the same, but as the distance to the target object 3 becomes excessively short and the front light 5 intensity Int A becomes much larger than Int B, the difference between the light intensities no longer remains within the linear range of the CCD camera device 7.
  • this limit may be extended either by reducing the exposure time of the CCD camera device 7 when capturing the image with the front LED 5, or by reducing the output power of only the front LED 5 by a known factor N while keeping Int B unchanged.
  • An appropriate value for N may be found by monitoring the composite video signal of the CCD camera device 7.
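If the front LED is deliberately attenuated by the known factor N, the captured front image must be scaled back up by the same factor before the ratio is formed, so that the ratio again reflects only the divergence geometry. A minimal sketch of that correction follows; the code is an illustration, not the patent's implementation.

```python
import numpy as np

def compensated_ratio(img1_attenuated, img2, n_factor, eps=1e-6):
    """Divergence ratio when the front LED has been dimmed by a factor N.

    img1_attenuated: image lit by the front LED running at P0 / n_factor
    img2:            image lit by the back LED at its full power P0

    Scaling I1 back up by N undoes the deliberate attenuation, so the
    ratio once again reflects only the 1/x**2 divergence geometry.
    """
    i1 = np.asarray(img1_attenuated, dtype=np.float64)
    i2 = np.asarray(img2, dtype=np.float64)
    return (n_factor * i1) / np.maximum(i2, eps)
```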
  • the distance mapping system is operable to provide the three-dimensional information and may be incorporated, for example, into the automobile industry.
  • the system may be integrated with an automobile distance mapping system or accident prevention system.
  • the distance mapping apparatus of the present invention could be incorporated to quickly provide exact 3D pixel positional information for prototype vehicles.
  • the distance mapping device provides real-time operational advantages: most other methods need time to set up sensors at specified locations even before making a measurement.
  • the distance mapping apparatus supports handheld operation and can be aimed at a target at any angle and location relative to the object. Additional embodiments of the invention may be further incorporated into aspects of the automobile industry.
  • the distance mapping system is linked to an on-board computer system of a vehicle and is operable to provide environmental 3D information to assist the on-board system of the vehicle in accident prevention.
  • the distance mapping system can differentiate the echo returned by trees along the pavement from that of an oncoming moving car, based on the shape of the objects.
  • ordinary radar systems do not function in this manner. For example, when a car equipped with an ordinary radar system negotiates a curve in the road, the radar may mistake trees along the side of the road for an oncoming car, and the automatic braking system would be triggered. In other words, an ordinary radar system functions optimally when the equipped car is travelling along a straight road, but not along a curved road.
  • the distance mapping system could be incorporated into a traffic surveillance system and is operable to obtain three-dimensional information associated with automobiles, such three-dimensional information providing a basis for establishing make and model information for automobiles.
  • the present invention may be used to assist in determining the make and model of a vehicle by calculating the distance map of only one profile.
  • the detailed information of the one profile of the vehicle could be extrapolated to recreate a 3D representation of the vehicle, or it could be used to compare with stored library information of 3D representations of vehicles for greater accuracy and identification.
  • a distance mapping system is provided as previously described, wherein the distance mapping system is operable to provide an individual with a visual impairment distance information about one or more target objects in their physical environment.
  • environmental three-dimensional information may be provided so as to assist an individual who is visually impaired. Due to the ability to freely position the camera device 7, the distance mapping system could be readily incorporated into an assistive cane or into the outer apparel of a visually impaired individual. The distance mapping system could then provide signals regarding the calculated environmental information to the individual based upon predefined criteria such as the size and shape of an object. Ordinary echo-based warning systems are not capable of discerning whether an object is a man, a tree, or a building.
  • the distance mapping system could be readily incorporated into a humanoid robotic system to provide omnidirectional eye vision, allowing it to more quickly identify its surroundings and avoid obstacles.
  • the system of the present invention may be integrated with any type of robot to provide distance mapping information to the robot in relation to one or more target objects.
  • the distance mapping system is operable to provide environmental 3D information for a 3D virtual studio. Due to the ability to freely position the camera device 7, a 3D virtual studio could be readily set up wherein the live scenery is inserted either in the foreground or the background of a computer-generated graphic, and could in fact be positioned anywhere within the frame, as long as the computer-generated graphic itself has the distance information in each pixel.
  • the 3D virtual studio could function in real time and could greatly assist television broadcasts.
  • the system of the present invention may be integrated with a television studio system to provide three-dimensional information that enables editing of one or more target objects in a three-dimensional studio environment.
  • the distance mapping system is incorporated into the cosmetic industry to quickly provide 3D imaging of a patient without having to manufacture a moulding. More specifically, this 3D imaging could be used to assist a plastic surgeon, and subsequently the patient, in determining how certain features may appear after a procedure. In addition, the 3D imaging information could be used by an orthodontist who makes teeth mouldings; the provided information could greatly reduce the need for an uncomfortable moulding process. The current distance mapping system would allow a 3D image to be made without any contact with the patient and in a non-invasive manner.
  • the distance mapping system may be readily incorporated into a security system, and more specifically linked to a fingerprint capture system, wherein the distance mapping is accomplished in a touchless, non-contact manner that provides a 3D map of a fingerprint without having to ink the individual's fingers or have them touch a panel for scanning of the palm.
  • the distance mapping system may be readily incorporated into surveillance systems to provide profile information on an individual. If a front profile of an individual has been captured, the distance mapping system could be used to generate a side profile of the individual. Additionally, if the side profile of an individual has been captured, the front profile could be extrapolated based upon the 3D distance mapping information.
  • the system may be integrated with a biometric authentication system to enable bio-authentication of individuals based on touch-less capture of bio-authentication information such as fingerprints.
  • a distance mapping system wherein the distance mapping apparatus may substitute or replace the illuminating light sources with sound transducers, to achieve a sonar distance mapping camera for underwater objects such as a submarine or a school of fish.
  • a distance mapping system wherein, for the purposes of real-time 3D information gathering, the sources are of the same type (i.e. all acoustic sources or all light sources).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to a method and system for detecting and mapping three-dimensional information pertaining to one or more target objects. More particularly, the invention consists of: selecting one or more target objects; illuminating the target object(s) using a first light source and capturing an image of the target object(s); subsequently illuminating the same target object(s) using a second light source; capturing an image of the target object(s); and finally calculating the distance between the midpoint of the two light sources and the target object(s), based on the decay of light intensity with distance, by analyzing the ratio of the image intensities on a pixel-by-pixel basis.
PCT/CA2008/000502 2007-03-23 2008-03-03 Divergence ratio distance mapping camera WO2008116288A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/532,644 US8982191B2 (en) 2007-03-23 2008-03-03 Divergence ratio distance mapping camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/690,503 US20080231835A1 (en) 2007-03-23 2007-03-23 Divergence ratio distance mapping camera
US11/690,503 2007-03-23

Publications (1)

Publication Number Publication Date
WO2008116288A1 (fr)

Family

ID=39774342

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2008/000502 WO2008116288A1 (fr) Divergence ratio distance mapping camera

Country Status (2)

Country Link
US (2) US20080231835A1 (fr)
WO (1) WO2008116288A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9389431B2 (en) 2011-11-04 2016-07-12 Massachusetts Eye & Ear Infirmary Contextual image stabilization

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8189003B2 (en) * 2007-05-08 2012-05-29 Dreamworks Animation Llc System and method for rendering computer graphics utilizing a shadow illuminator
US8218135B2 (en) * 2007-08-01 2012-07-10 Ford Global Technologies, Llc System and method for stereo photography
WO2009073950A1 (fr) * 2007-12-13 2009-06-18 Keigo Izuka Camera system and method for fusing images to create an omni-focused image
KR101483462B1 (ko) * 2008-08-27 2015-01-16 Samsung Electronics Co., Ltd. Apparatus and method for acquiring depth image
US8982182B2 (en) * 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US20120098971A1 (en) * 2010-10-22 2012-04-26 Flir Systems, Inc. Infrared binocular system with dual diopter adjustment
US10363102B2 (en) 2011-12-30 2019-07-30 Mako Surgical Corp. Integrated surgery method
US9383753B1 (en) 2012-09-26 2016-07-05 Google Inc. Wide-view LIDAR with areas of special attention
US9192445B2 (en) 2012-12-13 2015-11-24 Mako Surgical Corp. Registration and navigation using a three-dimensional tracking sensor
KR102048361B1 (ko) * 2013-02-28 2019-11-25 LG Electronics Inc. Distance detecting device and image processing apparatus including the same
US9134114B2 (en) * 2013-03-11 2015-09-15 Texas Instruments Incorporated Time of flight sensor binning
US9644973B2 (en) * 2013-03-28 2017-05-09 Google Inc. Indoor location signalling via light fittings
CN104755874B (zh) * 2013-04-01 2018-08-28 Panasonic Intellectual Property Management Co., Ltd. Motion sensor device having a plurality of light sources
CN105593786B (zh) * 2013-11-07 2019-08-30 Intel Corporation Object position determination
US10921877B2 (en) * 2014-10-20 2021-02-16 Microsoft Technology Licensing, Llc Silhouette-based limb finder determination
WO2016189495A1 (fr) 2015-05-27 2016-12-01 Van Dyke, Marc Alerting predicted accidents between driverless cars
US10031522B2 (en) 2015-05-27 2018-07-24 Dov Moran Alerting predicted accidents between driverless cars
US10706572B2 (en) 2015-08-26 2020-07-07 Olympus Corporation System and method for depth estimation using multiple illumination sources
US10489925B2 (en) * 2017-08-13 2019-11-26 Shenzhen GOODIX Technology Co., Ltd. 3D sensing technology based on multiple structured illumination
DE102019219585A1 (de) * 2019-12-13 2021-06-17 Robert Bosch Gmbh Device and method for light-assisted distance determination, control unit and working device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58155377A (ja) * 1982-03-10 1983-09-16 Akai Electric Co Ltd Position detecting device
US4843565A (en) * 1987-07-30 1989-06-27 American Electronics, Inc. Range determination method and apparatus
US6897946B2 (en) * 1998-05-25 2005-05-24 Matsushita Electric Industrial Co., Ltd. Ranger finder device and camera

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4425501A (en) * 1981-03-30 1984-01-10 Honeywell Inc. Light aperture for a lenslet-photodetector array
US4720723A (en) * 1983-06-24 1988-01-19 Canon Kabushiki Kaisha Distance measuring device
JPS6060511A (ja) * 1983-09-14 1985-04-08 Asahi Optical Co Ltd Distance measuring device
JPH0616147B2 (ja) * 1986-03-26 1994-03-02 Chinon Corp. Camera
US5003166A (en) * 1989-11-07 1991-03-26 Massachusetts Institute Of Technology Multidimensional range mapping with pattern projection and cross correlation
US5706417A (en) * 1992-05-27 1998-01-06 Massachusetts Institute Of Technology Layered representation for image coding
US6822563B2 (en) * 1997-09-22 2004-11-23 Donnelly Corporation Vehicle imaging system with accessory control
US7359782B2 (en) * 1994-05-23 2008-04-15 Automotive Technologies International, Inc. Vehicular impact reactive system and method
JPH11508359A (ja) * 1995-06-22 1999-07-21 3DV Systems Ltd. Improved optical ranging camera
US6215898B1 (en) * 1997-04-15 2001-04-10 Interval Research Corporation Data processing system and method
US6022124A (en) * 1997-08-19 2000-02-08 Ppt Vision, Inc. Machine-vision ring-reflector illumination system and method
US6266053B1 (en) * 1998-04-03 2001-07-24 Synapix, Inc. Time inheritance scene graph for representation of media content
JP3840341B2 (ja) * 1998-10-15 2006-11-01 Hamamatsu Photonics K.K. Three-dimensional information detection method and apparatus
US7068825B2 (en) * 1999-03-08 2006-06-27 Orametrix, Inc. Scanning system and calibration method for capturing precise three-dimensional information of objects
US6549203B2 (en) * 1999-03-12 2003-04-15 Terminal Reality, Inc. Lighting and shadowing methods and arrangements for use in computer graphic simulations
US6362822B1 (en) * 1999-03-12 2002-03-26 Terminal Reality, Inc. Lighting and shadowing methods and arrangements for use in computer graphic simulations
US7224384B1 (en) * 1999-09-08 2007-05-29 3Dv Systems Ltd. 3D imaging system
US6700669B1 (en) * 2000-01-28 2004-03-02 Zheng J. Geng Method and system for three-dimensional imaging using light pattern having multiple sub-patterns
US7027642B2 (en) * 2000-04-28 2006-04-11 Orametrix, Inc. Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects
JP2001318303A (ja) * 2000-05-08 2001-11-16 Olympus Optical Co Ltd Camera range-finding device
JP4040825B2 (ja) * 2000-06-12 2008-01-30 Fujifilm Corp. Image capturing apparatus and distance measuring method
US6618123B2 (en) * 2000-10-20 2003-09-09 Matsushita Electric Industrial Co., Ltd. Range-finder, three-dimensional measuring method and light source apparatus
JP2005507489A (ja) * 2001-02-23 2005-03-17 Genicon Sciences Corp. Method for providing extended dynamic range in analyte assays
US7271839B2 (en) * 2001-03-15 2007-09-18 Lg Electronics Inc. Display device of focal angle and focal distance in iris recognition system
US20030046177A1 (en) * 2001-09-05 2003-03-06 Graham Winchester 3-dimensional imaging service
US20030202120A1 (en) * 2002-04-05 2003-10-30 Mack Newton Eliot Virtual lighting system
US7221437B1 (en) * 2002-08-20 2007-05-22 Schaefer Philip R Method and apparatus for measuring distances using light
US7123351B1 (en) * 2002-08-20 2006-10-17 Schaefer Philip R Method and apparatus for measuring distances using light
US7301472B2 (en) * 2002-09-03 2007-11-27 Halliburton Energy Services, Inc. Big bore transceiver
US7167173B2 (en) * 2003-09-17 2007-01-23 International Business Machines Corporation Method and structure for image-based object editing
DE602004030375D1 (de) * 2003-12-24 2011-01-13 Redflex Traffic Systems Pty Ltd System and method for determining vehicle speed
US20050267657A1 (en) * 2004-05-04 2005-12-01 Devdhar Prashant P Method for vehicle classification
KR101183000B1 (ko) * 2004-07-30 2012-09-18 Extreme Reality Ltd. System and method for 3D space-dimension based on image processing
US7389041B2 (en) * 2005-02-01 2008-06-17 Eastman Kodak Company Determining scene distance in digital camera images

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58155377A (ja) * 1982-03-10 1983-09-16 Akai Electric Co Ltd Position detecting device
US4843565A (en) * 1987-07-30 1989-06-27 American Electronics, Inc. Range determination method and apparatus
US6897946B2 (en) * 1998-05-25 2005-05-24 Matsushita Electric Industrial Co., Ltd. Ranger finder device and camera

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9389431B2 (en) 2011-11-04 2016-07-12 Massachusetts Eye & Ear Infirmary Contextual image stabilization
US10571715B2 (en) 2011-11-04 2020-02-25 Massachusetts Eye And Ear Infirmary Adaptive visual assistive device

Also Published As

Publication number Publication date
US8982191B2 (en) 2015-03-17
US20080231835A1 (en) 2008-09-25
US20100110165A1 (en) 2010-05-06
US20150009300A9 (en) 2015-01-08

Similar Documents

Publication Publication Date Title
US8982191B2 (en) Divergence ratio distance mapping camera
CN111448591B (zh) System and method for locating a vehicle under poor lighting conditions
JP4899424B2 (ja) Object detection device
JP4612635B2 (ja) Moving object detection using computer vision adaptable to low-illumination depth
CN107273846B (zh) Method and device for determining human body shape parameters
US7321386B2 (en) Robust stereo-driven video-based surveillance
JP5748920B2 (ja) Method for displaying vehicle surroundings
Macknojia et al. Calibration of a network of kinect sensors for robotic inspection over a large workspace
CN108234984A (zh) Binocular depth camera system and depth image generation method
US11073379B2 (en) 3-D environment sensing by means of projector and camera modules
US20160184998A1 (en) Robot identification system
KR20140137577A (ko) Apparatus for providing vehicle surroundings information and method thereof
JP2008092459A (ja) Surroundings monitoring device
KR20060021922A (ko) Obstacle detection technique and apparatus using two cameras
EP2476999B1 (fr) Displacement measuring method, displacement measuring device, and program for displacement measurement
Lion et al. Smart speed bump detection and estimation with kinect
CN111086451B (zh) Head-up display system, display method and automobile
KR20120002723A (ko) Method and apparatus for recognizing a person using 3D image information
JPH1144533A (ja) Preceding vehicle detection device
JP2006215743A (ja) Image processing apparatus and image processing method
JPH11211738A (ja) Speed measuring method for moving objects and speed measuring device using the method
CN108171754A (zh) Robot navigation device and method based on binocular vision
KR102546045B1 (ko) Human body monitoring device using LiDAR
JP5785515B2 (ja) Pedestrian detection device and method, and vehicle collision determination device
JPH0273471A (ja) Three-dimensional shape estimation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08733606

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 12532644

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08733606

Country of ref document: EP

Kind code of ref document: A1