WO2013091016A1 - Structured light system for robust geometry acquisition - Google Patents

Structured light system for robust geometry acquisition

Info

Publication number
WO2013091016A1
WO2013091016A1 (PCT/AU2012/001587)
Authority
WO
WIPO (PCT)
Prior art keywords
light sources
light
reference point
image sensor
pattern
Prior art date
Application number
PCT/AU2012/001587
Other languages
English (en)
Inventor
David John Battle
David John Maunder
Donald James Bone
Original Assignee
Canon Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Kabushiki Kaisha filed Critical Canon Kabushiki Kaisha
Publication of WO2013091016A1

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37131Moire pattern, diffraction grating, fringe

Definitions

  • the present invention relates generally to the photographic acquisition of detailed geometric information regarding a scene and, in particular, to the use of modulated light sources to make this information robust and independent of imaging system calibrations.
  • Applications of the invention include metrology, robot part picking, reverse engineering and geometry-based post processing of digital images.
  • Digital images represent projections of the three-dimensional world into two dimensions from the particular viewpoint of a camera.
  • It is often advantageous for a human, computer or mobile robot to possess additional information regarding a captured scene, such as the relative distances, or depths, of objects contained within the scene.
  • the ability to record object distances in an image allows a photographer, for example, to selectively blur features in the background so as to enhance the salience of foreground objects.
  • Computer security systems employing image analysis algorithms are greatly assisted in segmenting objects of interest where appropriate geometric information is available. Accurate knowledge of scene geometry is also important in the case of mobile robots, which may be required to negotiate and handle complex objects in the real world.
  • Known approaches to acquiring depth information include time-of-flight (TOF), depth-from-defocus (DFD) and triangulation methods. In TOF methods, the propagation of light through the scene is timed directly.
  • a system is desired to be capable of utilising multiple sources of illumination without being cumbersome or expensive. Such a system would be expected to scale well with the number of sources, implying that the sources themselves should be simple and inexpensive with minimal communication and control requirements.
  • Such an improved depth ranging system would also be independent of the kind of optical distortion that currently needs to be calibrated out of depth calculations, implying that the use of optical elements should be minimised and that depth calculations should not rely on implicit correspondences between pixel positions and scene coordinates.
  • an improved depth ranging system should be capable of acquiring information from multiple sources simultaneously, and efficiently fusing the information into a coherent geometric description of the scene. Such a system would then constitute a geometry acquisition system, rather than simply a depth mapping system.
  • Disclosed is a method of determining coordinates of a reference point on an object in three-dimensional space in a scene captured by an image sensor. The object is irradiated by light sources, each of which is modulated at a different spatio-temporal frequency.
  • the method generates a composite intensity (light) signal (which has multiple components each providing a phase angle - called herein a composite phase signal) on the object by a predetermined geometric arrangement of the light sources, and captures the composite phase signal at the reference point using the image sensor.
  • a processing arrangement determines, from the captured composite phase signal, a set of measured positioning parameters independent of a position of the image sensor. The measured positioning parameters from the light sources are used for determining the coordinates of the reference point.
  • one of the at least two coordinates is a depth coordinate.
  • each of the plurality of light sources is characterised by at least one known positioning parameter with respect to a reference line through said reference point.
  • the difference in spatio-temporal frequency of the at least one light source results from at least one of:
  • each light source comprises multiple intersecting patterns to create a two-dimensional signal.
  • the patterns are orthogonal.
  • each said light source may comprise a rotating pattern surrounding the light source.
  • the composite phase signal forms a wavefront that is radial to the corresponding light source.
  • the patterns are sinusoidal.
  • the measured positioning parameters comprise an angular displacement from the light source.
  • the object is in a three-dimensional space and the method determines the three-dimensional coordinates of the reference point in the three- dimensional space.
  • the positioning parameters are measured with respect to a reference line through each of the plurality of spatio-temporally modulated light sources, thereby being independent of a position of the image sensor.
  • a robotic system comprising:
  • a robotic manipulator arranged for operation in association with an object in three- dimensional space
  • an image sensor arranged for imaging a scene formed at least by the object
  • a plurality of spatio-temporally modulated light sources configured to
  • a computing device connected to the robotic manipulator, the image sensor and each of the light sources and configured to:
  • a set of measured positioning parameters used in the determination of at least two coordinates of the reference point, from the plurality of light sources, wherein the set of measured positioning parameters is determined independent of a position of the image sensor;
  • the image sensor is mounted upon the robotic manipulator.
  • the image sensor may be located at the reference point, where desirably the image sensor can be a photodiode.
  • Fig. 1 illustrates the intersections of iso-phase planes generated by a pair of cylindrical spatio-temporally modulated light sources
  • Fig. 2 is a plan view of a pair of spatio-temporally modulated sources in relation to two objects in the scene and two possible camera viewpoints;
  • Figs. 3A and 3B illustrate a typical intensity signal projected by a spatio-temporally modulated light source comprising two superimposed sinusoidal carriers with distinct periods, and the appearance of the signal in the Fourier domain;
  • Figs. 4A and 4B illustrate a skew sinusoidal pattern possessing modulation in both horizontal (X) and vertical (Y) orientations along with mappings of such a pattern into cylindrical and spherical geometries in one implementation according to the present disclosure
  • Fig. 5 is a visualisation of an iso-phase surface of a spherically mapped skew sinusoidal mask and how light rays emanating from the centre of the sphere are mapped to spatial coordinates with varying azimuth and elevation angles;
  • FIGs. 6A and 6B illustrate the construction of a pattern possessing skew-orthogonal sinusoidal components with two distinct periods along with mappings of such a pattern into cylindrical and spherical geometries in an implementation according to the present disclosure
  • Fig. 7 illustrates the deployment of two spatio-temporally modulated light sources, each projecting multiple skewed sinusoidal patterns into a 3-D scene according to a preferred implementation
  • Fig. 8 illustrates the typical convergence of the reconstruction algorithm when fusing data from multiple spatio-temporal light sources in another implementation
  • Fig. 9 is a schematic block diagram illustrating the sequence of steps in processing captured frames from a video camera into estimates of scene geometry according to a preferred implementation
  • FIG. 10 is a schematic illustration of a typical robotic part picking application involving multiple spatio-temporally modulated light sources
  • Figs. 11A and 11B form a schematic block diagram of a general purpose computer system upon which arrangements described can be practiced, and
  • Fig. 12 illustrates a system with three spatio-temporally modulated light sources oriented so as to allow a determination of the three dimensional location of any point in the scene.
  • FIG. 10 illustrates a robotic system 1000 in which a manipulator 1060 controlled by a computer 1070 is tasked with handling various objects 1030.
  • The system 1000, therefore, needs to know or otherwise estimate or determine precise spatial locations of the objects 1030, particularly in association with the manipulator 1060, whose location in the 3D space will be known.
  • The system 1000 includes a video camera 1050 operating as an image sensor, capturing images of the scene in which the objects 1030 are located at a high frame rate.
  • the image sensor (camera) 1050 may be conveniently mounted to a peripheral limb of the manipulator 1060.
  • The system 1000 also includes multiple light sources 1010, 1020 and 1040 configured at known locations around the periphery of the scene for substantially simultaneous irradiation of the scene. These light sources 1010, 1020 and 1040 illuminate the scene concurrently, but are spatio-temporally modulated on account of radiating intensities that are functions of both position and time. At least two such light sources are required according to the present disclosure.
  • the multiple light sources are preferably modulated at different carrier frequencies, as the resulting diversity of illumination can improve robustness to shadows and occlusions, as discussed above.
  • the difference in frequency between any two light sources may be from any one or combination of:
  • L is generally greater than 1, and equal to 3 in the example of Fig. 10.
  • the scene geometry afforded by the objects 1030 is estimated in the coordinate frame of the sources 1010, 1020 and 1040, which is stationary.
  • Figs. 11A and 11B depict the computer system 1070, which may be implemented using a general-purpose computer, and upon which the various arrangements described can be practiced.
  • The computer system 1070 includes: a computer module 1101; input devices such as a keyboard 1102, a mouse pointer device 1103, a scanner 1126, the camera 1050, and a microphone 1180; and output devices including the light sources 1010, 1020, 1040, a display device 1114 and loudspeakers 1117.
  • An external Modulator-Demodulator (Modem) transceiver device 1116 may be used by the computer module 1101 for communicating to and from a communications network 1120 via a connection 1121.
  • the communications network 1120 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN.
  • The modem 1116 may be a traditional "dial-up" modem.
  • Where the connection 1121 is a high-capacity (e.g., cable) connection, the modem 1116 may be a broadband modem.
  • a wireless modem may also be used for wireless connection to the communications network 1120.
  • The computer module 1101 typically includes at least one processor unit 1105, and a memory unit 1106.
  • the memory unit 1106 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM).
  • The computer module 1101 also includes a number of input/output (I/O) interfaces including: an audio-video interface 1107 that couples to the video display 1114, loudspeakers 1117 and microphone 1180; an I/O interface 1113 that couples to the keyboard 1102, mouse 1103, scanner 1126 and camera 1050; and an interface 1108 for the external modem 1116.
  • The modem 1116 may be incorporated within the computer module 1101, for example within the interface 1108.
  • The computer module 1101 also has a local interface 1111, which permits coupling of the computer system 1070 via a connection 1123 to the manipulator 1060.
  • The I/O interfaces 1108, 1111 and 1113 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated).
  • Storage devices 1109 are provided and typically include a hard disk drive (HDD) 1110. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used.
  • An optical disk drive 1112 is typically provided to act as a non-volatile source of data.
  • Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 1070.
  • The components 1105 to 1113 of the computer module 1101 typically communicate via an interconnected bus 1104.
  • The processor 1105 is coupled to the system bus 1104 using a connection 1118.
  • The memory 1106 and optical disk drive 1112 are coupled to the system bus 1104 by connections 1119. Examples of computers on which the described arrangements can be practiced include general-purpose desktop and workstation computer systems.
  • the methods of coordinate determination may be implemented using the computer system 1070 wherein the processes of Figs. 1 to 10, to be described, may be implemented as one or more software application programs 1133 executable within the computer system 1070.
  • The steps of the methods of depth mapping and coordinate determination are effected by instructions 1131 (see Fig. 11B) in the software 1133 that are carried out within the computer system 1070.
  • The software instructions 1131 may be formed as one or more code modules, each for performing one or more particular tasks.
  • The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the depth and coordinate determination methods, and a second part and the corresponding code modules manage a user interface between the first part and the user.
  • The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 1070 from the computer readable medium, and is then executed by the computer system 1070.
  • The software 1133 is typically stored in the HDD 1110 or the memory 1106, and may also be stored on an optically readable disk storage medium (e.g., CD-ROM) 1125 that is read by the optical disk drive 1112.
  • A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 1070 preferably effects an apparatus for determining 3D coordinates and/or depth.
  • The application programs 1133 may be supplied to the user encoded on one or more CD-ROMs 1125 and read via the corresponding drive 1112, or alternatively may be read by the user from the network 1120. Still further, the software can also be loaded into the computer system 1070 from other computer readable media.
  • Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 1070 for execution and/or processing.
  • Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 1101.
  • Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 1101 include radio or infra-red transmission channels, as well as network connections to another computer or networked device.
  • a user of the computer system 1070 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s).
  • Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilising speech prompts and user voice commands.
  • Fig. 11B is a detailed schematic block diagram of the processor 1105 and a "memory" 1134.
  • The memory 1134 represents a logical aggregation of all the memory modules (including the HDD 1109 and semiconductor memory 1106) that can be accessed by the computer module 1101 in Fig. 11A.
  • A power-on self-test (POST) program 1150 executes when the computer module 1101 is powered up. The POST program 1150 is typically stored in a ROM 1149 of the semiconductor memory 1106 of Fig. 11A. A hardware device such as the ROM 1149 storing software is sometimes referred to as firmware.
  • The POST program 1150 examines hardware within the computer module 1101 to ensure proper functioning, and typically checks the processor 1105, the memory 1134 (1109, 1106), and a basic input-output systems software (BIOS) module 1151, also typically stored in the ROM 1149, for correct operation. Once the POST program 1150 has run successfully, the BIOS 1151 activates the hard disk drive 1110 of Fig. 11A. Activation of the hard disk drive 1110 causes a bootstrap loader program 1152 that is resident on the hard disk drive 1110 to execute via the processor 1105.
  • The operating system 1153 is a system-level application, executable by the processor 1105, to fulfil various high-level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.
  • The operating system 1153 manages the memory 1134 (1109, 1106) to ensure that each process or application running on the computer module 1101 has sufficient memory in which to execute without colliding with memory allocated to another process.
  • The aggregated memory 1134 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 1070 and how such is used.
  • The processor 1105 includes a number of functional modules including a control unit 1139, an arithmetic logic unit (ALU) 1140, and a local or internal memory 1148, sometimes called a cache memory.
  • The cache memory 1148 typically includes a number of storage registers 1144-1146 in a register section.
  • One or more internal busses 1141 functionally interconnect these functional modules.
  • the " processor 105 typically also has one or more interfaces 1142 for communicating with external devices via the system bus 1 104, using a connection 1 1 18.
  • The memory 1134 is coupled to the bus 1104 using a connection 1119.
  • The application program 1133 includes a sequence of instructions 1131 that may include conditional branch and loop instructions.
  • the program 1133 may also include data 1132 which is used in execution of the program 1133.
  • The instructions 1131 and the data 1132 are stored in memory locations 1128, 1129, 1130 and 1135, 1136, 1137, respectively.
  • a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 1130.
  • An instruction may be segmented into a number of parts, each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 1128 and 1129.
  • The processor 1105 is given a set of instructions which are executed therein.
  • The processor 1105 waits for a subsequent input, to which the processor 1105 reacts by executing another set of instructions.
  • Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 1102, 1103, data received from an external source across the network 1120, data retrieved from one of the storage devices 1106, 1109, or data retrieved from a storage medium 1125 inserted into the corresponding reader 1112, all depicted in Fig. 11A.
  • The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 1134.
  • The disclosed depth and coordinate measurement arrangements use input variables 1154, which are stored in the memory 1134 in corresponding memory locations 1155, 1156, 1157.
  • The arrangements produce output variables 1161, which are stored in the memory 1134 in corresponding memory locations 1162, 1163, 1164.
  • Intermediate variables 1158 may be stored in memory locations 1159, 1160, 1166 and 1167.
  • Each fetch, decode, and execute cycle comprises: a fetch operation, which fetches or reads an instruction 1131 from a memory location 1128, 1129 or 1130; a decode operation, in which the control unit 1139 determines which instruction has been fetched; and an execute operation, in which the control unit 1139 and/or the ALU 1140 execute the instruction.
  • a further fetch, decode, and execute cycle for the next instruction may be executed.
  • A store cycle may be performed, by which the control unit 1139 stores or writes a value to a memory location 1132.
  • the methods or parts thereof may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of depth and coordinate mapping.
  • dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
  • Each mask carries an intensity pattern $m_l \propto \sum_{m=1}^{M} \left[1 + \cos(K_{lm}\theta)\right]$, which is a linear sum of sinusoidal terms, each offset (in this instance by a value of one) to maintain overall positivity.
  • the pattern of light intensity radiated through each mask from a line or point source on an axis of the mask is proportional to the mask transmittance.
  • Although the light source described would ordinarily radiate in all directions, with an attendant rapid loss of intensity with range, it is straightforward to constrain the angle of illumination to any desired value by using suitable internal reflectors.
  • the central aspect of importance here is that there are no refracting optics in the light path of the source. Unlike conventional projectors, therefore, which use lenses and are thus limited to finite depths of field, the spatio-temporal light source described radiates an unfocussed, diverging field.
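  • A minimal sketch of such a mask profile is given below, assuming integer circumferential cycle counts so the pattern closes seamlessly around the cylinder; the function name and the particular cycle counts are illustrative assumptions, not values prescribed by the disclosure.

```python
import numpy as np

# Hedged sketch of a mask transmittance of the form described above:
# a sum of sinusoidal terms, each offset by one so the transmittance
# stays non-negative everywhere. Integer cycle counts make the pattern
# continuous around the cylinder's circumference.
def mask_transmittance(theta, cycle_counts):
    """theta: circumferential angles in radians; cycle_counts: integer K_lm values."""
    t = np.zeros_like(theta)
    for k in cycle_counts:
        t += 1.0 + np.cos(k * theta)
    return t / t.max()  # normalise to [0, 1] for a physical transmittance

theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
transmittance = mask_transmittance(theta, cycle_counts=[7, 9])  # two distinct periods
```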
  • The temporal component of modulation for the l-th source is achieved by rotating its cylindrical mask at a velocity of N_l revolutions per second, where the specific value of N_l is characteristic of the l-th source.
  • The far-field intensity of the l-th source then comprises a rotating sum of sinusoidal carriers whose phases can be directly related to the instantaneous mechanical angle θ_l 170 through which the cylinder has rotated with respect to the datum θ_0 180, which is common to all sources.
  • The spatio-temporal sources 110 and 120 encode points in the scene such that their azimuth angles with respect to each source can be readily estimated. This is accomplished by demodulating the pixel time histories across successive frames captured by the camera 1050 and determining the phase(s) of each carrier. De-multiplexing multiple carriers is relatively straightforward on account of different sources using different values of either K_lm or N_l, which enables a form of frequency-division multiplexing, as illustrated in Figs. 3A and 3B.
  • Fig. 2 is a plan view of the pair of spatio-temporal sources 210 and 220, such as those depicted in Fig. 1, in relation to objects forming a scene 230.
  • Fig. 2 shows coordinate axes 290 by which positioning parameters of the source locations 210 and 220 are known or determinable. With knowledge of the source locations 210 and 220, and the angles θ_1 270 and θ_2 240 relative to the angular datum θ_0 280, this arrangement permits direct triangulation of scene points 250 in X and Y, being at least two coordinates in the 3D system, without regard to either the calibration parameters of the camera 295 or its location 260, 265, which is free to vary.
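  • This direct triangulation can be sketched as follows, assuming each source at a known location reports an azimuth angle to the scene point measured from the common angular datum; the names and coordinate conventions are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

# Hedged sketch of direct triangulation in the XY plane: the scene
# point is the intersection of the two rays leaving the known source
# locations at the measured azimuth angles.
def triangulate_xy(s1, s2, theta1, theta2):
    """s1, s2: (x, y) source locations; theta1, theta2: azimuths in radians
    measured from the common angular datum."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])  # unit ray from source 1
    d2 = np.array([np.cos(theta2), np.sin(theta2)])  # unit ray from source 2
    # Solve s1 + t1*d1 = s2 + t2*d2 for t1, t2 (rays must not be parallel).
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(s2, float) - np.asarray(s1, float))
    return np.asarray(s1, float) + t[0] * d1

point_xy = triangulate_xy((0.0, 0.0), (1.0, 0.0), np.radians(60.0), np.radians(120.0))
```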
  • Fig. 3 A illustrates what the temporal history, or time signal 310, of a single camera pixel might resemble when two carriers are present, according to the arrangement depicted in Figs. 1 and 2.
  • Fig. 3B is the temporal spectrum of the time signal in Fig. 3A computed using a fast Fourier transform (FFT).
  • Fig. 3B firstly shows the presence of the two sinusoidal signals 330 and 340 having different frequencies. In view of the spread of peaks shown at DC (0 Hz) and at 21 Hz and 27 Hz, Fig. 3B secondly shows the limitations of FFT techniques in estimating signal parameters from short time records.
  • the FFT approach displays poor resolution of closely spaced frequencies in comparison to algebraic techniques, especially when the spectrum becomes more crowded with carriers.
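  • As an illustration of this limitation, the following sketch simulates a short pixel time record containing the 21 Hz and 27 Hz carriers noted above and examines its FFT spectrum; the frame rate, record length, amplitudes and phases are illustrative assumptions rather than values taken from the disclosure.

```python
import numpy as np

# Hedged sketch of the situation in Figs. 3A and 3B: a short pixel time
# record with two carriers, sampled at an assumed 120 frames per second.
frame_rate = 120.0
t = np.arange(64) / frame_rate                      # 64 frames ~ 0.53 s of data
signal = (1.0
          + 0.5 * np.cos(2 * np.pi * 21.0 * t + 0.3)
          + 0.4 * np.cos(2 * np.pi * 27.0 * t - 1.1))
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(t), d=1.0 / frame_rate)
# The FFT bin spacing is frame_rate / 64 ~ 1.9 Hz, so the 21 Hz and
# 27 Hz peaks sit only ~3 bins apart and leak into one another,
# illustrating the poor resolution of short records noted above.
```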
  • the preferred implementation uses an algebraic (matrix) approach to estimating carrier amplitudes and phases. Demodulating the camera frame data using an algebraic approach gives superior phase estimation performance on account of the frequencies being precisely known beforehand.
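  • A minimal sketch of such an algebraic demodulation is given below, assuming the carrier frequencies are known exactly: the pixel time history is modelled as a DC term plus a cosine/sine pair per carrier, and one linear least-squares solve recovers each carrier's amplitude and phase. Function and variable names are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of algebraic (matrix) demodulation with known carrier
# frequencies. Each carrier contributes a cos/sin column pair to the
# design matrix; the solved coefficients give amplitude and phase.
def demodulate(pixel_history, frame_rate, freqs_hz):
    y = np.asarray(pixel_history, dtype=float)
    t = np.arange(len(y)) / frame_rate
    cols = [np.ones_like(t)]                         # DC column
    for f in freqs_hz:
        cols.append(np.cos(2 * np.pi * f * t))
        cols.append(np.sin(2 * np.pi * f * t))
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    amps, phases = [], []
    for k in range(len(freqs_hz)):
        a, b = coeffs[1 + 2 * k], coeffs[2 + 2 * k]
        amps.append(np.hypot(a, b))                  # carrier amplitude
        phases.append(np.arctan2(-b, a))             # phase of cos(2*pi*f*t + psi)
    return np.array(amps), np.array(phases)

# e.g. amps, phases = demodulate(history, frame_rate=120.0, freqs_hz=[21.0, 27.0])
```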
  • In Fig. 1, iso-phase planes 130 and 140 are shown radiating from the spatio-temporal sources 110 and 120. On such planes, the phases of the sinusoidal carriers are constant.
  • A line of intersection 150 between the iso-phase planes 130 and 140 may be no more valid than the line 160.
  • Camera independence may be achieved by adding additional sources projecting rotating patterns surrounding the light sources in the vertical plane, and hence intersecting the line 150 to provide unambiguous localisation.
  • Figure 12 shows three light sources 1210, 1212, 1214, each rotating about their respective axes.
  • A point 1218 in the scene, on some part of a surface 1240 in the scene, when seen by a camera 1230 will be associated with phases from each of the sources 1210-1214 which, after analysis, will reveal the angles θ_1, θ_2, θ_3 of the planes 1220, 1222, 1224 containing the rays emitted from the three light sources.
  • The coordinates may be defined, for example, by the location of the first light source 1250 (taken to be the origin), a line joining the light sources 1251 (taken as the x-axis), the axis of rotation of the first light source 1252 (taken to be the y-axis), and a line perpendicular to the x and y axes 1253 (taken to be the z-axis).
  • This process is independent of the location of the camera 1230 and reveals the 3D location of points in the scene, rather than merely their depths relative to the camera.
  • Fig. 4A illustrates a sinusoidal pattern 430 similar to that discussed in Fig. 1; however, this pattern is skewed, or tilted, with respect to the X and Y axes.
  • the pattern 430 can be considered to have spatial frequencies in both horizontal and vertical directions, the aim of which is to resolve the ambiguous elevations in the first implementation.
  • Fig. 4B shows a skew-sinusoidal pattern 430 mapped to both cylindrical 410 and spherical 420 geometries, where the condition of integral circumferential cycles is observed in each case. Further discussion will focus on the spherical geometry 420 on account of its advantages in providing a direct mapping between its vertical phase component and the elevation angle φ_s, as well as the sphere having surfaces normal to the light path, and thus being less likely to introduce unwanted refraction into the projected intensities.
  • a single spherical mask 510 is illustrated with respect to the global coordinate frame 550.
  • Whereas the iso-phase surfaces for the first arrangement took the form of vertical planes, the iso-phase surfaces for this spherical skew pattern take the form of helical coils 520. Rays projected from the centre of the sphere 510 through the helix 520 intersect scene points p 560 possessing identical phase.
  • The iso-phase surfaces 520 rotate with the pattern, encoding the angular displacements represented by the azimuth angle θ_s 540 and the elevation angle φ_s 530 into the projected intensities according to $I_{lm} \propto 1 + \cos(K_{lm}\theta_s + V_{lm}\phi_s)$, where V_lm is the vertical equivalent of the horizontal circumferential frequency K_lm.
  • The preferred implementation comprises at least two sinusoidal patterns 610 and 620 skewed in opposite directions, as illustrated in Fig. 6A.
  • These patterns, corresponding to positive and negative values of V_lm, are summed to construct or create a composite pattern 630, being a two-dimensional signal.
  • the pattern 630 is an example of a composite signal, being a composite of the signal patterns 610 and 620 as impinging upon the object. As seen in Fig. 6A the patterns are orientated orthogonal to each other, thus creating a cross-hatched composite pattern. Since each of the patterns 610 and 620 contains phase changes, the pattern 630 is an example of a composite phase signal representing a composite signal of light intensities from the spatio-temporal light sources from which the patterns are generated.
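  • A minimal sketch of this summation is given below; the cycle counts K and V are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

# Hedged sketch of the composite pattern 630: two sinusoidal patterns
# skewed in opposite directions (positive and negative V_lm) are summed
# into a cross-hatched, everywhere non-negative intensity pattern.
def composite_pattern(theta, phi, K=9, V=4):
    """theta, phi: 2-D grids of azimuth and elevation angles in radians."""
    skew_pos = 1.0 + np.cos(K * theta + V * phi)   # pattern skewed one way
    skew_neg = 1.0 + np.cos(K * theta - V * phi)   # pattern skewed the other way
    return skew_pos + skew_neg                     # cross-hatched composite

theta, phi = np.meshgrid(np.linspace(0.0, 2.0 * np.pi, 1024),
                         np.linspace(-0.5, 0.5, 512))
pattern = composite_pattern(theta, phi)
```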
  • Fig. 7 shows an example implementation 700 in a three-dimensional system 790 where respective iso-phase surfaces 740 and 750 wind around a spatio-temporal source 710 of known position in opposite directions.
  • The positioning of the light sources 710 and 720 provides for a predetermined geometric arrangement of the sources relative to the object or reference point 780 within a three-dimensional space.
  • The intersections of these contra-rotating surfaces define a ray 760 which identifies the azimuth angle θ_l and elevation angle φ_l of points, such as a point 780, in the scene with respect to the l-th source 710.
  • Wrapped phase means that a phase angle and the corresponding phase angle rotated by 2π radians are not disambiguated.
  • the point 780 is irradiated with a composite wrapped phase signal produced by the light sources 710, 720, with each light source being characterised by at least one known positioning parameter with respect to a reference line, such as 760, 770 through the reference point 780.
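  • Under the intensity model given earlier, the contra-skewed carriers contribute phases of the form ψ₊ = Kθ + Vφ and ψ₋ = Kθ − Vφ, so their sum and difference isolate azimuth and elevation. The sketch below assumes the 2π wrapping ambiguity has already been resolved (for example via the multiple carrier periods); all names are illustrative.

```python
# Hedged sketch: recover the azimuth and elevation of the ray through a
# scene point from the phases of two contra-skewed carriers, assuming
# unwrapped phases psi_plus = K*theta + V*phi and psi_minus = K*theta - V*phi.
def ray_angles(psi_plus, psi_minus, K, V):
    theta = (psi_plus + psi_minus) / (2.0 * K)   # azimuth of the ray
    phi = (psi_plus - psi_minus) / (2.0 * V)     # elevation of the ray
    return theta, phi
```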
  • The reconstruction algorithm is responsible for forming the overall geometry estimate based on information from L distributed spatio-temporal sources.
  • The angles θ and φ can, in principle, be determined directly for each source. This, however, ignores the influence of noise in the intensity measurements, and also the non-uniformity in data quality to be expected in real-world measurements.
  • For the reasons mentioned above, the remaining component of the present disclosure is a minimisation algorithm (e.g. Newton's algorithm) designed to reconstruct scene geometries such that modelled carrier phases match the measured phases in an overall least squares sense.
  • The forward data model, mapping scene coordinates to estimated phases, takes the form $\hat{\psi}_{lm} = K_{lm}\theta_l + V_{lm}\phi_l$, where θ_l and φ_l are the azimuth and elevation of the scene point with respect to the l-th source.
  • The cost function for the least squares minimisation is calculated over the M sinusoidal carriers of each of the L spatio-temporal sources.
  • The spectral intensities estimated in the demodulation step, exemplified in Fig. 3B by I_1 350 and I_2 360, are used to weight the respective phase estimates such that those associated with stronger signals take precedence over noisier ones. This is the underlying principle whereby geometric diversity in the positioning of multiple sources is able to improve the robustness of geometry estimates.
  • The overall weighted cost function to be minimised is given by $\chi^2(p) = \sum_{l=1}^{L} \sum_{m=1}^{M} w_{lm} \left( \psi_{lm} - \hat{\psi}_{lm}(p) \right)^2$, where the weights w_lm are derived from the corresponding spectral intensities.
  • The cost function derivatives necessary for calculating the coordinate increments at each iteration of the minimisation algorithm include the vectors of first derivatives $\nabla\chi^2$, and the matrices of second derivatives $\nabla^2\chi^2$, for each scene point p.
  • The initial estimate p₀ can be constructed using direct triangulation, as practiced in the prior art, without regard to camera calibration.
  • The better the initial estimate of p, the fewer steps are required to reduce the squared error below the desired tolerance.
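  • A sketch of one damped Newton-style update for a single scene point p is given below, assuming a forward model returning the modelled carrier phases and their Jacobian with respect to p; the weighting follows the spectral-intensity principle above, and all names are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

# Hedged sketch of one damped (Gauss-)Newton update for a scene point.
# `model(p)` is assumed to return the vector of modelled carrier phases
# psi_hat and the Jacobian J = d(psi_hat)/dp; w holds the spectral
# intensity weights, so stronger carriers dominate the update.
def newton_step(p, psi_measured, model, w, damping=0.5):
    psi_hat, J = model(p)
    r = psi_measured - psi_hat                 # phase residuals
    grad = -2.0 * J.T @ (w * r)                # gradient of chi^2 = sum w * r^2
    hess = 2.0 * J.T @ (w[:, None] * J)        # Gauss-Newton approximate Hessian
    dp = np.linalg.solve(hess, -grad)          # Newton increment
    return p + damping * dp                    # scaled update; iterate to convergence
```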
  • Fig. 8 is a typical plot of the convergence of the above reconstruction algorithm in which it can be observed that the squared error 820 is monotonically reduced on each iteration 810.
  • the minimum attainable residual error 830 is a function of the signal to noise ratio of the input image frames, as well as the accuracy of the source locations etc.
  • Fig. 9 summarises a data processing architecture 900 representative of a process of the preferred implementation of the geometry acquisition system described herein.
  • A certain minimum number of frames is acquired at step 910 from the camera system of Fig. 10, for example, to permit the phases of the sinusoidal illumination components to be estimated.
  • the scene geometry is then initialised, as indicated by the dashed arrow connection 912, to a starting estimate at step 920.
  • This starting estimate can take the form of a regular Cartesian grid of pixels having some uniform (or user specified) default depth.
  • the processing system can use the result of the previous calculation to initialise the current estimate.
  • the carrier phase outputs 935 arising from the sinusoidal fit 930 can be used in conjunction with the camera data to triangulate the approximate coordinates of the camera pixels as an initial geometry estimate.
  • The measured carrier phases 935 are compared in step 990 with the phases modelled from the current geometry estimate; that is, step 990 calculates errors of the carrier phases with respect to the current geometry estimate. If the sum of squared errors is less than a prescribed threshold, being a convergence test performed at step 995, the reconstruction of the scene geometry halts, and the process 900 proceeds to acquire the next frame at step 999 for processing in the next cycle, as indicated by the dashed line 998.
  • In step 960, the derivatives are weighted according to the carrier amplitudes 940 found during the sinusoidal fitting step 930, and are used to construct the Newton increment in step 970.
  • the Newton increment is then scaled and added to the current geometry estimate in step 980 to provide the updated geometry estimate to step 990.
  • The iterative process 900 then continues, with each subsequently calculated error being less than that of the preceding iteration.
  • the variance of the geometry estimates is greatly improved over straightforward triangulation, on account of the data being fused from multiple geometrically diverse sources of illumination, independently of any camera or projector calibration.
  • The image sensor may be implemented as a simple light detector, such as a photodiode, positioned at the reference point in the 3D scene. In such an implementation, the sensor does not detect light from the sources via reflection from the object, but rather receives the composite phase signal directly at the reference point.
  • This arrangement can be useful for detecting motion in the scene, being a situation where the depth may be undergoing variation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention describes a method of determining the coordinates (790) of a reference point (780) on an object in three-dimensional space in a scene captured by an image sensor. The object is irradiated by light sources (710, 720) which are modulated at different spatio-temporal frequencies. The method generates a composite phase signal (630, 640) on the object according to a predetermined geometric arrangement of the light sources, and captures (910) the composite phase signal at the reference point using the image sensor. A processing arrangement determines, from the captured composite phase signal, a set of measured positioning parameters (measured carrier phases ψlm) independent of the position of the image sensor. The measured positioning parameters from the light sources are used to determine the coordinates of the reference point.
PCT/AU2012/001587 2011-12-23 2012-12-21 Structured light system for robust geometry acquisition WO2013091016A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2011265572 2011-12-23
AU2011265572A AU2011265572A1 (en) 2011-12-23 2011-12-23 Structured light system for robust geometry acquisition

Publications (1)

Publication Number Publication Date
WO2013091016A1 (fr) 2013-06-27

Family

ID=48667520

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2012/001587 WO2013091016A1 (fr) Structured light system for robust geometry acquisition

Country Status (2)

Country Link
AU (1) AU2011265572A1 (fr)
WO (1) WO2013091016A1 (fr)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015024871A1 (fr) * 2013-08-19 2015-02-26 Basf Se Optical detector
JP2016011874A (ja) * 2014-06-27 2016-01-21 キヤノン株式会社 Image processing apparatus and method therefor
WO2016092451A1 (fr) * 2014-12-09 2016-06-16 Basf Se Optical detector
US9665182B2 (en) 2013-08-19 2017-05-30 Basf Se Detector for determining a position of at least one object
US9741954B2 (en) 2013-06-13 2017-08-22 Basf Se Optical detector and method for manufacturing the same
US9829564B2 (en) 2013-06-13 2017-11-28 Basf Se Detector for optically detecting at least one longitudinal coordinate of one object by determining a number of illuminated pixels
CN108205817A (zh) * 2016-12-20 2018-06-26 东莞前沿技术研究院 Method, device and system for acquiring a target curved surface
WO2018158435A1 (fr) * 2017-03-02 2018-09-07 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Deflectometer, reference pattern and method for determining the topography of an object
US10094927B2 (en) 2014-09-29 2018-10-09 Basf Se Detector for optically determining a position of at least one object
US10120078B2 (en) 2012-12-19 2018-11-06 Basf Se Detector having a transversal optical sensor and a longitudinal optical sensor
CN108844459A (zh) * 2018-05-03 2018-11-20 华中科技大学无锡研究院 Calibration method and device for a blade digital template detection system
US10353049B2 (en) 2013-06-13 2019-07-16 Basf Se Detector for optically detecting an orientation of at least one object
US10401496B2 (en) 2015-09-30 2019-09-03 Ams Sensors Singapore Pte. Ltd. Optoelectronic modules operable to collect distance data via time-of-flight and triangulation
US10412283B2 (en) 2015-09-14 2019-09-10 Trinamix Gmbh Dual aperture 3D camera and method using differing aperture areas
US10775505B2 (en) 2015-01-30 2020-09-15 Trinamix Gmbh Detector for an optical detection of at least one object
US10890491B2 (en) 2016-10-25 2021-01-12 Trinamix Gmbh Optical detector for an optical detection
WO2021019929A1 (fr) * 2019-07-26 2021-02-04 ソニーセミコンダクタソリューションズ株式会社 Distance measuring device, distance measuring system, and method for adjusting a distance measuring device
US10948567B2 (en) 2016-11-17 2021-03-16 Trinamix Gmbh Detector for optically detecting at least one object
US10955936B2 (en) 2015-07-17 2021-03-23 Trinamix Gmbh Detector for optically detecting at least one object
US11041718B2 (en) 2014-07-08 2021-06-22 Basf Se Detector for determining a position of at least one object
US11060922B2 (en) 2017-04-20 2021-07-13 Trinamix Gmbh Optical detector
US11067692B2 (en) 2017-06-26 2021-07-20 Trinamix Gmbh Detector for determining a position of at least one object
US11211513B2 (en) 2016-07-29 2021-12-28 Trinamix Gmbh Optical sensor and detector for an optical detection
US11428787B2 (en) 2016-10-25 2022-08-30 Trinamix Gmbh Detector for an optical detection of at least one object
US11860292B2 (en) 2016-11-17 2024-01-02 Trinamix Gmbh Detector and methods for authenticating at least one object

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112815832B (zh) * 2019-11-15 2022-06-07 中国科学院长春光学精密机械与物理研究所 Measurement camera coordinate system calculation method based on a 3D target

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999058930A1 (fr) * 1998-05-14 1999-11-18 Metacreations Corporation Structured-light, triangulation-based three-dimensional digitizer
WO2001051886A1 (fr) * 2000-01-10 2001-07-19 Massachusetts Institute Of Technology Method and device for contour line measurement
US20050174579A1 (en) * 2002-04-24 2005-08-11 Gunther Notni Method and device for determining the spatial co-ordinates of an object
US7440590B1 (en) * 2002-05-21 2008-10-21 University Of Kentucky Research Foundation System and technique for retrieving depth information about a surface by projecting a composite image of modulated light patterns
US20100303341A1 (en) * 2009-06-01 2010-12-02 Haeusler Gerd Method and device for three-dimensional surface detection with a dynamic reference frame
US20110164114A1 (en) * 2010-01-06 2011-07-07 Canon Kabushiki Kaisha Three-dimensional measurement apparatus and control method therefor

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10120078B2 (en) 2012-12-19 2018-11-06 Basf Se Detector having a transversal optical sensor and a longitudinal optical sensor
US9741954B2 (en) 2013-06-13 2017-08-22 Basf Se Optical detector and method for manufacturing the same
US10845459B2 (en) 2013-06-13 2020-11-24 Basf Se Detector for optically detecting at least one object
US10823818B2 (en) 2013-06-13 2020-11-03 Basf Se Detector for optically detecting at least one object
US10353049B2 (en) 2013-06-13 2019-07-16 Basf Se Detector for optically detecting an orientation of at least one object
US9989623B2 (en) 2013-06-13 2018-06-05 Basf Se Detector for determining a longitudinal coordinate of an object via an intensity distribution of illuminated pixels
US9829564B2 (en) 2013-06-13 2017-11-28 Basf Se Detector for optically detecting at least one longitudinal coordinate of one object by determining a number of illuminated pixels
US9958535B2 (en) 2013-08-19 2018-05-01 Basf Se Detector for determining a position of at least one object
JP2016537638A (ja) * 2013-08-19 2016-12-01 ビーエーエスエフ ソシエタス・ヨーロピアBasf Se 光学検出器
US9665182B2 (en) 2013-08-19 2017-05-30 Basf Se Detector for determining a position of at least one object
WO2015024871A1 (fr) * 2013-08-19 2015-02-26 Basf Se Optical detector
US9557856B2 (en) 2013-08-19 2017-01-31 Basf Se Optical detector
US10012532B2 (en) 2013-08-19 2018-07-03 Basf Se Optical detector
CN105637320A (zh) * 2013-08-19 2016-06-01 巴斯夫欧洲公司 光学检测器
AU2014310703B2 (en) * 2013-08-19 2018-09-27 Basf Se Optical detector
JP2016011874A (ja) * 2014-06-27 2016-01-21 キヤノン株式会社 画像処理装置およびその方法
US11041718B2 (en) 2014-07-08 2021-06-22 Basf Se Detector for determining a position of at least one object
US10094927B2 (en) 2014-09-29 2018-10-09 Basf Se Detector for optically determining a position of at least one object
CN107003785B (zh) 2014-12-09 2020-09-22 巴斯夫欧洲公司 Optical detector
US11125880B2 (en) 2014-12-09 2021-09-21 Basf Se Optical detector
WO2016092451A1 (fr) * 2014-12-09 2016-06-16 Basf Se Optical detector
CN107003785A (zh) 2014-12-09 2017-08-01 巴斯夫欧洲公司 Optical detector
US10775505B2 (en) 2015-01-30 2020-09-15 Trinamix Gmbh Detector for an optical detection of at least one object
US10955936B2 (en) 2015-07-17 2021-03-23 Trinamix Gmbh Detector for optically detecting at least one object
US10412283B2 (en) 2015-09-14 2019-09-10 Trinamix Gmbh Dual aperture 3D camera and method using differing aperture areas
US10401496B2 (en) 2015-09-30 2019-09-03 Ams Sensors Singapore Pte. Ltd. Optoelectronic modules operable to collect distance data via time-of-flight and triangulation
US11211513B2 (en) 2016-07-29 2021-12-28 Trinamix Gmbh Optical sensor and detector for an optical detection
US10890491B2 (en) 2016-10-25 2021-01-12 Trinamix Gmbh Optical detector for an optical detection
US11428787B2 (en) 2016-10-25 2022-08-30 Trinamix Gmbh Detector for an optical detection of at least one object
US11860292B2 (en) 2016-11-17 2024-01-02 Trinamix Gmbh Detector and methods for authenticating at least one object
US11698435B2 (en) 2016-11-17 2023-07-11 Trinamix Gmbh Detector for optically detecting at least one object
US10948567B2 (en) 2016-11-17 2021-03-16 Trinamix Gmbh Detector for optically detecting at least one object
US11635486B2 (en) 2016-11-17 2023-04-25 Trinamix Gmbh Detector for optically detecting at least one object
US11415661B2 (en) 2016-11-17 2022-08-16 Trinamix Gmbh Detector for optically detecting at least one object
WO2018113257A1 (fr) * 2016-12-20 2018-06-28 东莞前沿技术研究院 Method, device and system for acquiring a target curved surface
CN108205817A (zh) * 2016-12-20 2018-06-26 东莞前沿技术研究院 Method, device and system for acquiring a target curved surface
WO2018158435A1 (fr) * 2017-03-02 2018-09-07 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Deflectometer, reference pattern and method for determining the topography of an object
US11060922B2 (en) 2017-04-20 2021-07-13 Trinamix Gmbh Optical detector
US11067692B2 (en) 2017-06-26 2021-07-20 Trinamix Gmbh Detector for determining a position of at least one object
CN108844459A (zh) * 2018-05-03 2018-11-20 华中科技大学无锡研究院 Calibration method and device for a blade digital template detection system
WO2021019929A1 (fr) * 2019-07-26 2021-02-04 ソニーセミコンダクタソリューションズ株式会社 Distance measuring device, distance measuring system, and method for adjusting a distance measuring device

Also Published As

Publication number Publication date
AU2011265572A1 (en) 2013-07-11

Similar Documents

Publication Publication Date Title
WO2013091016A1 (fr) Structured light system for robust geometry acquisition
CN110880185B (zh) High-precision dynamic real-time 360-degree omnidirectional point cloud acquisition method based on fringe projection
Pagani et al. Structure from motion using full spherical panoramic cameras
US8213707B2 (en) System and method for 3D measurement and surface reconstruction
US9547802B2 (en) System and method for image composition thereof
US10401716B2 (en) Calibration of projection systems
Bouguet et al. 3D photography using shadows in dual-space geometry
US8217961B2 (en) Method for estimating 3D pose of specular objects
US20210374978A1 (en) Capturing environmental scans using anchor objects for registration
Wong et al. Recovering light directions and camera poses from a single sphere
Shin et al. A multi-camera calibration method using a 3-axis frame and wand
Valgma 3D reconstruction using Kinect v2 camera
Bergamasco et al. Parameter-free lens distortion calibration of central cameras
CN116295113A (zh) Polarization three-dimensional imaging method incorporating fringe projection
Hu et al. A refractive stereo structured-light 3-D measurement system for immersed object
Zhao et al. Three‐dimensional face modeling technology based on 5G virtual reality binocular stereo vision
Mallik et al. A multi-sensor information fusion approach for efficient 3D reconstruction in smart phone
Shim et al. Performance evaluation of time-of-flight and structured light depth sensors in radiometric/geometric variations
Albarelli et al. High-coverage 3D scanning through online structured light calibration
CN113483669B (zh) Multi-sensor pose calibration method and device based on a three-dimensional target
Habbecke et al. Laser brush: a flexible device for 3D reconstruction of indoor scenes
Masuda et al. Simultaneous determination of registration and deformation parameters among 3D range images
Fuersattel et al. Geometric primitive refinement for structured light cameras
CN112750098B (zh) Depth map optimisation method and device, system, electronic device and storage medium
KR20200032664A (ko) Three-dimensional image reconstruction apparatus using square grid projection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12859866

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12859866

Country of ref document: EP

Kind code of ref document: A1