WO2008044943A1 - Improved 3-dimensional scanning apparatus and method - Google Patents

Improved 3-dimensional scanning apparatus and method

Info

Publication number
WO2008044943A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
laser
mesh
chassis
images
Prior art date
Application number
PCT/NZ2006/000259
Other languages
English (en)
Inventor
Mihai Costache
Dale Mooney
Original Assignee
Jewel Soft Holdings Limited
Priority date
Filing date
Publication date
Application filed by Jewel Soft Holdings Limited filed Critical Jewel Soft Holdings Limited
Priority to PCT/NZ2006/000259 priority Critical patent/WO2008044943A1/fr
Publication of WO2008044943A1 publication Critical patent/WO2008044943A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects

Definitions

  • the present invention relates to an improved 3-dimensional scanning apparatus and method.
  • the invention relates to a 3-dimensional scanning apparatus and method that capture a 3-dimensional representation of an object. More particularly, it relates to a scanning apparatus and method that captures a 3-dimensional representation of an object that represents both the geometry and photographic imagery of the object. Still more particularly, the present invention relates to a 3-dimensional scanning apparatus and method that captures a 3-dimensional representation of a hand.
  • Scanning apparatus which capture 3-dimensional geometrical representations of objects are known.
  • these involve the use of lasers to provide beams to moving mirrors which scan them over the surface of an object. Reflections of these laser beams are detected and analysed.
  • analysis involves counting interference fringes to measure the distance from a given mirror to a point on the object where the beam is reflected from the object. If the angle of the mirror and the distance from the mirror to the point on the object are known, the geometrical position of the sample points can be deduced. Given enough sample points, a 3-dimensional geometrical representation of an object can be generated.
  • counting interference fringes provides an accurate means for capturing 3-dimensional representations of objects.
  • the technology involved in moving mirrors and analysing interference fringes is relatively sophisticated and expensive. Nevertheless, these systems have found wide application.
  • the analysis of distances and angles of rapidly moving mirrors is relatively processor intensive. Accordingly, a finite time is required for any given processing.
  • a 3-dimensional scanned representation of an object the size of an average adult human hand (for example) using moving mirrors and interference fringe analysis may be accurate to within microns and may take approximately 2 to 5 minutes to generate.
  • the term 'centre of area' is intended to relate to a point which represents the centre of an area as mathematically integrated. This term is the analogue of the commonly used term 'centre of mass', but relating to areas in place of masses.
  • the term 'laser curtain' refers broadly to a laser output confined to a plane so that it forms a profile line or line of intersect when projected onto an object.
  • 'maximas' used in relation to brightness and pixels is intended to relate to a pixel or group of pixels which are the brightest as measured across a given cross section.
  • a brightness maxima for a profile line may be taken as the brightest pixel occurring on a vertical line of pixels.
  • the term 'photographic' and suchlike is intended to refer broadly to captured images which have photographic properties and is intended to include such images captured on any medium including CCD or other digital media.
  • the invention provides a method of capturing a 3-dimensional representation of an object, the method comprising:
  • the laser operates at an infrared wavelength.
  • the laser operates substantially at 780 nanometres.
  • the laser projection is substantially a line.
  • said line is provided by the projection of a laser curtain.
  • the method includes moving the projection of the laser curtain over the object substantially along a predefined scanning axis.
  • the method includes moving relative to the object the laser, the first camera and the second camera.
  • the method includes moving the laser relative to the object by relative translation parallel to the scanning axis.
  • the method includes measuring displacement of the laser relative to the object.
  • the linear displacement transducer is a draw wire sensor.
  • the method includes creating the geometrical mesh representation of the object from images captured from the first camera, including locating the first camera at a predefined offset angle to the laser projection and using trigonometry to detect a profile defined by the intersect of the object and the laser curtain.
  • the method includes detecting upon an image captured from said at least one first camera a line of brightness maximas.
  • the method includes creating a smoothed line representing said brightness maximas.
  • the method includes sampling points along said smoothed line.
  • the method includes providing a plurality of image capture stations each comprising a laser, a first camera and a second camera, said plurality of image capture stations arranged around the object.
  • said plurality of image capture stations comprises five or more said image capture stations.
  • the sequence of images captured from the first and second cameras comprises at least one image taken from each of the first and second cameras at substantially the same stages of progression of movement of the laser projection over the object.
  • the method includes the step of abridging the geometric mesh representation of the object to provide an abridged geometric mesh having abridged mesh elements.
  • the number of abridged mesh elements is factored according to a given processing power.
  • the number of abridged mesh elements is factored according to the size of the object.
  • the method includes defining for each abridged mesh element a reference point.
  • the reference points comprise a centre of area of the abridged mesh element.
  • the method includes the step of defining for the given mesh element a surface normal.
  • the method includes identifying for a given abridged mesh element an image from the second camera that corresponds to a stage of progression of the movement of the laser projection represented by the reference point of the given abridged mesh element.
  • the method includes identifying for the given abridged mesh element a single image captured from the second camera for which the surface normal of the given mesh element is most closely aligned with an orientation of a given second camera.
  • the method includes assigning portions of images captured by the second camera to the geometric mesh.
  • the method includes providing illumination for the object.
  • the method includes providing at least one optical filter between said light sources and the object.
  • the illumination is provided by a set of light sources arranged around the object.
  • said first camera is adapted with an optical filter adapted to attenuate visible light.
  • said at least one second camera is adapted with an optical filter adapted to attenuate infra-red light.
  • the method includes aligning the plurality of lasers to provide projections lying substantially in a common plane.
  • the invention also provides a set of processor executable instructions stored on a processor readable medium, said instructions adapted to carry out the method as outlined in any one of the paragraphs above.
  • An apparatus for capturing a 3-dimensional representation of an object includes:
  • At least one laser adapted to provide a laser projection onto the object;
  • a mount for the laser, said mount adapted to allow the laser projection on the object to be moved over the object;
  • a first camera adapted to capture an image of the laser projection on the object;
  • a second camera adapted to capture an image substantially excluding the laser projection on the object; and
  • a processor adapted to capture a sequence of images from first and second cameras, said images corresponding to given stages of progression of movement of the laser projection over the object.
  • the mount for the laser is also adapted to mount the first and second cameras.
  • the apparatus includes a translating chassis to which the mount is connected, said translating chassis being adapted to translate said mount for the laser relative to the object.
  • the apparatus includes a plurality of said mounts, each including a laser, a first camera and a second camera, provided on said translating chassis.
  • said plurality includes five or more said mounts.
  • a first camera includes an optical filter adapted to attenuate visible light.
  • a second camera includes an optical filter adapted to attenuate infra-red light.
  • the second camera is located proximate a first camera.
  • a second camera is located along an axis parallel to the predefined scanning axis from the first camera.
  • a first camera is located proximate each laser.
  • At least one camera is located along an axis parallel to the predefined scanning axis from the laser to which it is proximate.
  • At least one second camera is located along an axis parallel to the predefined scanning axis from the laser wavelength optimized camera to which it is proximate.
  • the apparatus includes a base chassis and a translator chassis, wherein the translator chassis is adapted to translate relative to the base chassis substantially along the predefined scanning axis.
  • the apparatus includes a means to measure the displacement of the translator chassis along the predefined scanning axis.
  • said means to measure displacement includes a draw wire sensor.
  • the apparatus includes a plurality of image capture stations each including at least one laser, at least one first camera and at least one second camera disposed around the chassis.
  • Preferably said image capture stations are disposed substantially around the chassis.
  • the apparatus includes 5 or more image capture stations.
  • the apparatus includes one or more visible light sources mounted on the chassis.
  • the visible light sources include fluorescent tubes.
  • said fluorescent tubes are mounted substantially parallel to the central predefined scanning axis.
  • the apparatus includes at least one optical filter arranged between the scanning area and the fluorescent tubes.
  • the optical filter includes an infrared filter.
  • the optical filter includes a heat shield.
  • the optical filter is selected to compensate for deviations of the spectrum of the visible light sources from a visible white light spectrum.
  • the optical filter includes a minus green filter.
  • the apparatus includes at least one substantially annular fluorescent tube arranged transverse to the scanning axis.
  • the apparatus includes an adjustable mount for the laser, said mount adapted to align the output of the laser in at least two degrees of freedom.
  • the laser mount is adapted to allow the output of a laser to be aligned into a given plane.
  • Figure 1 shows a perspective view of a scanning apparatus according to the preferred embodiment of the present invention
  • Figure 2 shows an alternative perspective view of a scanning apparatus according to the same preferred embodiment as figure 1; in this case an outer cover and frame have been removed to show internal aspects of the scanning apparatus;
  • FIG. 3 shows a cutaway side elevation of a scanning apparatus according to the same preferred embodiment as figures 1 and 2;
  • Figure 4 shows a side elevation of a scanning apparatus according to the same preferred embodiment as figures 1 to 3;
  • Figure 5 shows an alternative perspective view of a scanning apparatus according to the same preferred embodiment as figures 1 to 4;
  • Figure 6 shows a further alternative perspective view of a scanning apparatus according to the same preferred embodiment as figures 1 to 5, in this case, an additional cover for an annular fluorescent tube has been removed;
  • Figure 7 shows a further alternative perspective view of a scanning apparatus according to the same preferred embodiment as figures 1 to 6;
  • Figure 8 shows a yet further perspective view of a scanning apparatus according to the same preferred embodiment as figures 1 to 7;
  • Figure 9 shows a plan view of a scanning apparatus according to the same preferred embodiment as figures 1 to 8;
  • Figure 10 shows an underside plan view of a scanning apparatus according to the same preferred embodiment as figures 1 to 9;
  • Figure 11 depicts a method carried out according to the preferred embodiment
  • Figure 12 depicts an alternative method carried out according to the preferred embodiment of the present invention.
  • FIG. 1 shows a scanning apparatus (1) according to the preferred embodiment of the present invention.
  • This view of the scanning apparatus shows it as it is used in a retail environment to obtain 3-dimensional scans of parts of a customer's body.
  • the scanning apparatus (1) comprises a box (2) which has an aperture (3) into which a customer inserts their hand (not shown) and places their wrist (not shown) on a rest (4). It is envisioned that some form of rest for the fingers and palm of a hand may be provided in alternative embodiments of the invention.
  • FIG. 2 shows the scanning apparatus (1) with the outer cover of the box (2) and the rest (4) removed to reveal inner aspects of the scanning apparatus (1).
  • the scanning apparatus (1) has a base chassis (5) and a vertical chassis or translating chassis (6).
  • the translating chassis (6) has a front face plate (6a) and a back face plate (6b). These face plates each define a scanning aperture (7a) and (7b) which, in turn, define a scanning area (8) between the two plates.
  • the translating chassis (6) is mounted on horizontal guides (9) and coupled to a threaded member (10). This arrangement allows the translating chassis (6) to translate back and forth with respect to the base chassis (5).
  • Figure 3 shows a cutaway side elevation of the scanning apparatus (1). Attached to the base chassis (5) is a drive motor (11) for the threaded member (10). A corresponding threaded member (12) is located on the translating chassis (6) to cause it to translate when threaded member (10) is turned by the drive motor (11).
  • the base chassis (5) has rollers (13a) to (13d) over which a screen (not shown) attached to the translating chassis (6) can be stretched.
  • Figure 3 also shows a laser (16a) mounted to the rear face plate (6b) via laser mount (18a) and image capture station mount (17a).
  • Laser mount (18a) is adjustable so that the beam of a laser (16a) can be adjusted in multiple degrees of freedom.
  • the laser (16a) includes optical elements which cause the beam to produce a laser curtain in the form of a beam combined in one plane but diverging in an orthogonal plane.
  • the laser mount (18a) allows the laser (16a) to be adjusted into a given plane so that the laser curtain is in a plane perpendicular to the axis "Z" in which the translating chassis (6) translates.
  • the apparatus (1) includes a number of lasers (16) all adjusted so that their beams are confined to the same plane so they form a continuous curtain across the scanning area (8). These additional lasers are best seen in Figures 4 and 5 as (16b) and (16c).
  • the infrared camera is a digital camera which has a wavelength low pass filter to avoid saturation by visible and near infrared light.
  • the infrared camera (19a) is positioned directly back from the laser in the direction 'Z'.
  • the infrared camera (19a) is also angled so that it is directed towards a point where the laser curtain (not shown) intersects the centre of the scanning area (8).
  • the infrared camera (19a) is therefore offset from the laser (16a) by a predetermined angle. It will be understood by those skilled in the art that the point of intersect of the camera (19) view and the laser curtain might be adjusted if an object is to be placed off-centre with respect to the scanning area (8).
  • a second type of camera in the form of a visible wavelength camera (20a).
  • the visible wavelength camera (20a) is a colour camera.
  • This camera (20a) has a wavelength high-pass filter to attenuate light at the wavelength of the laser curtain.
  • the cameras used in the preferred embodiment are CCD cameras with 640 x 480 or 1024 x 768 pixels. They are either black and white, for capturing the laser projection, or colour, for capturing the visible light image. They may have a frame rate of 60 frames per second or less.
  • the visible wavelength camera (20a) is also directly back from the laser in the scanning axis Z and is aligned to be directed at the same point of intersection between the laser curtain (not shown) and the centre of the scanning area (8).
  • the cameras (19) and (20) include filters so that they capture complementary images in terms of the projection of the laser curtains. That is, one captures only infrared and the other captures only visible wavelengths.
  • the lasers are infrared, so any projection of the lasers (16) will be captured by the infrared cameras only.
  • Alternative means for capturing complementary images will be known to those skilled in the art. For example, the two types of images (laser projection, and non laser projection) might be achieved in part with the use of lock-in amplifiers which resolve the images in respect of time rather than wavelength.
  • Figure 3 also shows two concentrically arranged fluorescent tubes (21) and (22) which direct light towards the scanning area (8).
  • a filtering screen (23) which extends between the front face plate (6a) and back face plate (6b) and around the scanning area (8).
  • the filtering screen consists of a number of filters to adjust the light spectrum of any light supplied from the outside of the screen (23) in towards the scanning area (8) so that the spectrum approximately corresponds to a white light spectrum.
  • the filtering screen (23) may also include infrared filters, or heat shields, (not shown) to prevent light sources from interfering with the operation of the laser (16a) and the infrared camera (19).
  • a typical filter to include in the filtering screen (23) is a 'minus green' filter which removes a green component from fluorescent lighting. This green component is typically more apparent when an object is viewed via the visible wavelength camera (20).
  • the filtering screen (23) may be used to adjust not only the spectrum of light directed towards the scanning area but also to adjust the spectrum of a given visible wavelength camera (20).
  • the translator chassis (6) has a convex section (6c) which provides a space for the fingers of the hand (not shown) when the translator chassis is in a position close to the aperture (3).
  • the filtering screen (23) has laser apertures, such as (24a) to (24c) shown in figure 3. These laser apertures allow the output from the laser (16a) to project through the filtering screen (23). Otherwise, the infrared filters included in the filtering screen (23) would attenuate the output from the lasers (16) and prevent laser light from reaching the object.
  • the filtering screen (23) also has camera apertures (such as (25a) to (25e) shown in figure 3). The camera apertures allow the cameras (19) and (20) to view an object within the scanning area (8).
  • Figures 4 to 6 show a side elevation of the scanning apparatus (1).
  • Figures 4 to 6 show a draw wire sensor (24) which is mounted on the base chassis (5).
  • the draw wire (25) is anchored to a reference anchor (26) on the translating chassis (6).
  • the draw wire sensor (24) provides a means to measure the displacement of the translating chassis (6) relative to the base chassis (5).
  • Alternative means will be apparent to those skilled in the art and may include optical or machine vision means, or a variety of other electrical or optical sensors known to those skilled in the art. In particular, any linear displacement transducer known to those skilled in the art would be suitable.
  • a draw wire sensor (24) is used in the preferred embodiment as it provides a simple and economical means for measuring the displacement of the translator chassis (6).
  • the fluorescent tubes (27) are mounted outside the filtering screen (23) and therefore provide filtered light into the scanning area (8).
  • the fluorescent tubes (27) are straight tubes which project between the front face (6a) and back face (6b) of the translator chassis (6) and are mounted parallel to the axis of translation 'Z'. Suitable numbers, positions and specifications of these tubes (27) will be known to those skilled in the art.
  • the preferred embodiment has 10 pairs.
  • FIGs 7 and 8 show a perspective view of the scanning apparatus (1) from the underside with part of the base chassis (5) removed. Visible in these figures is the drive motor (11) for the threaded member (10) which, via the action of the corresponding threaded member (12), translates the translator chassis (6) along the horizontal guides (9a) and (9b).
  • Figures 9 and 10 show top and underside plan views of the scanning apparatus (1).
  • a customer places their hand (not shown) into the aperture (3) in the box (2) of the scanning apparatus (1). They then rest their wrist (not shown) on the rest (4) and hold their hand stationary in the interior of the box where it will be positioned in the middle of the scanning area (8). As discussed above, some form of rest (not shown) may be included in the scanning area for the rest of a person's hand.
  • the translator chassis (6) is initially translated to a position (not shown) which is closest to the aperture (3).
  • the user's hand (not shown) is illuminated by the fluorescent tubes, (21a), (21b) and (27a) to (27e).
  • the outputs from the lasers (16) project in a band around the customer's wrist near their hand (not shown).
  • the drive motor (11) causes the translator chassis (6) to move away from the aperture (3), which draws the projected band formed by the output of the lasers (16a) to (16e), and the focus of the cameras (19) and (20), along and past the user's hand.
  • Figure 11 depicts a process carried out on a computer or processor associated with the scanning apparatus (1). It will be apparent to those skilled in the art which specific forms of processor are suitable for executing the steps of the process described herein and what processing power each given processor provides.
  • Box 101 depicts a step in which the laser (16), infrared camera (19) and visible wavelength camera (20) are directed at, say, the wrist of the customer.
  • the lasers (16) were previously aligned by adjustment of their mounts in two degrees of freedom so that their outputs lie in a common plane and form a curtain across and around the scanning area (8).
  • the laser curtain (not shown) projects a band onto, say, the wrist of the customer. This projected band will be detected by the infrared camera (19) which is offset by a predefined offset angle (not shown) from the laser curtain.
  • the infrared camera (19) includes a filter which attenuates visible light so the infrared camera (19) sees only the band projected onto the customer's wrist by the laser curtain.
  • when viewed from an offset angle, the band will reveal to the infrared camera (19) a profile of part of the customer's wrist, that is, the shape of the wrist as 'sampled' in the plane of the laser curtain (not shown).
  • the laser mounts (18) of the preferred embodiment consist of two brackets with 4 spacers between them. As will be understood by those skilled in the art, spacers of various lengths will allow the laser to be pointed in a suitable direction. Suitable alternative mounting arrangements will be apparent to those skilled in the art.
  • the laser curtain is provided by five lasers (16a) to (16e), each providing a 60 degree diverging fan. These are located at a sufficient radius from the scanning area (8) so that a continuous laser curtain is provided for a scanning area (8) large enough to accommodate a large hand.
  • the lasers of the preferred embodiment operate at an infrared wavelength of 780 nanometres.
  • the translator chassis (6) begins translating away from the wrist towards the fingers (not shown). This may typically be achieved by powering the drive motor (11).
  • a train of clock pulses is started.
  • the clock pulses may be at a frequency of fifteen Hz.
  • an image is captured from each of the five infrared cameras (19a) to (19e) and stored. The image is taken at each clock pulse.
  • an image is captured from the visible wavelength cameras (20a) to (20e) upon the same clock pulse. Therefore, the visible wavelength camera images and the infrared wavelength images are synchronized. This means that pairs of the images of the first and second types of camera will correspond to a given stage of progression of translation of the projected band over the hand. In fact, there are 5 sets of cameras, so there will be 5 pairs or sets of these images. Each set, however, will have been taken from a different orientation and will see different aspects of the hand at the same stage of progression.
  • a reading is taken from the draw wire sensor (24) to be stored with the images captured from the cameras (19) and (20) to represent the position along the axis 'Z' to which each picture corresponds. This process is repeated until typically a sequence of 150 images has been taken by each camera; that is, 750 infrared images and 750 visible wavelength images, there being 5 image capture stations. (A minimal sketch of this capture loop appears at the end of this section.)
  • the drive motor (11) is turned off or reversed to move the translating chassis back to the starting position near the aperture (3).
  • the drive motor (11), threaded member (10) and frequency of the clock pulses (103) are, in the preferred embodiment, arranged so that 150 images are captured by each of the cameras (19a) to (19e) and (20a) to (20e).
  • These will be stored with the measurement taken from the draw wire sensor at each corresponding clock pulse.
  • Also stored with this data is the angle of orientation of each camera around the translating chassis (6). For example, there will be five different angles corresponding to each pair of infrared and visible wavelength cameras.
  • Figure 12 depicts at a high level a process used to generate a 3-dimensional representation of an object captured with the scanning apparatus (1).
  • This process depicted in figure 12 generates a geometrical mesh from images captured by the infrared cameras (19) and assigns portions of images captured by the visible wavelength cameras to elements of the geometrical mesh. It will be apparent to those skilled in the art how each step in the process depicted in figure 12 might be implemented.
  • In step 201, the images captured in the process depicted in figure 11 are provided in a storage medium in a form apparent to those skilled in the art.
  • In step 202, the images from the infrared cameras are analysed to find the pixel in each vertical column of pixels which is the brightest and therefore represents a 'brightness maxima' (a sketch of this profile extraction appears at the end of this section). It will be apparent to those skilled in the art that columns of pixels may be grouped and that columns may be substituted for bands that are off-vertical.
  • In step 203, a smoothing algorithm is applied to the pixels identified as brightness maximas to create a smoothed line of brightness maximas.
  • This line represents the profile in the 'X' and 'Y' coordinate axes sampled at the given 'Z'.
  • 'Z' is provided by the reading from the draw wire sensor taken at the clock pulse 102 corresponding to the infrared image being analysed. It will be apparent to those skilled in the art that an analogue signal from the draw wire sensor may actually be used to generate the clock pulses.
  • In step 205, the smoothed line formed from the brightness maximas is sampled to find 'X' and 'Y' coordinates for sample points. These are stored with the corresponding 'Z' coordinate provided by the draw wire sensor (24). The points are then used as vertices of mesh elements to form a 3-dimensional geometric mesh representing the hand.
  • an abridged set of mesh elements is created.
  • the number of mesh elements in the abridged set of mesh elements is factored to the size of the hand. For example, a small hand may be represented by perhaps 25,000 mesh elements, whereas a large hand may comprise 40,000 mesh elements or triangles.
  • the size of the hand can be calculated by any suitable means apparent to those skilled in the art. A given level of processing power available to the apparatus and method of the preferred embodiment may be used to scale the number of mesh elements, in place of the hand size.
  • a given abridged mesh element is assigned a reference point which is, in the preferred embodiment, the centre of area of the mesh element (see the centre-of-area and surface-normal sketch at the end of this section). Those skilled in the art will be aware of algorithms for finding the centre of area, but these will typically involve integrating the area. Those skilled in the art will also recognise that 'centre of area' is a term analogous to 'centre of mass'.
  • each abridged mesh element will be assigned a surface normal, which those skilled in the art will recognise as simply a unit vector which is orthogonal to the surface of the mesh element.
  • the reference point or centre of mass would be represented by 'X', 'Y' and also 'Z' coordinates.
  • the 'Z' coordinate of the reference point of the given abridged mesh element is used to find the closest 'Z' coordinate assigned to a set of visible wavelength camera images. This image will best correspond to the stage of progression of the laser projection over the object. In the case of these images, the 'Z' coordinate is supplied by the draw wire sensor and clock pulse 102. However, in the case of the abridged mesh element, this 'Z' measurement will have been lost during the abridgment process. Hence, the centre of area reference point is used to supply a 'Z' coordinate.
  • Step 209 identifies five visible wavelength images which have a 'Z' coordinate corresponding to that of the reference point for the given abridged mesh element.
  • a single visible wavelength image is selected as corresponding to the given abridged mesh element by comparing the surface normal assigned to the given abridged mesh element with the angle assigned to the visible wavelength image by the associated camera (20a) to (20e) (a sketch of this selection appears at the end of this section).
  • the given abridged mesh element, which has essentially been captured with the infrared cameras (19a) to (19e) and lasers (16a) to (16e) before the abridgment process, is associated with a visible wavelength image which will have the portion of the hand covered by the abridged mesh element approximately squarely in its centre.
  • In step 211, the vertices of the abridged mesh element and the visible wavelength image are input into an algorithm, known to those skilled in the art, which assigns pixels from the visible wavelength image to the abridged mesh element.
  • Steps 207 to 211 are repeated until pixels from visible wavelength images have been assigned to suitable parts of each of the abridged mesh elements.
  • Step 212 represents the end of the process.
  • a 3-dimensional representation of the hand is provided and this representation includes visible wavelength images or photographic type images of portions of the hand. Therefore, the shape of a hand will be captured in 3 dimensions along with the colour, tone and texture of the hand. For example, the person's skin tone, including colouration and texture created by veins and hairs, will be captured in the 3-dimensional representation of the hand.
  • This 3-dimensional representation will be suitable for use in displaying goods such as rings or gems which interact aesthetically with the dimensions, colour, tone and texture of the hand to allow an informed retail decision on a ring and gem which a given customer finds aesthetically pleasing.
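  • The following is a minimal sketch, in Python, of the capture loop of boxes 102 to 105 of figure 11: at each clock pulse one frame is grabbed from every infrared and visible wavelength camera and a draw wire sensor reading is recorded, until 150 pulses have elapsed. The camera and sensor interfaces shown (grab(), read()) are hypothetical placeholders, not part of the patent; with 5 cameras of each type, 150 pulses yield the 750 infrared and 750 visible wavelength images described above.

      # Minimal sketch of the capture sequence of boxes 102-105 (figure 11).
      # The grab()/read() interfaces are hypothetical placeholders.
      def capture_sequence(ir_cameras, vis_cameras, draw_wire, pulses=150):
          frames = []
          for _ in range(pulses):
              z = draw_wire.read()                       # displacement along the 'Z' axis
              ir = [cam.grab() for cam in ir_cameras]    # laser-profile images (19a)-(19e)
              vis = [cam.grab() for cam in vis_cameras]  # photographic images (20a)-(20e)
              frames.append({"z": z, "ir": ir, "vis": vis})
          return frames
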
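  • The following is a minimal sketch, in Python with numpy, of the profile extraction of steps 202 to 205: find the brightest pixel in each vertical column of an infrared image, smooth the resulting line of brightness maximas, and convert the line to 'X' and 'Y' coordinates by triangulation using the predefined offset angle between the camera and the laser curtain. The function names, the moving-average smoothing and the simplified triangulation geometry (row 0 taken as the zero-displacement reference) are illustrative assumptions, not the patent's implementation.

      import numpy as np

      def extract_profile(ir_image, window=5):
          """Smoothed row index of the brightness maxima in each vertical column."""
          maxima = np.argmax(ir_image, axis=0).astype(float)  # brightest pixel per column
          kernel = np.ones(window) / window                   # moving-average smoothing
          return np.convolve(maxima, kernel, mode="same")

      def triangulate(columns, rows, offset_angle_deg, pixel_pitch):
          """Convert image coordinates of the laser line to 'X'/'Y' sample points.

          With the camera offset from the laser plane by a known angle, the
          apparent displacement of the projected line encodes depth:
          depth = displacement / tan(offset_angle).
          """
          depth = (rows * pixel_pitch) / np.tan(np.radians(offset_angle_deg))
          return np.stack([columns * pixel_pitch, depth], axis=1)

      # Usage on a synthetic 480 x 640 infrared frame:
      frame = np.random.rand(480, 640)
      line = extract_profile(frame)
      points = triangulate(np.arange(640), line, offset_angle_deg=30.0, pixel_pitch=0.1)
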
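  • The following is a minimal sketch, in Python with numpy, of the reference point and surface normal assigned to each abridged mesh element, assuming triangular mesh elements (consistent with the '40,000 mesh elements or triangles' above). For a triangle, the centre of area is simply the vertex centroid, and the surface normal is the normalised cross product of two edges.

      import numpy as np

      def centre_of_area(v0, v1, v2):
          """Centre of area (centroid) of a triangular mesh element."""
          return (np.asarray(v0) + np.asarray(v1) + np.asarray(v2)) / 3.0

      def surface_normal(v0, v1, v2):
          """Unit vector orthogonal to the surface of the mesh element."""
          n = np.cross(np.asarray(v1) - np.asarray(v0),
                       np.asarray(v2) - np.asarray(v0))
          return n / np.linalg.norm(n)
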
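  • The following is a minimal sketch, in Python with numpy, of the image selection of steps 208 to 210: the 'Z' coordinate of the element's reference point picks the matching stage of progression, and among the five visible wavelength images at that stage, the one whose camera orientation is most nearly anti-parallel to the element's surface normal is chosen. The data structures shown are illustrative assumptions.

      import numpy as np

      def select_image(ref_point, normal, stage_z, camera_dirs):
          """Return (stage index, camera index) of the best visible wavelength image.

          stage_z: draw wire sensor 'Z' readings, one per clock pulse.
          camera_dirs: (5, 3) array of unit viewing directions for cameras 20a to 20e.
          """
          stage = int(np.argmin(np.abs(np.asarray(stage_z) - ref_point[2])))
          # A camera facing the element squarely sees the surface normal pointing
          # back at it, so maximise -(normal . viewing direction).
          camera = int(np.argmax(-(np.asarray(camera_dirs) @ np.asarray(normal))))
          return stage, camera
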

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Apparatus for capturing 3-dimensional representations of objects, the apparatus including a laser adapted to provide a laser projection onto an object; a mount for the laser adapted to allow the laser projection on the object to be moved over the object; a first camera adapted to capture an image of the laser projection on the object; a second camera adapted to capture an image substantially excluding the laser projection on the object; and a processor adapted to capture a sequence of images from the first and second cameras. These images correspond to given stages of progression of movement of the laser projection over the object. The invention allows portions of images captured by the second camera to be assigned to a geometrical mesh representation of the object created from images captured from the first camera. A related method is also provided.
PCT/NZ2006/000259 2006-10-11 2006-10-11 Improved 3-dimensional scanning apparatus and method WO2008044943A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/NZ2006/000259 WO2008044943A1 (fr) 2006-10-11 2006-10-11 Improved 3-dimensional scanning apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/NZ2006/000259 WO2008044943A1 (fr) 2006-10-11 2006-10-11 Improved 3-dimensional scanning apparatus and method

Publications (1)

Publication Number Publication Date
WO2008044943A1 true WO2008044943A1 (fr) 2008-04-17

Family

ID=39283077

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NZ2006/000259 WO2008044943A1 (fr) 2006-10-11 2006-10-11 Improved 3-dimensional scanning apparatus and method

Country Status (1)

Country Link
WO (1) WO2008044943A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6330523B1 (en) * 1996-04-24 2001-12-11 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three-dimensional objects
US20020050988A1 (en) * 2000-03-28 2002-05-02 Michael Petrov System and method of three-dimensional image capture and modeling
US20030160970A1 (en) * 2002-01-30 2003-08-28 Anup Basu Method and apparatus for high resolution 3D scanning
US20050257748A1 (en) * 2002-08-02 2005-11-24 Kriesel Marshall S Apparatus and methods for the volumetric and dimensional measurement of livestock
US20060120576A1 (en) * 2004-11-08 2006-06-08 Biomagnetic Imaging Llc 3D Fingerprint and palm print data model and capture devices using multi structured lights and cameras
US20060154198A1 (en) * 2005-01-11 2006-07-13 Duane Durbin 3D dental scanner
US20060221072A1 (en) * 2005-02-11 2006-10-05 Se Shuen Y S 3D imaging system

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190015608A (ko) * Automated 3-D modeling of shoe parts
US11641911B2 (en) 2011-11-18 2023-05-09 Nike, Inc. Automated identification and assembly of shoe parts
US8958901B2 (en) 2011-11-18 2015-02-17 Nike, Inc. Automated manufacturing of shoe parts
US9084451B2 (en) 2011-11-18 2015-07-21 Nike, Inc. Automated identification and assembly of shoe parts
US9451810B2 (en) 2011-11-18 2016-09-27 Nike, Inc. Automated identification of shoe parts
KR20180011856A (ko) Automated 3-D modeling of shoe parts
KR101822604B1 Automated 3-D modeling of shoe parts
US9939803B2 (en) 2011-11-18 2018-04-10 Nike, Inc. Automated manufacturing of shoe parts
KR101868774B1 * Automated 3-D modeling of shoe parts
KR20180067712A (ko) Automated 3-D modeling of shoe parts
KR20180112875A (ko) Automated 3-D modeling of shoe parts
KR101908267B1 Automated 3-D modeling of shoe parts
US11879719B2 (en) 2011-11-18 2024-01-23 Nike, Inc. Automated 3-D modeling of shoe parts
US8849620B2 (en) * 2011-11-18 2014-09-30 Nike, Inc. Automated 3-D modeling of shoe parts
US10671048B2 (en) 2011-11-18 2020-06-02 Nike, Inc. Automated manufacturing of shoe parts
US10393512B2 (en) 2011-11-18 2019-08-27 Nike, Inc. Automated 3-D modeling of shoe parts
KR102062354B1 Automated 3-D modeling of shoe parts
US10552551B2 2011-11-18 2020-02-04 Nike, Inc. Generation of tool paths for shoe assembly
US20130132038A1 (en) * 2011-11-18 2013-05-23 Nike, Inc. Automated 3-D Modeling Of Shoe Parts
US10194716B2 (en) 2011-11-18 2019-02-05 Nike, Inc. Automated identification and assembly of shoe parts
US11266207B2 (en) 2011-11-18 2022-03-08 Nike, Inc. Automated identification and assembly of shoe parts
US11317681B2 (en) 2011-11-18 2022-05-03 Nike, Inc. Automated identification of shoe parts
US11341291B2 (en) 2011-11-18 2022-05-24 Nike, Inc. Generation of tool paths for shoe assembly
US11346654B2 (en) 2011-11-18 2022-05-31 Nike, Inc. Automated 3-D modeling of shoe parts
US11422526B2 (en) 2011-11-18 2022-08-23 Nike, Inc. Automated manufacturing of shoe parts
KR101947336B1 Automated 3-D modeling of shoe parts
US11763045B2 (en) 2011-11-18 2023-09-19 Nike, Inc. Generation of tool paths for shoe assembly
RU2754762C1 * 2020-11-18 2021-09-07 Nordinkraft Company LLC Method for obtaining virtual models of elongated products

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06812834

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 576077

Country of ref document: NZ

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06812834

Country of ref document: EP

Kind code of ref document: A1