EP2195609A2 - Three-dimensional digitization method - Google Patents
Three-dimensional digitization method
- Publication number
- EP2195609A2 (application EP08860393A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- point
- image
- optical axis
- speckle
- reference plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
Definitions
- the invention relates to the non-contact construction of synthetic images of physical three-dimensional objects. This technique is commonly called three-dimensional digitization or, following the English terminology, 3D scanning.
- Such a tool generally comprises one or more scanners, portable or not, which acquire the topography of the object along one or more orientations; this topography is then used to perform a synthetic three-dimensional reconstruction of the object in a CAD environment.
- - scanning, which consists of projecting a linear image onto the object (usually a planar light brush generated by means of a laser source) and sweeping the surface of the object with this line, whose successive distortions allow, step by step, a reconstruction of the entire illuminated face;
- - single capture or "one-shot" acquisition, which consists of projecting onto the object, at a single instant, a structured image containing a predetermined pattern whose overall distortion, with respect to its projection on a plane, is analyzed point by point to allow reconstruction of the illuminated face.
- the use of a laser is delicate, and can even be dangerous if it must be used to scan a human face.
- the known techniques are not without certain disadvantages.
- the dispersion of the projection and acquisition tools does not make it possible to produce scanners compact and lightweight enough to be truly portable, so that even to scan a small object (such as a telephone) it is necessary to deploy all the paraphernalia around it.
- the calculations made from the images resulting from the acquisition are generally complex (see in particular the document US 2006/0017720 cited above), requiring heavy programs implemented on powerful processors.
- the invention aims to overcome the aforementioned drawbacks by proposing a technical solution for performing the three-dimensional scanning of an object in a simple and fast manner.
- the invention proposes a method of constructing a synthetic image of a three-dimensional surface of a physical object, this method comprising the steps of: choosing a surface on the object; placing, facing said surface, a projector provided with a light source, an optical axis, and a target defining a speckle pattern comprising a plurality of points of predetermined light intensities and/or colors; orienting the optical axis of the projector towards the surface to be digitized; projecting, along the optical axis, the speckle onto the surface; acquiring and memorizing a two-dimensional image of the speckle projected on the surface and deformed by it, by means of an optical sensor placed on the optical axis of the projector; comparing, for at least a selection of speckle points, the image of the deformed speckle with an undistorted image of the speckle, as projected on a reference plane, in order to calculate, for each point of the selection, at least the depth coordinate, measured parallel to the optical axis.
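As an illustration only, the comparison step of the claimed method can be sketched in a few lines of NumPy: a synthetic random speckle stands in for the target 12, each row is shifted to mimic deformation by the object, and per-row cross-correlation recovers the shift (a stand-in for the depth coordinate; the real device maps shifts to depth through the geometry of FIG. 4).

```python
import numpy as np

rng = np.random.default_rng(1)

# Undistorted speckle image, as projected on the reference plane.
speckle_ref = rng.random((32, 128))

# Toy "object": each row of the projected speckle is shifted by a
# depth-dependent number of pixels (0 to 4).
depth_true = np.arange(32) % 5
deformed = np.stack([np.roll(row, int(d))
                     for row, d in zip(speckle_ref, depth_true)])

def row_shift(d, r):
    """Shift (in pixels) of row d relative to row r, by cross-correlation."""
    c = np.correlate(d - d.mean(), r - r.mean(), mode="full")
    return int(np.argmax(c)) - (len(r) - 1)

# Comparing the deformed image with the reference recovers the shifts.
depth_est = np.array([row_shift(d, r)
                      for d, r in zip(deformed, speckle_ref)])
```

With the synthetic data above, the recovered per-row shifts match the imposed ones exactly.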
- FIG. 1 is a three-dimensional view of a three-dimensional scanning device applied to the capture of a physical object, in this case an apple
- Figure 2 is an exploded perspective view of the device of Figure 1, showing a portion of its internal components, according to another angle of view
- FIG. 3 is a view from above of the device of FIG. 2
- FIG. 4 is a schematic representation, in Euclidean geometry, of the mathematical model underlying the digitization technique according to the invention
- Figure 5 is a plan view of a luminous speckle, as projected on a reference plane perpendicular to the axis of projection
- FIG. 6 is a schematic perspective view illustrating the projection, onto a front face of an ergonomic pillow-type object, of the speckle pattern whose photographic acquisition makes it possible to produce a topography of the illuminated face;
- Figure 7 is a plan view, along the axis of the capture apparatus, of the speckle projected on one face of the object of Figure 6;
- FIGS. 8, 9 and 10 illustrate the analysis of the speckle distortion along three orthogonal axes of three-dimensional space, respectively in abscissa, in ordinate and in depth;
- FIGS. 11 and 12 are perspective views, respectively from the front and from the rear, of the illuminated front face of the object of FIG. 6, as reconstructed from the analysis illustrated in FIGS. 8 to 10;
- Figure 13 is a plan view of the illuminated front face of the object of Figure 6, as reconstructed;
- FIGS. 14 and 15 are side views, in two different orientations, of the illuminated front face of the object of FIG. 6.
- In Figures 1 to 3 there is schematically shown a device 1 for contactless digitization, for constructing a synthetic image 2 of a three-dimensional surface 3 of a physical object 4.
- here this object 4 is an apple, but it could be any other object having one or more three-dimensional surfaces to be digitized, these surfaces possibly being in relief (that is to say, not flat).
- This device 1 comprises an apparatus 5 with a portable casing 6 provided with a handle 7 enabling it to be grasped and manipulated and, mounted in this casing 6, a projector 8 and a camera 9, either of the video-camera type (that is to say, capable of continuous shooting, for example at the standard rate of 24 frames per second) or of the still-camera type (that is to say, taking one-shot pictures). It is assumed in what follows that the camera 9 is a video camera, which can be used, if necessary, as a still camera.
- the projector 8 comprises a light source 10 and arranged facing this light source 10, a focusing optic 11 and a target 12.
- the light source 10 is preferably a source of white light, for example of the filament type (as in the schematic example shown in Figures 2 and 3) or of the halogen type.
- The focusing optics 11, shown diagrammatically in FIGS. 2 and 3 by a single converging lens, defines a main optical axis 13 passing through the light source 10.
- the target 12 defines a speckle pattern 14 comprising a multitude of points of predetermined light intensities (contrasts) and/or colors.
- the speckle 14 is shown in plan in FIG. 5: it is a speckle-type pattern (for a definition of speckle, see for example Juliette SELB, "Acousto-optical virtual source for the imaging of diffusing media", PhD thesis, Paris XI, 2002).
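A random pattern of this kind is easy to synthesize. The following sketch (an illustration, not the patented target) draws a binary speckle in which each point is bright with a predetermined density:

```python
import numpy as np

def make_speckle(shape=(256, 256), density=0.5, seed=0):
    """Binary speckle: each point is bright (255) with probability `density`.
    Illustrative only; the target 12 could equally encode gray levels or colors."""
    rng = np.random.default_rng(seed)
    return (rng.random(shape) < density).astype(np.uint8) * 255

pattern = make_speckle()
```

The high spatial randomness is what makes local windows of the pattern distinctive enough for point-by-point comparison.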
- the target 12 may be in the form of a translucent or transparent plate (glass or plastic), square or rectangular, of the slide type, on which the speckle 14 is printed by a conventional method (transfer, offset, screen printing, flexography, laser, inkjet, etc.).
- the target 12 is arranged between the light source 10 and the optic 11, on the axis 13 thereof, that is to say perpendicular to the optical axis 13 and so that the axis passes through the center of the target 12 (defined in this case by the crossing of its diagonals).
- the target 12 is placed at a predetermined distance from the optic 11, depending on the focal length thereof (see the example below).
- Two of the adjacent sides of the target 12 respectively define an abscissa axis (x) and an ordinate axis (y), the optical axis defining a depth axis (z).
- the camera 9 comprises an optic 15, shown diagrammatically in FIGS. 2 and 3 by a simple convergent lens, and defining a secondary optical axis 16.
- the camera 9 further comprises a photosensitive sensor 17, for example of the CCD type, in the form of a square or rectangular plate placed, facing the optics 15, on the secondary optical axis 16, that is to say perpendicular to it and so that the axis passes through the center of the sensor 17 (defined in this case by the crossing of its diagonals).
- the sensor 17 is placed at a predetermined distance from the optics 15, depending on the focal length thereof (see the example below).
- the projector 8 and the camera 9 are arranged in such a way that the optical axes 13 and 16 are coplanar and perpendicular.
- the apparatus 5 further comprises a semi-reflecting mirror 18 disposed on the main optical axis 13 at the intersection with the secondary optical axis 16. More specifically, the mirror 18 has two opposite planar main faces, namely a rear face 19, disposed facing the projector 8, and a front face 20, disposed facing the camera 9.
- the semi-reflecting mirror 18 is in the form of a thin plate whose faces 19 and 20 are parallel, but it could be in the form of a prism whose faces would be inclined at 45° with respect to each other.
- This front face 20 is semi-reflective, that is to say that it is arranged to transmit along the main optical axis 13 the incident light from the projector 8, but to reflect along the secondary optical axis 16 the reflected light reaching it along the main optical axis 13 from the illuminated object 4.
- the mirror 18 is arranged in such a way that the main axis 13 and the secondary axis 16 intersect on the semi-reflecting front face 20 (also called the splitter plate), and more precisely at the center thereof (defined in this case by the crossing of its diagonals).
- the incident light emitted by the source 10 first passes through the target 12, is focused by the optic 11, and then passes through the semi-reflecting mirror 18 without reflection.
- This light illuminates the object 4 to be digitized - projecting the target 12 onto it - and the object reflects a part of it which, emitted along the main optical axis 13 in the direction opposite to the incident light, is reflected at a right angle, along the secondary optical axis 16, by the splitter plate 20 towards the camera 9.
- This reflected light is focused by the optics 15 of the camera 9 and finally hits the sensor 17.
- the secondary optical axis 16 is thus virtually merged with the main optical axis 13. In other words, although the camera 9 is not physically arranged on the main optical axis 13, it is virtually on it: everything happens as if the light reflected by the object 4 and hitting the sensor 17 had not undergone any deviation.
- the semi-reflecting mirror 18 makes it possible, in practice, to avoid the occultation of the projector 8 that would be caused by physically mounting the camera 9 on the main optical axis 13 in front of the projector 8 or, conversely, the occultation of the camera 9 that would be caused by physically mounting it on the main optical axis 13 behind the projector 8.
- the distance from the sensor 17 to the splitter plate 20 is less than the distance from the target 12 to the splitter plate 20.
- the camera 9 is thus virtually placed, on the main optical axis 13, in front of the projector 8. This makes it possible to use a sensor 17 of reasonable size (and therefore of reasonable cost), whose field of view is strictly included in the image of the speckle 14, as is visible in FIGS. 8 to 10.
- a preferred dimensioning makes it possible to project, onto a reference plane F located 390 mm from the target 12, a sharp image of the speckle 14 whose large side measures about 350 mm; within this image, the sensor 17 sees a field whose large side is about 300 mm.
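These figures are consistent with an ordinary thin-lens layout. A sketch of the dimensioning arithmetic, in which the 35 mm focal length is a hypothetical value chosen for illustration (the text gives only the 390 mm, 350 mm and 300 mm figures):

```python
# Thin-lens equation: 1/f = 1/d_obj + 1/d_img.
# The target 12 sits d_img behind the focusing optic 11 and is imaged
# sharply on the reference plane F at d_obj = 390 mm.
f_mm, d_obj = 35.0, 390.0                    # f_mm is a hypothetical focal length
d_img = 1.0 / (1.0 / f_mm - 1.0 / d_obj)     # target-to-lens distance, ~38.5 mm
magnification = d_obj / d_img                # ~10.1x
target_side = 350.0 / magnification          # target large side giving a 350 mm image
```

Under this assumed focal length, a target of roughly 34.5 mm on its large side would yield the 350 mm projected image.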
- the apparatus 5 further comprises a manual triggering mechanism (for example of the trigger type), which can be mounted on the handle 7 and which actuates the projector 8 and the camera 9, that is to say, concretely, the projection of the speckle and the shooting.
- the triggering mechanism may be of the flash type, that is to say the projection of the speckle 14 and the shooting are performed punctually and simultaneously, or of the delayed type, that is to say the shooting can be triggered - automatically or manually - during a period in which the projection is performed continuously.
- the device 1 finally comprises a unit 21 for processing the data coming from the sensor 17.
- This processing unit 21 is in practice in the form of a processor embedded in the apparatus 5 or, as illustrated, of a central unit 22 of a remote computer 23 connected to the apparatus 5 via a wired or wireless communication interface; on this processor 21 is implemented a software application for constructing synthetic images from the data supplied by the sensor 17.
- the apparatus 5 is positioned by orienting it so that the projector 8 faces this surface 3, directing the main optical axis 13 towards it (Figure 1), at an estimated distance close to the distance giving a sharp image of the speckle 14 (that is to say, the distance at which the reference plane F would normally lie).
- the triggering mechanism is then actuated to project the image of the speckle 14 of the test pattern 12 onto the surface 3 to be digitized, along the main optical axis 13.
- the image of the speckle 14 on the object 4 presents at least locally, relative to an image projected on the reference plane, distortions due to the reliefs of the illuminated surface 3 (FIG. 6).
- the shooting is then carried out by means of the camera 9 to acquire and memorize the two-dimensional image of the deformed speckle 14' (FIG. 7), this image being reflected successively by the surface 3 of the object 4 and by the splitter plate 20.
- the following operation consists in comparing the image of the deformed speckle 14' with the image of the undistorted speckle 14, as projected on the reference plane F. This comparison can be performed for each point of the speckle 14, or for a selection of predetermined points chosen within the undistorted image.
- Preferably, the comparison is made for each point of the image, that is to say, concretely, for each pixel of the image acquired by the sensor 17. It goes without saying that the accuracy of the construction depends on the accuracy (resolution) of the sensor 17.
- the preferred dimensioning proposed above provides, for a sensor 17 comprising on the order of 9 million pixels, an accuracy on the order of one tenth of a millimetre, sufficient to construct an acceptable synthetic image 2 of any object of human scale.
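The order of magnitude is easy to check. Assuming a 4:3 sensor format (an assumption; the text does not give the aspect ratio), a 9-Mpixel sensor has about 3464 pixels along its large side, so the 300 mm field is sampled at roughly a tenth of a millimetre per pixel:

```python
import math

pixels_total = 9_000_000
aspect = 4 / 3                                        # assumed sensor aspect ratio
pixels_long = int(math.sqrt(pixels_total * aspect))   # ~3464 pixels on the large side
mm_per_pixel = 300.0 / pixels_long                    # ~0.087 mm per pixel
```
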
- the following operation consists in pairing, for the set of selected points, each point of the undeformed speckle 14 with the corresponding point of the deformed speckle 14'.
- this pairing can be achieved by correlation, that is to say by successive approximations within zones which, although presenting local disparities, appear similar in neighboring regions of the two images.
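Such a pairing can be sketched as a window search: around each selected point of the undistorted speckle, the most similar window of the deformed image is sought by maximizing zero-normalized cross-correlation. This is an illustrative implementation; the text does not fix a particular correlation algorithm.

```python
import numpy as np

def match_point(ref, img, p, win=7, search=10):
    """Pair point p=(row, col) of the undistorted speckle `ref` with its
    counterpart in the deformed image `img`, by maximizing zero-normalized
    cross-correlation (ZNCC) of a window over a local search area."""
    r, c = p
    h = win // 2
    t = ref[r - h:r + h + 1, c - h:c + h + 1].astype(float)
    t = (t - t.mean()) / (t.std() + 1e-12)
    best, best_rc = -np.inf, p
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            w = img[r + dr - h:r + dr + h + 1, c + dc - h:c + dc + h + 1].astype(float)
            if w.shape != t.shape:        # window falls off the image edge
                continue
            w = (w - w.mean()) / (w.std() + 1e-12)
            score = (t * w).mean()        # ZNCC score in [-1, 1]
            if score > best:
                best, best_rc = score, (r + dr, c + dc)
    return best_rc
```

On a random speckle shifted by a known amount, the search recovers the shifted position of the selected point.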
- Figure 4 illustrates the geometric method used to perform this calculation.
- the main optical axis 13 and the secondary optical axis 16 are merged, in accordance with the assembly described above.
- the notations are as follows:
- F: reference plane;
- P: point selected in the speckle image as projected on the reference plane;
- O: point of the target whose image is the point P;
- M: image of the point P on the object;
- M': projection of the point M onto the reference plane from the viewpoint of the camera;
- Δx = x(M') − x(P): horizontal component (abscissa) of the shift between M' and P;
- Δy = y(M') − y(P): vertical component (ordinate) of the shift between M' and P.
- the parameters f and AB are known; they depend on the structure of the device 1.
- the parameters x(P) and y(P) can be computed trivially, by a simple translation, from a coordinate system centered on the main optical axis 13 to the system centered on the point B, the perpendicular projection of the point O onto the reference plane F.
- Figures 8 and 9 illustrate, in shades of gray, the deviations Δx and Δy found in the image of the speckle owing to its distortion by projection onto the object 4.
- FIG. 10 illustrates a map of the depths calculated for each point of the speckle image as projected onto the object 4. From the depth coordinate z(M) thus calculated, it is possible to calculate the abscissa x(M) and the ordinate y(M) of the point M in the system (B, x, y, z), using the following formulas:
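The formulas themselves do not survive in this text. As an illustration of the kind of similar-triangles relations the notations support, here is a hedged axial-stereo sketch: the projection centre and the (virtual) camera centre lie on the same axis, at assumed distances D and D − AB from the reference plane. The parameterisation, the 390 mm and 40 mm values, and both functions are assumptions for illustration, not the patent's actual formulas.

```python
def radial_shift(h, rho_p, D=390.0, AB=40.0):
    """Radial shift rho(M') - rho(P) on the reference plane for a point of
    depth h above it. D (projector centre to plane) and AB are assumed values."""
    Dc = D - AB                        # camera centre to reference plane
    rho_m = rho_p * (D - h) / D        # radial position of M along the projector ray
    return rho_m * Dc / (Dc - h) - rho_p

def depth_from_shift(delta, rho_p, D=390.0, AB=40.0):
    """Invert radial_shift to recover the depth coordinate of M."""
    Dc = D - AB
    k = 1.0 + delta / rho_p            # ratio rho(M') / rho(P)
    return D * Dc * (1.0 - k) / (Dc - k * D)
```

A flat point (h = 0) produces no shift, and the two functions are mutual inverses, which is the property the point-by-point comparison relies on.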
- the coordinates of the point M in the reference plane, in a system centered on the optical axis 13, can be deduced from x(M) and y(M) trivially, by a simple translation in the reference plane F. From the complete coordinates thus calculated for each point M, the calculation software can then reconstruct the illuminated face 3 of the object 4 in the form of a synthetic image 2, illustrated in shades of gray in FIGS. 11 to 15.
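Once z(M) and the translated x, y coordinates are available for every pixel, assembling them into a cloud of 3D points is straightforward; a sketch (the 0.087 mm pixel pitch is an illustrative value, not a figure from the text):

```python
import numpy as np

def point_cloud(zmap, mm_per_px=0.087):
    """Turn a per-pixel depth map into (x, y, z) points, in a coordinate
    system centred on the optical axis; the pixel pitch is illustrative."""
    rows, cols = zmap.shape
    y, x = np.mgrid[0:rows, 0:cols].astype(float)
    x = (x - (cols - 1) / 2.0) * mm_per_px   # centre the abscissa on the axis
    y = (y - (rows - 1) / 2.0) * mm_per_px   # centre the ordinate on the axis
    return np.column_stack([x.ravel(), y.ravel(), zmap.ravel()])
```

The resulting (N, 3) array is the form most meshing or CAD-import tools expect.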
- the operations just described are repeated for a plurality of adjacent surfaces of the object.
- the synthetic images of all the digitized surfaces are then assembled, the overlapping zones of two adjacent surfaces making it possible, for example by means of an image correlation technique, to perform a precise stitching of these surfaces along their edges.
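The stitching of two adjacent, overlapping digitized surfaces can likewise be sketched with a correlation-style search; here a toy one-dimensional version that finds the column offset minimizing the squared difference over the overlap (illustrative only; the text names image correlation merely as one possible technique).

```python
import numpy as np

def stitch_offset(a, b, max_shift=8):
    """Column offset by which depth map `b` overlaps depth map `a`,
    found by minimizing the mean squared difference over the overlap."""
    best_err, best_dx = np.inf, 0
    for dx in range(1, max_shift + 1):
        err = np.mean((a[:, dx:] - b[:, :-dx]) ** 2)
        if err < best_err:
            best_err, best_dx = err, dx
    return best_dx
```
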
- the device 1 and method described above provide a number of advantages.
- the prior acquisition of the undeformed speckle 14 projected on the reference plane F makes it possible to have a second - virtual - camera, whose vision provides a reference from which are made the comparisons leading to the mapping of the three-dimensional surface (in relief) to be digitized.
- placing the camera 9 on the optical axis 13 of the projector 8 - by interposing a splitter plate - makes it possible to benefit from the advantages of axial stereovision, that is to say to avoid the occultation phenomena that would be encountered if the camera 9 were angularly offset relative to the axis 13 of the projector 8: in such a configuration, some areas of the relief of the surface 3 to be digitized would be illuminated from the point of view of the projector while remaining in shadow from the point of view of the camera.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0706822A FR2921719B1 (fr) | 2007-09-28 | 2007-09-28 | Procede de construction d'une image de synthese d'une surface tridimensionnelle d'un objet physique |
PCT/FR2008/001340 WO2009074751A2 (fr) | 2007-09-28 | 2008-09-26 | Procédé de numérisation tridimensionnelle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2195609A2 true EP2195609A2 (fr) | 2010-06-16 |
Family
ID=39386375
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08860393A Withdrawn EP2195609A2 (fr) | 2007-09-28 | 2008-09-26 | Procédé de numérisation tridimensionnelle |
Country Status (5)
Country | Link |
---|---|
US (1) | US8483477B2 (fr) |
EP (1) | EP2195609A2 (fr) |
JP (1) | JP2010541058A (fr) |
FR (1) | FR2921719B1 (fr) |
WO (1) | WO2009074751A2 (fr) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IT1398460B1 (it) * | 2009-06-09 | 2013-02-22 | Microtec Srl | Metodo per la classificazione di prodotti alimentari in grado di rotolare sulla propria superficie esterna, quali frutta ed ortaggi. |
US8908958B2 (en) * | 2009-09-03 | 2014-12-09 | Ron Kimmel | Devices and methods of generating three dimensional (3D) colored models |
FR2950140B1 (fr) | 2009-09-15 | 2011-10-21 | Noomeo | Procede de numerisation tridimensionnelle comprenant une double mise en correspondance |
FR2950157A1 (fr) | 2009-09-15 | 2011-03-18 | Noomeo | Procede de numerisation tridimensionnelle d'une surface comprenant laprojection d'un motif combine |
JP2012002780A (ja) * | 2010-06-21 | 2012-01-05 | Shinko Electric Ind Co Ltd | 形状計測装置、形状計測方法、および半導体パッケージの製造方法 |
TWI428568B (zh) * | 2010-09-03 | 2014-03-01 | Pixart Imaging Inc | 測距方法、測距系統與其處理軟體 |
MX2013007948A (es) * | 2011-01-07 | 2013-11-04 | Landmark Graphics Corp | Sistemas y metodos para la construccion de cuerpos cerrados durante el modelado 3d. |
US9800795B2 (en) * | 2015-12-21 | 2017-10-24 | Intel Corporation | Auto range control for active illumination depth camera |
JP6780315B2 (ja) * | 2016-06-22 | 2020-11-04 | カシオ計算機株式会社 | 投影装置、投影システム、投影方法及びプログラム |
CN108759720B (zh) * | 2018-06-07 | 2020-02-21 | 合肥工业大学 | 光滑表面面型测量方法 |
US11783506B2 (en) * | 2020-12-22 | 2023-10-10 | Continental Autonomous Mobility US, LLC | Method and device for detecting a trailer angle |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5003166A (en) * | 1989-11-07 | 1991-03-26 | Massachusetts Institute Of Technology | Multidimensional range mapping with pattern projection and cross correlation |
US5835241A (en) * | 1996-05-30 | 1998-11-10 | Xerox Corporation | Method for determining the profile of a bound document with structured light |
US6377353B1 (en) * | 2000-03-07 | 2002-04-23 | Pheno Imaging, Inc. | Three-dimensional measuring system for animals using structured light |
FR2842591B1 (fr) * | 2002-07-16 | 2004-10-22 | Ecole Nale Sup Artes Metiers | Dispositif pour mesurer des variations dans le relief d'un objet |
US7929752B2 (en) * | 2003-10-31 | 2011-04-19 | Nano Picture Co., Ltd. | Method for generating structured-light pattern |
US7330577B2 (en) * | 2004-01-27 | 2008-02-12 | Densys Ltd. | Three-dimensional modeling of the oral cavity by projecting a two-dimensional array of random patterns |
GB2410794A (en) | 2004-02-05 | 2005-08-10 | Univ Sheffield Hallam | Apparatus and methods for three dimensional scanning |
US20060017720A1 (en) * | 2004-07-15 | 2006-01-26 | Li You F | System and method for 3D measurement and surface reconstruction |
US7609865B2 (en) * | 2004-11-08 | 2009-10-27 | Biomagnetics | 3D fingerprint and palm print data model and capture devices using multi structured lights and cameras |
CN101288105B (zh) * | 2005-10-11 | 2016-05-25 | 苹果公司 | 用于物体重现的方法和系统 |
WO2007105205A2 (fr) * | 2006-03-14 | 2007-09-20 | Prime Sense Ltd. | Détection tridimensionnelle au moyen de formes de tacheture |
JP4917615B2 (ja) * | 2006-02-27 | 2012-04-18 | プライム センス リミティド | スペックルの無相関を使用した距離マッピング(rangemapping) |
CN101957994B (zh) * | 2006-03-14 | 2014-03-19 | 普莱姆传感有限公司 | 三维传感的深度变化光场 |
JP5592070B2 (ja) * | 2006-03-14 | 2014-09-17 | プライム センス リミティド | 三次元検知のために深度変化させる光照射野 |
- 2007
- 2007-09-28 FR FR0706822A patent/FR2921719B1/fr active Active
- 2008
- 2008-09-26 WO PCT/FR2008/001340 patent/WO2009074751A2/fr active Application Filing
- 2008-09-26 JP JP2010526336A patent/JP2010541058A/ja active Pending
- 2008-09-26 US US12/680,557 patent/US8483477B2/en active Active
- 2008-09-26 EP EP08860393A patent/EP2195609A2/fr not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See references of WO2009074751A2 * |
Also Published As
Publication number | Publication date |
---|---|
WO2009074751A3 (fr) | 2009-08-06 |
US20110170767A1 (en) | 2011-07-14 |
FR2921719B1 (fr) | 2010-03-12 |
FR2921719A1 (fr) | 2009-04-03 |
JP2010541058A (ja) | 2010-12-24 |
WO2009074751A2 (fr) | 2009-06-18 |
US8483477B2 (en) | 2013-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2195609A2 (fr) | Procédé de numérisation tridimensionnelle | |
JP5928792B2 (ja) | 非接触式高分解能の手形取り込みのための装置及び方法 | |
EP2979059B1 (fr) | Module portable de mesure de lumière structurée ayant un dispositif de décalage de motif incorporant une optique à motif fixe pour l'éclairage d'un objet examiné | |
EP2646789B1 (fr) | Procédé de détermination d'au moins une caractéristique de réfraction d'une lentille ophtalmique | |
CA2402618A1 (fr) | Dispositif d'acquisition d'image panoramique | |
BE1017316A7 (fr) | Appareil pour determiner le forme d'une gemme. | |
Reshetouski et al. | Mirrors in computer graphics, computer vision and time-of-flight imaging | |
Rachakonda et al. | Sources of errors in structured light 3D scanners | |
WO2010072912A1 (fr) | Dispositif de numerisation tridimensionnelle a reconstruction dense | |
WO2009074750A2 (fr) | Dispositif de numérisation tridimensionnelle | |
WO2004017020A1 (fr) | Dispositif pour mesurer des variations dans le relief d'un objet | |
WO2008023196A1 (fr) | Appareil d'enregistrement et d'affichage d'images tridimensionnelles | |
FR2951564A1 (fr) | Procede et installation d'analyse de parametres geometriques d'un objet | |
FR2672119A1 (fr) | Systeme de mesure de surfaces a trois dimensions devant etre representees mathematiquement ainsi qu'un procede de mesure et un gabarit d'etalonnage du systeme de mesure. | |
US9208369B2 (en) | System, method and computer software product for searching for a latent fingerprint while simultaneously constructing a three-dimensional topographic map of the searched space | |
FR3061301A1 (fr) | Procede d'observation d'un objet | |
US20160320312A1 (en) | Gemstone imaging apparatus | |
WO2012035257A1 (fr) | Dispositif et procédé de mesure de la forme d'un miroir ou d'une surface speculaire | |
KR101792343B1 (ko) | 마이크로 렌즈 어레이를 이용한 매트릭스 광원 패턴 조사 적외선 프로젝터 모듈 및 이를 이용한 3차원 스캐너 | |
FR3023384A1 (fr) | Dispositif de visualisation du marquage d'un verre ophtalmique | |
FR2920874A1 (fr) | Installation et procede d'imagerie en luminescence | |
WO2011033186A1 (fr) | Procédé de numérisation tridimensionnelle d'une surface comprenant la projection d'un motif combiné | |
Bonfort et al. | Reconstruction de surfaces réfléchissantes à partir d’images | |
Aguilar et al. | Low cost 3D scanning process using digital image processing | |
US5872631A (en) | Optical two- and three-dimensional measuring of protrusions and convex surfaces |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA MK RS |
|
17P | Request for examination filed |
Effective date: 20100421 |
|
DAX | Request for extension of the european patent (deleted) | ||
19U | Interruption of proceedings before grant |
Effective date: 20131121 |
|
19W | Proceedings resumed before grant after interruption of proceedings |
Effective date: 20151201 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: AABAM |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20161110 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: CONDOR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20170523 |