KR102016413B1 - Apparatus and method for scanning item - Google Patents

Apparatus and method for scanning item

Info

Publication number
KR102016413B1
Authority
KR
South Korea
Prior art keywords
item
information
location information
image
model
Prior art date
Application number
KR1020160001248A
Other languages
Korean (ko)
Other versions
KR20170082077A (en)
Inventor
손성열
김호원
김태준
김기남
김재헌
박창준
박혜선
조규성
최진성
추창우
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to KR1020160001248A priority Critical patent/KR102016413B1/en
Publication of KR20170082077A publication Critical patent/KR20170082077A/en
Application granted granted Critical
Publication of KR102016413B1 publication Critical patent/KR102016413B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An item scanning apparatus and method are disclosed. An item scanning apparatus according to an embodiment of the present invention includes: a location information calculation unit that extracts the portion of the item that still needs to be scanned, using feature points attached to a standard cradle model and a three-dimensional partial model of the previously scanned item, and calculates location information corresponding to that portion; a photographing unit that acquires depth information and texture information corresponding to the location information from images of the item captured with a camera; and a reconstruction unit that reconstructs a 3D model of the item based on the depth information and the texture information.

Description

Item scanning apparatus and method {APPARATUS AND METHOD FOR SCANNING ITEM}

The present invention relates to three-dimensional scanning technology, and more particularly to a technique that uses a model obtained by scanning an item of arbitrary shape to determine whether any portion is missing, and then performs scanning again using location information for the missing portion.

Recently, with the development and spread of sensors capable of measuring depth and the advance of 3D printers, the importance of 3D scanning and reconstruction technology has grown. In particular, as low-cost sensors such as Microsoft's Kinect have become available, applications of object reconstruction are expected to increase.

Meanwhile, three-dimensional reconstruction technology, as represented by Microsoft's KinectFusion, still has several problems. First, to scan an object of arbitrary shape, the sensor must be moved manually, or the object must be moved or rotated during scanning. Even for an expert, accurately reconstructing an object this way takes a long time, so the approach is impractical. In addition, accurately registering data captured from various positions is difficult. Existing techniques find and match common parts between consecutive captures; the quality of such a method depends on the user's experience, and its limited accuracy makes it unsuitable for work requiring precise reconstruction.

In particular, when the user scans while rotating and moving the object by hand, specific parts often fail to be captured, so precise scanning cannot be performed.

In addition, conventional techniques are silent about rescanning missing portions.

Korean Patent Application Publication No. 2015-0060020 discloses a three-dimensional scanning technique. In particular, although it discloses a technique for improving the accuracy of scanning results, it only performs the entire 3D scanning operation again and remains silent about scanning missing portions.

Therefore, given the growing need for three-dimensional scanning technology, a technique for scanning missing parts is required.

It is an object of the present invention to perform accurate scanning without missing parts.

In addition, an object of the present invention is to perform precise scanning by using images photographed from various positions.

An item scanning apparatus according to the present invention for achieving the above objects includes: a location information calculation unit that calculates location information corresponding to a portion of the item that needs to be scanned, based on a determination of that portion; a photographing unit that acquires depth information and texture information corresponding to the location information from images of the item captured with a camera; and a reconstruction unit that reconstructs a 3D model of the item based on the depth information and the texture information.

In this case, the location information calculation unit may determine a portion where scanning was missed, and may determine that the missed portion is a portion that requires scanning.

At this time, the location information calculation unit may determine the portion where scanning was missed based on the result of combining the cradle model extracted from the standard cradle model with the three-dimensional model of the item.

In this case, the location information calculation unit may calculate the location information of the item and the camera based on the location information indicator corresponding to the item and the movement information of the item.

In this case, the photographing unit may correct an error based on the result of comparing the image photographed after moving the item in a predetermined direction with the image photographed before the item was moved.

In addition, an item scanning method according to the present invention includes: calculating location information corresponding to a portion of the item that needs to be scanned, based on a determination of that portion; acquiring depth information and texture information corresponding to the location information using images of the item photographed with a camera; and reconstructing a three-dimensional model of the item based on the depth information and the texture information.

The present invention removes errors by comparing the image before the item moves with the image after the item moves, thereby enabling precise reconstruction.

In addition, since the present invention performs the scanning of the missing part again, precise scanning is possible.

In addition, since the present invention performs scanning by using the location information indicator and the movement information of the item, precise scanning is possible.

In addition, the present invention compares information extracted by using the location information indicator with the movement information of the item, making it possible to correct errors in the relative positional relationship of the cameras.

1 is a block diagram illustrating an item scanning apparatus using image information according to an exemplary embodiment of the present invention.
2 is a diagram illustrating photographing an item in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.
3 is a block diagram illustrating a photographing unit in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.
4 is a flowchart illustrating a method of removing a backlash caused by an error in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.
5 is a block diagram illustrating a location information calculator in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.
6 is a block diagram illustrating a restoration unit in an item scanning apparatus using image information according to an embodiment of the present invention.
7 is a flowchart illustrating an item scanning method using image information according to an exemplary embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings. Repeated descriptions, and descriptions of well-known functions and configurations that may unnecessarily obscure the subject matter of the present invention, will be omitted. The embodiments of the present invention are provided to describe the present invention more completely to those skilled in the art. Accordingly, the shapes and sizes of elements in the drawings may be exaggerated for clarity.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.

1 is a block diagram illustrating an item scanning apparatus using image information according to an exemplary embodiment of the present invention.

Referring to FIG. 1, an item scanning apparatus using image information according to an exemplary embodiment of the present invention includes a location information calculator 110, a photographing unit 120, and a reconstruction unit 130.

Before using the present invention, the relative positions of the depth sensor, the camera, and the turntable must be known. These may be determined using a camera calibration method.
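For instance, if calibration gives each device's pose relative to a shared calibration target, the camera-to-turntable relation follows by composing homogeneous transforms. The following Python sketch illustrates only that composition step; the function names and the synthetic poses are assumptions for illustration, not part of the patent.

```python
import numpy as np

def pose_to_matrix(R, t):
    """Pack a 3x3 rotation and a translation vector into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_cam_target, T_table_target):
    """Pose of the turntable expressed in the camera frame.

    Both inputs map target coordinates into the respective device frame,
    so T_cam_table = T_cam_target @ inv(T_table_target).
    """
    return T_cam_target @ np.linalg.inv(T_table_target)

# Synthetic example: the camera sees the target rotated 90 degrees about z.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T_cam_target = pose_to_matrix(Rz, np.array([0.1, 0.0, 0.5]))
T_table_target = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 0.3]))
T_cam_table = relative_pose(T_cam_target, T_table_target)
```

Once such a relative transform is known, every capture can be placed in a single coordinate frame regardless of how the turntable moves.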

The location information calculation unit 110 calculates location information corresponding to the part requiring the scan based on the determination result of the part requiring the scan in the item.

In this case, the item may mean an object to be scanned.

In this case, the portion that needs to be scanned may mean a portion that must be scanned again because, although the item has been scanned once, the scan precision of that portion falls below a specific value, or the portion is missing.

At this time, the location information calculation unit 110 calculates location information for the portion that requires scanning, and the photographing unit 120 photographs that portion using the location information.

At this time, the location information calculation unit 110 determines a portion that needs to be scanned by using the reconstructed image of the previously scanned item and the standard cradle model on which the item is mounted.

In this case, the standard cradle model means a predefined standard form for the item. For example, if the item is a shoe, the standard cradle model may refer to the standard form of a foot wearing the shoe. This is described further with reference to FIG. 5.

The photographing unit 120 acquires depth information and texture information corresponding to the location information by using an image photographing an item using a camera.

At this time, the photographing unit 120 may photograph the item together with the location information indicator attached above it, and extract the location information of the camera, the location information of the item, and the attitude information of the item based on the captured image.

At this time, since a feature point pattern is drawn on the surface of the location information indicator, the location information of the camera, the location information of the item, and the attitude information of the item may be extracted using the feature point pattern present in the images photographed by the cameras.
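One common way to realize such a pose estimate, assuming a roughly fronto-parallel view of the pattern so the motion is rigid in 2D, is a least-squares rigid alignment (Kabsch/Procrustes) between the pattern's known layout and the detected pixel coordinates. This is a hedged sketch of that idea; the patent does not prescribe this particular algorithm, and all names here are illustrative.

```python
import numpy as np

def estimate_rigid_2d(pattern_pts, image_pts):
    """Least-squares 2D rotation + translation mapping pattern points onto
    detected image points (Kabsch / Procrustes alignment).

    Returns (R, t) with image_pts ~= pattern_pts @ R.T + t.
    """
    mu_p = pattern_pts.mean(axis=0)
    mu_i = image_pts.mean(axis=0)
    H = (pattern_pts - mu_p).T @ (image_pts - mu_i)   # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_i - R @ mu_p
    return R, t

# Synthetic check: rotate a square pattern by 30 degrees and shift it.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
pattern = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
detected = pattern @ R_true.T + np.array([2.0, 3.0])
R_est, t_est = estimate_rigid_2d(pattern, detected)
```

A full 6-DOF attitude estimate would instead use a perspective pose solver, but the centroid-and-SVD structure above is the core of the pixel-coordinate-based alignment the text describes.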

The restoration unit 130 restores the 3D model for the item based on the depth information and the texture information.

At this time, the 3D model may be restored in the form of a three-dimensional mesh. However, it need not be restored as a mesh, and there is no limitation in this respect.

2 is a diagram illustrating photographing an item in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.

Referring to FIG. 2, an item conveying device 204 and an item rotating device 205 are illustrated, with a T-shirt type item placed on an item cradle 211.

The item is placed on the item conveying device 204, and its position can be changed by the item conveying device 204.

In addition, the item may be rotated by the item rotating device 205.

At this time, a location information indicator 212 is provided for the item. A feature point pattern is drawn on the location information indicator 212; the position and rotation information of the item are extracted using the feature point pattern, and depth information and texture information may then be extracted using the position and rotation information of the item.

At this time, in order to correct errors in the measurement result, the location information indicator 212 is connected not to the cradle or the item but to the item transfer device 204 and the item rotating device 205.

At this time, as shown in FIG. 2, the cameras for photographing the item may largely be of two types.

One type is the depth information acquisition camera 201 used to obtain depth information of the item, and the other type is the texture information acquisition camera 202 used to acquire the texture information of the item.

In FIG. 2, only two cameras are shown for convenience, but the number of cameras is not limited; multiple cameras may be installed for precise measurement.

3 is a block diagram illustrating a photographing unit in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.

Referring to FIG. 3, a configuration is illustrated in which the location information calculation unit 110 and the photographing unit 120 exchange data with each other, photograph an item, and calculate location information based on the photographed image.

First, when scanning is started, the position information calculator 110 transmits a signal to the camera 121 for obtaining depth information, the camera 122 for obtaining texture information, and the HW control module 123.

At this time, the HW control module 123 transmits a signal to the item transfer device 124 and the item rotation device 125 based on the signal received from the location information calculation unit 110, thereby changing the position of the item or rotating the item.

At this time, data corresponding to the degree of position change and the degree of rotation of the item are transmitted back to the location information calculation unit 110, which uses them to calculate the location information.

At this time, the image 126 for the depth / color information is generated using the images captured by the depth information acquisition camera 121 and the texture information acquisition camera 122.

In this case, the image 126 of the depth / color information may be transmitted to the location information calculator 110 again.

4 is a flowchart illustrating a method of operating the HW control module in the item scanning apparatus using the image information according to an embodiment of the present invention.

Referring to FIG. 4, first, the HW control module receives an HW control signal (S410).

In addition, the BACKLASH removal operation is performed (S420).

While scanning, a motor is used to move or rotate the item. However, a motor may shift due to external factors such as gravity, inertia, or vibration of the item even when it is nominally stationary. This movement causes an error, commonly referred to as backlash (BACKLASH).

Backlash causes errors during 3D reconstruction, and the HW control module 123 performs the task of removing it.

In this case, the HW control module 123 may transmit a signal to move a predetermined distance or rotate by a predetermined angle in a predefined direction, and then transmit a signal to return to the original position.

At this time, backlash occurs consistently because of the inertia generated while the item moves a certain distance or rotates by a predetermined angle and then returns.

In this case, the HW control module 123 may remove the backlash by comparing the pre-movement image and the post-movement image to extract the backlash that occurred.
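A minimal sketch of that idea: drive the stage through a move-and-return cycle, measure where a tracked marker settles relative to where it started, and subtract that constant offset from later measurements. The marker coordinates, the offset values, and the function names below are assumptions for the example, not values from the patent.

```python
import numpy as np

def estimate_backlash(marker_before, marker_after):
    """Backlash offset: where the marker settles after a move-and-return
    cycle minus where it started. With an ideal stage this is zero."""
    return np.asarray(marker_after) - np.asarray(marker_before)

def compensate(positions, backlash):
    """Subtract the constant backlash offset from measured marker positions."""
    return np.asarray(positions) - backlash

# Synthetic example: after moving out and back, the stage consistently
# settles 0.4 px along +x and 0.1 px along -y from its start position.
before = np.array([100.0, 50.0])
after = np.array([100.4, 49.9])
offset = estimate_backlash(before, after)
corrected = compensate(np.array([[120.4, 60.0]]), offset)
```

This treats backlash as a constant offset, which matches the text's observation that the error "occurs consistently" for a given move-and-return cycle.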

In addition, the operation of transporting and rotating the item is performed (S430).

5 is a block diagram illustrating a location information calculator in an item scanning apparatus using image information according to an exemplary embodiment of the present invention.

Referring to FIG. 5, the location information calculation unit 110 includes a capture planner module 111, a standard cradle model 112, a partial reconstruction module 113, and a camera and item position calculation module 114.

As described previously, the location information calculation unit 110 determines the portion of the item that needs to be scanned, calculates a camera position corresponding to that portion, and then transmits a photographing signal to the photographing unit 120.

The capture planner module 111 compares the standard model of the item extracted from the standard cradle model 112 with the partially reconstructed three-dimensional model from the partial reconstruction module 113 to find parts of the item for which data has not yet been obtained, thereby extracting the portion that needs to be scanned. It may then calculate the camera position for obtaining an image corresponding to that portion, and generate and transmit an HW control signal to the photographing unit 120 based on the calculated result.

The standard cradle model 112 may refer to a set of preset standard forms for items. For example, if the item is a shoe, the standard shape of a human foot can be set as the standard cradle. Likewise, if the item is a top, the cradle may be set to a standard upper-body shape.
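As a hedged illustration of how such a model comparison might find unscanned regions: mark each vertex of the standard model as covered when a scanned point lies within a small radius, and report the uncovered vertices as portions still needing a scan. The threshold and the toy geometry below are assumptions for the example.

```python
import numpy as np

def uncovered_vertices(model_pts, scanned_pts, radius):
    """Indices of standard-model vertices with no scanned point within
    `radius` of them: a simple proxy for portions that still need scanning."""
    # Pairwise distances, shape (n_model, n_scanned).
    d = np.linalg.norm(model_pts[:, None, :] - scanned_pts[None, :, :], axis=2)
    return np.where(d.min(axis=1) > radius)[0]

# Toy standard model: four vertices; the scan only covered the first two.
model = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
scan = np.array([[0.01, 0.0, 0.0], [0.99, 0.01, 0.0]])
missing = uncovered_vertices(model, scan, radius=0.05)
```

The capture planner could then map each uncovered vertex to a camera position that views it, which is the HW control signal the text describes.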

In this case, the partial reconstruction module 113 may generate a partially reconstructed three-dimensional model of the item from the camera position information, the item position information, and the depth / color image obtained from the camera and the item position calculation module 114.

In this case, the camera and item position calculation module 114 may measure the relative positions of the item and the cameras by using the HW parameters and the image of the location information indicator captured by the photographing unit 120.

At this time, the camera and item position calculation module 114 may extract a predefined feature point pattern from the image of the location information indicator, and calculate the position and attitude of the item using the pixel-coordinate-based positional relationship of the feature points.

At this time, the camera and item position calculation module 114 extracts a predefined feature point pattern from the captured image of the location information indicator, extracts the relative positional relations of the cameras using the pixel-coordinate-based positional relationship of the feature points, and may extract the relative errors by comparing these positional relations with the positional relations obtained by a camera calibration technique.

At this time, if the extracted error exceeds a specific value, the user may be informed that camera calibration is required.

6 is a block diagram illustrating a restoration unit in an item scanning apparatus using image information according to an embodiment of the present invention.

Referring to FIG. 6, the restoration unit includes a 3D appearance restoration unit 131, a texture generation unit 132, a cradle removal unit 133, an illumination effect removal unit 134, and a 3D model construction unit 135.

First, the 3D appearance and texture are restored based on the depth / color information image 126 and the position information extracted from the position information calculator 110.

In this case, the 3D appearance may be performed by the 3D appearance restoration unit 131, and the texture generation operation may be performed by the texture generation unit 132.

At this time, the cradle removal unit 133 may remove the cradle in the three-dimensional appearance.

In this case, the lighting effect remover 134 may remove the lighting effect from the texture.

At this time, the 3D model construction unit 135 may generate a final 3D model.

7 is a flowchart illustrating an item scanning method using image information according to an exemplary embodiment of the present invention.

First, referring to FIG. 7, a portion requiring a scan is extracted (S710).

In this case, the item may mean an object to be scanned.

In this case, the portion that needs to be scanned may mean a portion that must be scanned again because, although the item has been scanned once, the scan precision of that portion falls below a specific value, or the portion is missing.

In addition, the position information corresponding to the portion that needs to be scanned is calculated (S720).

At this time, the item and the location information indicator attached above it are photographed, and the location information of the camera, the location information of the item, and the attitude information of the item can be extracted based on the captured image.

At this time, since a feature point pattern is drawn on the surface of the location information indicator, the location information of the camera, the location information of the item, and the attitude information of the item may be extracted using the feature point pattern present in the images photographed by the cameras.

At this time, the position information of the item may be extracted by combining the degree of rotation by the item rotating apparatus, the degree of movement by the item moving apparatus, and the position information extracted from the captured image.
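A minimal sketch of this combination, assuming the rotating apparatus turns about the z axis and the moving apparatus contributes a translation: compose the commanded stage motion with the pose measured from the captured image. Function names and numeric values are illustrative assumptions.

```python
import numpy as np

def stage_transform(angle_rad, translation):
    """Homogeneous transform for a turntable rotation about z plus a
    translation contributed by the conveying device."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = translation
    return T

def item_pose(T_measured, angle_rad, translation):
    """Compose the commanded stage motion with the pose measured from the
    location information indicator image."""
    return stage_transform(angle_rad, translation) @ T_measured

# Example: a half turn of the turntable plus a 0.2 m shift along x,
# applied to an identity pose measured from the indicator.
T0 = np.eye(4)
T1 = item_pose(T0, np.pi, np.array([0.2, 0.0, 0.0]))
```

Comparing the composed pose with the pose re-measured from the indicator is also what makes the error check in the preceding sections possible: any residual between the two reflects backlash or calibration drift.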

In addition, depth information and texture information corresponding to the position information are obtained using an image photographed using the camera (S730).

In this case, the cameras used in operation S730 include a camera for obtaining depth information and a camera for obtaining texture information.

In addition, the 3D model for the item is restored based on the depth information and the texture information (S740).

At this time, the 3D model may be restored to a 3D mesh (MESH) form. However, it is not necessarily restored to the mesh form, and there is no limitation thereto.

As described above, the item scanning apparatus and method according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

201: Camera for obtaining depth information
202: Camera for obtaining texture information
204: Item transfer device
205: Item rotating device
211: Item cradle
212: Location information indicator

Claims (10)

A location information calculation unit that extracts a portion that needs to be scanned from a three-dimensional model of the previously scanned item by using feature points attached to a cradle, and calculates location information corresponding to the portion that needs to be scanned;
A photographing unit obtaining depth information and texture information corresponding to the location information based on an image photographing the item using a camera; And
Restoration unit for restoring the 3D model for the item based on the depth information and the texture information
Including,
The photographing unit
The error is corrected based on a result of comparing the image photographed after moving the item in a preset direction with the image photographed before the item is moved,
The error is
An item scanning apparatus using image information, characterized in that the error corresponds to an error due to backlash (BACKLASH) generated in the motor used to move the item.
The apparatus according to claim 1,
The location information calculation unit
And determining a portion where scanning was missed, and determining that the missed portion is a portion that requires scanning.
The apparatus according to claim 2,
The location information calculation unit
An item scanning apparatus using image information, characterized in that the missed portion is determined based on a result of combining the cradle model extracted from the standard cradle model with the three-dimensional model of the item.
The apparatus according to claim 3,
The location information calculation unit
And calculating the location information of the item and the camera based on the location information indicator corresponding to the item and the movement information of the item.
delete
An item scanning method using an item scanning device,
Calculating location information corresponding to the portion of the item that needs to be scanned, based on a determination result of the portion of the item that needs to be scanned;
Acquiring depth information and texture information corresponding to the location information using an image of the item photographed using a camera; And
Restoring a 3D model for the item based on the depth information and the texture information
Including,
The acquiring step
The error is corrected based on a result of comparing the image photographed after moving the item in a preset direction with the image photographed before the item is moved,
The error is
And an error due to a backlash occurring in the motor used to move the item.
The method according to claim 6,
The calculating step
And determining a portion where scanning was missed, and determining that the missed portion is a portion that requires scanning.
The method according to claim 7,
The calculating step
An item scanning method using image information, characterized in that the missed portion is determined based on a result of combining the cradle model extracted from the standard cradle model with the three-dimensional model of the item.
The method according to claim 8,
The calculating step
And calculating location information of the item and the camera based on the location information indicator corresponding to the item and the movement information of the item.
delete
KR1020160001248A 2016-01-05 2016-01-05 Apparatus and method for scanning item KR102016413B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160001248A KR102016413B1 (en) 2016-01-05 2016-01-05 Apparatus and method for scanning item

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160001248A KR102016413B1 (en) 2016-01-05 2016-01-05 Apparatus and method for scanning item

Publications (2)

Publication Number Publication Date
KR20170082077A KR20170082077A (en) 2017-07-13
KR102016413B1 true KR102016413B1 (en) 2019-09-02

Family

ID=59352439

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160001248A KR102016413B1 (en) 2016-01-05 2016-01-05 Apparatus and method for scanning item

Country Status (1)

Country Link
KR (1) KR102016413B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003302211A (en) * 2002-04-11 2003-10-24 Canon Inc Three-dimensional image processing unit and method
KR100609004B1 (en) 2006-02-03 2006-08-03 (자)한진개발공사 Method for recovering topography using digital photogrammetry technique
JP2011198349A (en) * 2010-02-25 2011-10-06 Canon Inc Method and apparatus for processing information

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990003333A (en) * 1997-06-25 1999-01-15 배순훈 Distance information acquisition device in 3D shape restoration system
KR100920225B1 (en) * 2007-12-17 2009-10-05 한국전자통신연구원 Method and apparatus for accuracy measuring of?3d graphical model by using image
KR102184766B1 (en) * 2013-10-17 2020-11-30 삼성전자주식회사 System and method for 3D model reconstruction

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003302211A (en) * 2002-04-11 2003-10-24 Canon Inc Three-dimensional image processing unit and method
KR100609004B1 (en) 2006-02-03 2006-08-03 (자)한진개발공사 Method for recovering topography using digital photogrammetry technique
JP2011198349A (en) * 2010-02-25 2011-10-06 Canon Inc Method and apparatus for processing information

Also Published As

Publication number Publication date
KR20170082077A (en) 2017-07-13

Similar Documents

Publication Publication Date Title
US9672630B2 (en) Contour line measurement apparatus and robot system
JP6429772B2 (en) 3D scanning and positioning system
CN111060023B (en) High-precision 3D information acquisition equipment and method
CN110140347B (en) Depth image supply device and method
JP6363863B2 (en) Information processing apparatus and information processing method
US20110249117A1 (en) Imaging device, distance measuring method, and non-transitory computer-readable recording medium storing a program
CN109377551B (en) Three-dimensional face reconstruction method and device and storage medium thereof
US9230330B2 (en) Three dimensional sensing method and three dimensional sensing apparatus
JP6380667B2 (en) Shape measuring apparatus and shape measuring method
JP7269874B2 (en) How to process multiple regions of interest independently
JP6282098B2 (en) Calibration apparatus and method
CN107517346B (en) Photographing method and device based on structured light and mobile device
TWI672937B (en) Apparatus and method for processing three dimensional images
CN111445529B (en) Calibration equipment and method based on multi-laser ranging
JP4419570B2 (en) 3D image photographing apparatus and method
US20210150744A1 (en) System and method for hybrid depth estimation
CN105844701A (en) Sequential-image three-dimensional modeling method
US20240087167A1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
JP2021106025A5 (en)
JP6653143B2 (en) Method and apparatus for measuring 3D coordinates of an object
EP3371780A1 (en) System and methods for imaging three-dimensional objects
WO2019087253A1 (en) Stereo camera calibration method
KR102016413B1 (en) Apparatus and method for scanning item
JP6486083B2 (en) Information processing apparatus, information processing method, and program
US11481917B2 (en) Compensation of three-dimensional measuring instrument having an autofocus camera

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant