KR101693259B1 - 3D modeling and 3D geometry production techniques using 2D image - Google Patents


Info

Publication number
KR101693259B1
KR101693259B1
Authority
KR
South Korea
Prior art keywords
image
modeling
depth
present
Prior art date
Application number
KR1020150085649A
Other languages
Korean (ko)
Other versions
KR20160148885A (en)
Inventor
하찬효
하동효
이창우
예희영
Original Assignee
(주)유니드픽쳐
하찬효
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)유니드픽쳐, 하찬효 filed Critical (주)유니드픽쳐
Priority to KR1020150085649A priority Critical patent/KR101693259B1/en
Publication of KR20160148885A publication Critical patent/KR20160148885A/en
Application granted granted Critical
Publication of KR101693259B1 publication Critical patent/KR101693259B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201Dedicated interfaces to print systems
    • G06F3/1202Dedicated interfaces to print systems specifically adapted to achieve a particular effect
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

The present invention analyzes a 2D image transmitted from a PC or a portable terminal such as a mobile phone, edits it, and applies the depth values obtained from the analysis to a 3D editing tool to form a 3D model, producing a three-dimensional shape that carries the colors of the original image. The invention thus relates to a 3D modeling method and a three-dimensional shape production technique using a 2D image.

Description

BACKGROUND ART: 3D modeling and three-dimensional shape production techniques using a 2D image

The present invention relates to 3D modeling and three-dimensional shape production techniques using a 2D image. More particularly, it relates to a 3D modeling method and a three-dimensional shape production method in which a 2D image is analyzed, edited, and assigned depth values so that a 3D model can be generated from it.

Conventional 3D printers were originally developed for prototyping products before commercialization. At first the technology was confined to plastic materials; it later expanded to nylon and metal, and has since been commercialized in various fields beyond industrial prototyping.

In general, 3D printers are divided, according to how they form a three-dimensional shape, into an additive type (also called rapid prototyping), which builds the shape up layer by layer, and a subtractive type (computer numerically controlled carving), which cuts the shape out of a larger block. The object to be reproduced may be a person or a thing.

Fabricating a three-dimensional shape proceeds in three steps: modeling, printing, and finishing.

Modeling is the step of producing a 3D drawing, which is created with 3D CAD (computer-aided design) software, a 3D modeling program, a 3D scanning device, or the like. Except in fields such as construction or automobile manufacturing, where the design exists in 3D from the beginning, a 3D scanning device is generally used. A 3D scanner reproduces the shape of a real object on the monitor in three dimensions: it acquires three-dimensional data by scanning the whole object, by projecting light such as a laser onto the object and measuring the reflection, or by synthesizing photographs taken from several views, including from above and below. The conventional scanning approach, however, has drawbacks: the user must visit a place equipped with a scanner, and the quality of the scan data varies with the performance of the device. For example, when a person is scanned, the hair region is often missing from the scan data, and correct 3D data cannot be generated from such a scan.

Therefore, an invention is needed that allows correct 3D data to be generated regardless of the performance of any scanning device.

Korean Patent Application Publication No. 10-2015-0046557
Korean Patent Registration No. 10-1495810

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems of the prior art. Its object is to provide a 3D modeling method and a three-dimensional shape production technique that can create a three-dimensional shape of any object from an ordinary 2D photograph, such as one received through KakaoTalk, a text message, or e-mail.

According to an embodiment of the present invention, there is provided a method comprising: a 2D image correction step of correcting the shadows and colors of a 2D image; a 2D image analysis step of setting depth values for the corrected image using a commercial image editing tool; a 3D modeling step of generating a 3D image by applying the values calculated in the 2D image analysis step to a 3D editing tool; and a product completion step of completing the product by matching the 3D image formed in the 3D modeling step with the original 2D image.
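These four steps (S100 to S400) can be pictured as a pipeline. The sketch below is a minimal illustration, not the patented implementation: every function name is hypothetical, and each stand-in reduces a step that would in practice involve an image editor (S100, S200) or a 3D editing tool (S300) to a few lines of pure Python.

```python
# Hypothetical stand-ins for the four steps; a 2D image is modeled
# here as a list of rows of 8-bit luminance values.
def correct_image(img):
    # S100: shadow removal / color correction (identity stand-in)
    return img

def analyze_image(img):
    # S200: assign a depth value (0 or 100) per pixel region
    return [[100 if px > 127 else 0 for px in row] for row in img]

def generate_3d(depth_map):
    # S300: Z-only displacement; X and Y stay on the pixel grid
    return [(x, y, d) for y, row in enumerate(depth_map)
                      for x, d in enumerate(row)]

def match_texture(mesh, img):
    # S400: each vertex takes the color of its original pixel
    return [((x, y, z), img[y][x]) for (x, y, z) in mesh]

def make_3d_product(img):
    corrected = correct_image(img)       # S100
    depth_map = analyze_image(corrected) # S200
    mesh = generate_3d(depth_map)        # S300
    return match_texture(mesh, img)      # S400

product = make_3d_product([[0, 255], [128, 64]])
```

The result pairs each displaced vertex with its original color, which is exactly the invariant the product completion step relies on.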

In a preferred embodiment of the present invention, the 2D image analyzing step divides the corrected 2D image into regions and calculates depth values through shading.

In a preferred embodiment of the present invention, the 2D image analyzing step forms a displace-map according to shading that occurs in different regions.

According to a preferred embodiment of the present invention, the displace-map is configured such that black is set to 0 and white is set to 100, and black and white are divided into six levels.

In a preferred embodiment of the present invention, the 3D modeling generation step generates a 3D image whose displacement is formed only along the Z axis, using the depth values.

As a preferred embodiment of the present invention, the 3D modeling generation step may restrict pixel movement along the X and Y axes, leaving only the Z-axis value free, thereby maintaining the original shape.

In a preferred embodiment of the present invention, the product completing step is a step of matching the 2D image to the 3D formed image and restoring the same color as the original.

With this configuration, the present invention achieves the following effects.

In the 3D modeling and three-dimensional shape production technique using a 2D image according to the present invention, a 3D model can be formed by analyzing and editing a 2D photograph transmitted from a PC or mobile device. This removes the spatial constraint of providing a scanning device: there is no need to visit a photographing site equipped with a scanner, which saves cost and time.

In addition, since the present invention uses an ordinary 2D image instead of a scanning device, it can form a three-dimensional shape of a bust, a half body, a whole body, or any other object. The work is more precise and accurate than manual sculpting in clay, and the colors of the original can be expressed directly.

FIG. 1 is a block diagram schematically illustrating a 3D modeling and three-dimensional shape production technique using a 2D image according to an embodiment of the present invention;
FIG. 2 is a view schematically showing an example of forming a displace-map in the 3D modeling and three-dimensional shape production technique using a 2D image according to an embodiment of the present invention;
FIG. 3 is a view schematically showing how the levels applied to the displace-map are constructed in the 3D modeling and three-dimensional shape production technique using a 2D image according to an embodiment of the present invention;
FIG. 4 is a view schematically showing a 3D image to which Z values have been applied in the 3D modeling and three-dimensional shape production technique using a 2D image according to an embodiment of the present invention;
FIG. 5 is a view schematically showing an example of detail rendering using a 3D editing tool in the 3D modeling and three-dimensional shape production technique using a 2D image according to an embodiment of the present invention;
FIG. 6 is a view schematically showing an example in which a product is completed by matching the generated 3D image with the original image in the 3D modeling and three-dimensional shape production technique using a 2D image according to an embodiment of the present invention.

Hereinafter, the 3D modeling and three-dimensional shape production technique using a 2D image according to the present invention will be described in detail with reference to the accompanying drawings. Where possible, like elements in the drawings are denoted by the same reference numerals. Detailed descriptions of well-known functions or constructions are omitted where they would obscure the invention with unnecessary detail.


To produce 3D objects, the present invention requires facilities such as a 3D printer and an editing device so that the monuments desired by customers can be produced directly. As shown in FIG. 1, the method comprises a 2D image correction step (S100), a 2D image analysis step (S200), a 3D modeling generation step (S300), and a product completion step (S400).

The 2D image correction step (S100) corrects the image of the object requested by the user. The user transmits the image from a terminal (a PC, PDA, smartphone, or the like) by message or e-mail. Here, an image means any 2D representation such as a photograph or drawing. The received image is corrected so that only the content needed for three-dimensional shape production remains: shadows are removed, and colors represented by blurred lines or points are corrected without damaging the original image.

The 2D image analysis step (S200) analyzes the image produced by the 2D image correction step (S100) using a commonly available image editing tool and calculates a depth value for each analyzed region. The image is divided into regions, and a depth value is set so that shading can be generated per region. For example, as shown in FIG. 2, to produce a full-length figure, the image is divided into the face, neck, upper body, lower body, and so on; shades are created for each region, forming a displace-map.

In the displace-map, black is set to 0 and white to 100, with the range between them divided into six levels; a depth value (depth-rate) is designated according to each shade. This depth value is later provided as the Z-axis value in the 3D modeling generation step (S300) described below.
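As a rough illustration of this six-level quantization, a shade can be read as an 8-bit luminance and snapped to one of six depth-rates between 0 and 100. This is a sketch under that assumption; `depth_rate` is a hypothetical helper, not part of the patent.

```python
def depth_rate(luminance, levels=6):
    """Quantize an 8-bit luminance (0=black, 255=white) into one of
    `levels` discrete depth-rates between 0 and 100."""
    if not 0 <= luminance <= 255:
        raise ValueError("luminance must be in 0..255")
    level = min(levels - 1, luminance * levels // 256)   # 0..5
    return round(level * 100 / (levels - 1))             # 0, 20, 40, 60, 80, 100

# A displace-map is then this mapping applied per pixel:
print([depth_rate(v) for v in [0, 60, 128, 200, 255]])  # -> [0, 20, 60, 80, 100]
```

Black thus lands exactly on depth-rate 0 and white on 100, matching the convention stated above.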

Accordingly, as shown in FIG. 3, the depth-rate assigns a height to each shade calculated in the displace-map. An area painted black corresponds to a luminance of 0 and a height of 0; an area painted white corresponds to a luminance of 255, with the maximum white height limited to 100. The gray gradations between black and white are chosen according to how the subject should be rendered. In the example of FIG. 3, a side view of a human bust, the line around the ear, represented in black, is set to a height of 0, and the tip of the nose is set to the maximum height of 100.

The 3D modeling generation step (S300) forms a 3D modeled image by supplying the displace-map and depth-rate calculated in the 2D image analysis step (S200) to the 3D editing tool, which produces the three-dimensional image by projecting each pixel along the Z axis by the depth value assigned to its shade in the displace-map.
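A minimal sketch of this Z-only displacement, assuming the displace-map is a 2D list of depth-rates (0 to 100) and each pixel becomes one vertex; `displace_mesh` is a hypothetical name, not the 3D tool's actual API.

```python
def displace_mesh(depth_map, max_height=100.0):
    """Build vertices for a grid mesh in which each pixel of the
    displace-map contributes only a Z offset; X and Y remain on
    the pixel grid."""
    verts = []
    for y, row in enumerate(depth_map):
        for x, d in enumerate(row):
            z = d / 100.0 * max_height  # depth-rate 0..100 -> height
            verts.append((float(x), float(y), z))
    return verts

# A black pixel (rate 0) stays flat; a white pixel (rate 100)
# rises to the full height:
mesh = displace_mesh([[0, 100], [50, 0]], max_height=10.0)
```

In a real 3D editing tool the same idea is usually expressed as a displacement modifier driven by a grayscale texture.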

In general, when working in a 3D editing tool without such a base, a surface formed of millions of polygons must be pushed and pulled repeatedly, which takes a great deal of time, and the original image applied to the surface may become distorted or lose resolution. In the present invention, the displace-map and depth values supplied to the 3D editing tool prevent the original image from being damaged and shorten the working time.

When details are then sculpted with the 3D editing tool, a constraint restricts the movement of pixels along the X and Y axes, leaving only the Z-axis value, to which the depth value is applied, free to change; the original shape is thereby preserved.
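The constraint can be pictured as snapping every sculpting edit back onto the vertex's original X/Y position, so only the depth component survives. This is a hypothetical helper for illustration, not the actual instruction used in the 3D tool.

```python
def constrain_to_z(original, edited):
    """Return `edited` with its X and Y snapped back to `original`,
    so a sculpting edit can only change the Z (depth) component."""
    ox, oy, _ = original
    _, _, ez = edited
    return (ox, oy, ez)

# An edit that tried to drag a vertex sideways keeps only its
# depth change:
print(constrain_to_z((1.0, 2.0, 0.0), (1.5, 2.5, 7.0)))  # -> (1.0, 2.0, 7.0)
```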

The product completion step (S400) completes the product by matching the 3D image formed in the 3D modeling generation step (S300) with the original 2D image, restoring the same colors as the original.
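Because the mesh keeps the pixel grid's X/Y coordinates, this color restoration reduces to a planar projection: each vertex simply takes the color of the pixel it came from. A sketch with a hypothetical `texture_match` helper:

```python
def texture_match(verts, image):
    """Pair each grid vertex with the color of the original 2D image
    pixel it came from (a trivial planar UV projection)."""
    return [((x, y, z), image[int(y)][int(x)]) for (x, y, z) in verts]

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 50.0)]
image = [[10, 20]]  # one row of two luminance values
print(texture_match(verts, image))
```

In practice the image would carry RGB values and the pairing would be stored as UV coordinates, but the mapping is the same.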

The matched image can then be printed on a 3D printer by either the additive method (rapid prototyping, building the shape up in layers) or the subtractive method (computer numerically controlled carving, cutting the shape from a larger block).

The present invention analyzes a 2D image, calculates a displace-map and depth-rate, and provides them to a 3D editing tool to form a 3D image file. Any shape a customer desires, such as a figure, a memorial plaque, a trophy, or a bust, can thus be produced, and because the original image is matched to the 3D image, the original colors are expressed as they are, improving the finish of the product.

While the applicant has described embodiments of the present invention, these are merely examples of its technical idea; any change or modification that implements that technical idea should be interpreted as falling within the scope of the present invention.

S100: 2D image correction step
S200: 2D image analysis step
S300: 3D modeling creation step
S400: Product Completion Step

Claims (7)

A 2D image correction step of correcting the 2D image, within a range that does not damage the original, by removing shadows and correcting colors represented by blurred lines or points so that only the content necessary for three-dimensional shape production remains;
A 2D image analysis step of setting a depth value of the corrected image using a commercial image editing tool;
A 3D modeling step of generating a 3D image by applying the values calculated in the 2D image analysis step to a 3D editing tool;
And a product completion step of matching the 3D image formed in the 3D modeling step with the 2D image and restoring the same colors as the original to complete the product,
Wherein the 2D image analysis step divides the corrected 2D image into regions, calculates a depth value from the shading of each region, and forms a displace-map according to the different shades of the regions; in the displace-map, black is set to level 0 and white to level 100, the range between black and white is divided into six levels, and a depth-rate is designated according to the divided level so that it can be provided as a Z-axis value,
And wherein the 3D modeling step generates a 3D image displaced only along the Z axis using the depth values, restricting pixel movement in the X and Y axes so as to maintain the original shape.
Claims 2 to 7: deleted.
KR1020150085649A 2015-06-17 2015-06-17 3D modeling and 3D geometry production techniques using 2D image KR101693259B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150085649A KR101693259B1 (en) 2015-06-17 2015-06-17 3D modeling and 3D geometry production techniques using 2D image


Publications (2)

Publication Number Publication Date
KR20160148885A KR20160148885A (en) 2016-12-27
KR101693259B1 (en) 2017-01-10

Family

ID=57736934

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150085649A KR101693259B1 (en) 2015-06-17 2015-06-17 3D modeling and 3D geometry production techniques using 2D image

Country Status (1)

Country Link
KR (1) KR101693259B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200065638A (en) 2018-11-30 2020-06-09 황욱철 Scaffold design modeling method and system
KR20230099545A (en) 2021-12-27 2023-07-04 동의대학교 산학협력단 Apparatus and Method of 3D modeling and calculating volume of moving object using a stereo 3D depth camera

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180096372A (en) 2017-02-21 2018-08-29 엘아이지넥스원 주식회사 Method for generating 3D model
KR102259509B1 (en) 2019-08-22 2021-06-01 동의대학교 산학협력단 3d modeling process based on photo scanning technology
KR102303566B1 (en) 2020-08-25 2021-09-17 윤기식 Method for generating 3D modeling of 2D images using virual grid networks
KR102504720B1 (en) * 2021-12-29 2023-02-28 주식회사 리콘랩스 Method and system for providing a 3d model automatic creation interface
WO2023128027A1 (en) * 2021-12-30 2023-07-06 주식회사 리콘랩스 Irregular sketch-based 3d-modeling method and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100657943B1 (en) * 2005-01-07 2006-12-14 삼성전자주식회사 Real time 3 dimensional transformation method for 2 dimensional building data and apparatus therefor, and real time 3 dimensional visualization method for 2 dimensional linear building data and apparatus using the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120121031A (en) * 2011-04-26 2012-11-05 (주)클로버추얼패션 How to create 3D face model using 2D face photo
KR20150046557A (en) 2013-10-22 2015-04-30 (주)홈시큐넷 Manufactural system and method of 3D figuration
KR101495810B1 (en) 2013-11-08 2015-02-25 오숙완 Apparatus and method for generating 3D data

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100657943B1 (en) * 2005-01-07 2006-12-14 삼성전자주식회사 Real time 3 dimensional transformation method for 2 dimensional building data and apparatus therefor, and real time 3 dimensional visualization method for 2 dimensional linear building data and apparatus using the same


Also Published As

Publication number Publication date
KR20160148885A (en) 2016-12-27

Similar Documents

Publication Publication Date Title
KR101693259B1 (en) 3D modeling and 3D geometry production techniques using 2D image
US7365744B2 (en) Methods and systems for image modification
KR100327541B1 (en) 3D facial modeling system and modeling method
US20050053275A1 (en) Method and system for the modelling of 3D objects
KR101425576B1 (en) Method for acquiring and processing a three-dimensional data to product a precise wide-area scale model
KR101602472B1 (en) Apparatus and method for generating 3D printing file using 2D image converting
CN106652037B (en) Face mapping processing method and device
JP2016198974A (en) Slice model generation apparatus and three-dimensional molding system
US20130150994A1 (en) Method of carving three-dimensional artwork
KR101715325B1 (en) Method and system for providing Picture lay out drawings by using three dimensional scan technologies
KR20200056764A (en) Method for automatically set up joints to create facial animation of 3d face model and computer program
CN105212452A (en) A kind of manufacture method being carved with the pendant body of personalized embossed portrait
JP2009269181A (en) Method of manufacturing three-dimensional relief and device using this method
CN113808272A (en) Texture mapping method in three-dimensional virtual human head and face modeling
KR100924598B1 (en) Method for generating three dimension model
KR101631474B1 (en) Three dimensional model creating method for digital manufacture
JP2017062553A (en) Three-dimensional model forming device and three-dimensional model forming method
CN113674161A (en) Face deformity scanning completion method and device based on deep learning
JP7436612B1 (en) Method for manufacturing plate-like three-dimensional objects
KR100971797B1 (en) Transparent decoration and method for manufacturing the same
KR101779265B1 (en) Manufacturing 3-dimensional face using 3d printing with 2d pictures
JP2018140526A (en) Method of manufacturing three-dimensional relief
TWI536317B (en) A method of stereo-graph producing
JP6388489B2 (en) Method and apparatus for creating data for surface processing
JP3738282B2 (en) 3D representation image creation method, 3D representation computer graphics system, and 3D representation program

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
FPAY Annual fee payment

Payment date: 20191104

Year of fee payment: 4