WO2018164329A1 - Method and system for providing information about a face using an anatomical layer, and non-transitory computer-readable recording medium - Google Patents

Method and system for providing information about a face using an anatomical layer, and non-transitory computer-readable recording medium

Info

Publication number
WO2018164329A1
WO2018164329A1 (PCT/KR2017/008508)
Authority
WO
WIPO (PCT)
Prior art keywords
face
anatomical
user
shape
information
Prior art date
Application number
PCT/KR2017/008508
Other languages
English (en)
Korean (ko)
Inventor
김진수
최흥산
김희진
최종우
허창훈
Original Assignee
주식회사 모르페우스
Priority date
Filing date
Publication date
Priority claimed from KR1020170094995A (KR101959866B1)
Application filed by 주식회사 모르페우스
Publication of WO2018164329A1 publication Critical patent/WO2018164329A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration

Definitions

  • The present invention relates to a method, a system, and a non-transitory computer-readable recording medium for providing information about a face using an anatomical layer.
  • Techniques have been introduced in which the outlines of facial components such as the eyes, nose, mouth, and chin are extracted from a face image obtained from a user, and the user's face is estimated based on the feature points of each of those components.
  • In addition, a person-image similarity comparison system has been introduced that includes a user terminal allowing a user to connect to a network and transmit an original image to be compared, and a similarity comparison server unit that provides similarity information between that image and a specific comparison target image designated by the user from among the comparison target images provided through the user terminal.
  • However, according to such conventional techniques, the user's entire face is estimated based only on local features obtained from an image of the user's face or bones, and only a comparison service based on the estimated face is provided, so differences inevitably arise between actual and virtual results.
  • The present invention aims to solve all of the above-mentioned problems of the prior art.
  • Another object of the present invention is to estimate the shape and position of at least one type of anatomical layer included in the soft tissue of a user's face based on user information.
  • According to one aspect of the present invention, there is provided a method for providing information about a face using an anatomical layer, the method comprising: obtaining three-dimensional measurement data on the shape of a user's face; estimating the shape and location of at least one type of anatomical layer included in the soft tissue of the user's face by comparatively analyzing the obtained three-dimensional measurement data against at least one anatomical face model associated with information on the user; and providing information regarding a comparison between the user's face specified by the estimation and at least one reference anatomical face model, wherein the anatomical face model includes modeled data on the shape and location of at least one type of anatomical layer included in the soft tissue of a face.
  • According to another aspect of the present invention, there is provided a system for providing information about a face using an anatomical layer, the system comprising: a measurement data acquisition unit for obtaining three-dimensional measurement data on the shape of a user's face; a layer estimator for estimating the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face by comparatively analyzing the obtained three-dimensional measurement data against at least one anatomical face model associated with information on the user; and a comparison information providing unit for providing information regarding a comparison between the user's face specified by the estimation and at least one reference anatomical face model, wherein the anatomical face model includes modeled data on the shape and location of at least one type of anatomical layer included in the soft tissue of a face.
  • According to yet another aspect of the present invention, there are further provided another method and another system for implementing the present invention, and a non-transitory computer-readable recording medium for recording a computer program for executing the method.
  • According to the present invention, it is possible to estimate the shape and position of at least one type of anatomical layer included in the soft tissue of a user's face based on user information.
  • FIG. 1 is a view showing a schematic configuration of an entire system for providing information about a face using an anatomical layer according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing in detail the internal configuration of the information providing system according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a process of providing information on a face using an anatomical layer according to an embodiment of the present invention.
  • FIGS. 4 to 6 exemplarily illustrate a process of generating information about a comparison between a user's face and a reference anatomical face model according to one embodiment of the present invention.
  • FIG. 1 is a view showing a schematic configuration of an entire system for providing information about a face using an anatomical layer according to an embodiment of the present invention.
  • the entire system may include a communication network 100, an information providing system 200, and a device 300.
  • First, the communication network 100 according to one embodiment of the present invention may be configured regardless of communication mode, such as wired or wireless communication, and may be composed of various communication networks such as a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN).
  • The communication network 100 as used herein may be the known Internet or the World Wide Web (WWW).
  • However, the communication network 100 may also include, at least in part, a known wired/wireless data communication network, a known telephone network, or a known wired/wireless television communication network, without being limited thereto.
  • For example, the communication network 100 may be a wireless data communication network, at least part of which may be implemented using Wi-Fi communication, Wi-Fi Direct communication, Long Term Evolution (LTE) communication, Bluetooth communication (for example, Bluetooth Low Energy (BLE) communication), infrared communication, ultrasonic communication, or the like.
  • Next, the information providing system 200 according to one embodiment of the present invention may communicate with the device 300, which will be described later, through the communication network 100, and may perform functions of: acquiring three-dimensional measurement data on the shape of the user's face; estimating the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face by comparatively analyzing the acquired three-dimensional measurement data against at least one anatomical face model associated with the user information; and providing information on the comparison between the user's face specified by the estimation and at least one reference anatomical face model.
  • The configuration and functions of the information providing system 200 according to the present invention will be described in detail below. Meanwhile, although the information providing system 200 has been described as above, this description is exemplary, and it will be apparent to those skilled in the art that at least some of the functions or components required for the information providing system 200 may, as necessary, be implemented within or included in the device 300 or an external system (not shown).
  • Next, the device 300 according to one embodiment of the present invention is a digital device including a function for enabling communication after connecting to the information providing system 200 through the communication network 100, and any digital device having computing capability, such as a smartphone or a tablet PC, can be adopted as the device 300 according to the present invention.
  • the device 300 may include an application for supporting a function according to the present invention for providing information about a face using an anatomical layer.
  • an application may be downloaded from the information providing system 200 or an external application distribution server (not shown).
  • FIG. 2 is a diagram showing in detail the internal configuration of the information providing system 200 according to an embodiment of the present invention.
  • The information providing system 200 may be a digital device having computing capability, equipped with a memory means and a microprocessor.
  • the information providing system 200 may be a server system.
  • The information providing system 200 may be configured to include a measurement data acquisition unit 210, a layer estimator 220, a comparison information providing unit 230, a service providing unit 240, a communication unit 250, and a controller 260.
  • According to one embodiment of the present invention, at least some of the measurement data acquisition unit 210, the layer estimator 220, the comparison information providing unit 230, the service providing unit 240, the communication unit 250, and the controller 260 may be program modules that communicate with an external system.
  • Such program modules may be included in the information providing system 200 in the form of an operating system, an application program module, or other program modules, and may be physically stored in various known storage devices.
  • the program module may be stored in a remote storage device that can communicate with the information providing system 200.
  • Such program modules may include, but are not limited to, routines, subroutines, programs, objects, components, and data structures that, in accordance with the present invention, perform particular tasks or implement particular abstract data types described below.
  • the measurement data acquisition unit 210 may perform a function of acquiring three-dimensional measurement data regarding a shape of a user's face.
  • According to one embodiment of the present invention, the three-dimensional measurement data may include at least one of three-dimensional scanning data of the user's face and three-dimensional imaging data of the hard tissue of the user's face, and such three-dimensional scanning or imaging data may be obtained by means of X-rays, ultrasonic waves, computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), a three-dimensional scanner, and the like.
  • the measurement data acquisition unit 210 may obtain three-dimensional measurement data about the user's face shape from the two-dimensional measurement data regarding the user's face shape.
  • For example, the measurement data acquisition unit 210 may acquire two-dimensional measurement data (e.g., a photograph or picture) regarding the shape of the user's face, and may obtain three-dimensional measurement data on the shape of the user's face by converting the acquired two-dimensional measurement data into three-dimensional data, based on the facial feature points appearing in the two-dimensional measurement data, through a known three-dimensional transformation algorithm such as a perspective projection transformation algorithm.
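As an illustration only (not the method disclosed in this application), the following Python sketch shows one common way a perspective projection (pinhole) model can relate 2D facial feature points to 3D: a generic 3D landmark template is fitted to detected 2D landmarks. The template coordinates, pixel positions, and camera intrinsics below are hypothetical values made up for the example.

```python
# Minimal sketch: relate 2D facial landmarks to 3D with a pinhole camera model
# by estimating the pose of a generic 3D landmark template (illustration only).
import numpy as np
import cv2

# Hypothetical generic 3D landmark template (mm): nose tip, chin,
# left/right eye corners, left/right mouth corners.
template_3d = np.array([
    [  0.0,   0.0,   0.0],
    [  0.0, -63.6, -12.5],
    [-43.3,  32.7, -26.0],
    [ 43.3,  32.7, -26.0],
    [-28.9, -28.9, -24.1],
    [ 28.9, -28.9, -24.1],
], dtype=np.float64)

# 2D landmark positions detected in the user's photograph (pixels, assumed).
landmarks_2d = np.array([
    [320, 240], [318, 330], [255, 190],
    [380, 190], [270, 300], [365, 300],
], dtype=np.float64)

# Assumed pinhole intrinsics for a 640x480 image, no lens distortion.
f = 640.0
K = np.array([[f, 0.0, 320.0], [0.0, f, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(4)

ok, rvec, tvec = cv2.solvePnP(template_3d, landmarks_2d, K, dist)
R, _ = cv2.Rodrigues(rvec)

# Template landmarks expressed in the camera frame: a first 3D estimate of the
# user's facial feature points that a deformable model could then refine.
points_camera = (R @ template_3d.T + tvec).T
print(points_camera)
```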
  • Next, the layer estimator 220 according to one embodiment of the present invention may estimate the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face by comparatively analyzing the three-dimensional measurement data on the shape of the user's face, obtained from the measurement data acquisition unit 210, against at least one anatomical face model associated with the user information.
  • According to one embodiment of the present invention, the anatomical face model may include data modeled from data obtained by scanning each peeled (or not yet peeled) layer while peeling off, one by one, the anatomical layers of the facial soft tissue of a cadaver (i.e., a dead human body) (for example, data regarding the position and shape of each layer), and from data obtained by transmitting at least one of ultrasonic waves, radiation, a magnetic field, and positrons through the facial soft tissue of a living human (for example, data regarding the position and shape of each layer).
  • By analyzing the data obtained above with an algorithm such as a support vector machine (SVM) algorithm, a multivariate adaptive regression spline (MARS) algorithm, a k-nearest neighbor (KNN) algorithm, or a neural network (NN) algorithm, an anatomical face model that patterns the anatomical layers of the face according to demographic indicators may be obtained.
  • Meanwhile, the above-described anatomical layer may include layers relating to at least one of muscle, fat, blood vessels, nerves, and lymphatic vessels, and the user information may include information on the user's race, ethnicity, gender, and age.
  • However, the types of anatomical layers and the user information according to one embodiment of the present invention are not necessarily limited to those listed above, and anatomical layers such as the periosteum, fascia, or ligaments, or user information such as country or residential area, may be added or changed within the scope of achieving the object of the present invention.
  • Specifically, the layer estimator 220 according to one embodiment of the present invention may estimate the shape or position of at least one type of anatomical layer included in the soft tissue of the user's face by matching or transforming the shape and position of at least one type of anatomical layer included in at least one anatomical face model associated with the corresponding user information so as to conform to the three-dimensional measurement data on the shape of the user's face, with reference to at least one feature related to the user's face extracted from that three-dimensional measurement data.
  • According to one embodiment of the present invention, in comparatively analyzing the three-dimensional measurement data on the shape of the user's face against at least one anatomical face model associated with the user information, the layer estimator 220 may use at least one of an active appearance model (AAM) algorithm, an active shape model (ASM) algorithm, a composite constraint AAM algorithm, an iterative closest point (ICP) algorithm, and a non-rigid registration algorithm. More specifically, the layer estimator 220 may extract features of the user's face from the three-dimensional measurement data using at least one of the AAM, ASM, and composite constraint AAM algorithms, and may estimate the shape or position of at least one type of anatomical layer included in the soft tissue of the user's face by matching or transforming the shape and position of at least one type of anatomical layer included in the anatomical face model, using at least one of the ICP and non-rigid registration algorithms, so as to conform to the three-dimensional measurement data on the shape of the user's face.
  • However, the algorithms for comparatively analyzing the three-dimensional measurement data on the shape of a face against at least one anatomical face model associated with the user information are not necessarily limited to those listed above, and it is noted that various other algorithms may be utilized within the scope of achieving the object of the present invention.
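For concreteness, the following is a minimal rigid ICP sketch in Python (NumPy/SciPy). It only illustrates the closest-point matching idea named above; it is not the registration used in this application, and the non-rigid registration and AAM/ASM steps are not shown. The point clouds are random stand-ins for real model and scan vertices.

```python
# Minimal rigid ICP sketch: align a model point cloud to measured face points.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Kabsch: rotation R and translation t minimizing ||R @ src + t - dst||."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(model_pts, scan_pts, iters=30):
    """Iteratively match model points to their nearest scan points."""
    tree = cKDTree(scan_pts)
    current = model_pts.copy()
    for _ in range(iters):
        _, idx = tree.query(current)              # closest-point correspondences
        R, t = best_rigid_transform(current, scan_pts[idx])
        current = current @ R.T + t
    return current

# Usage with random stand-in data (a real system would use mesh vertices).
rng = np.random.default_rng(0)
scan = rng.normal(size=(500, 3))
theta = np.deg2rad(15.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
model = scan @ Rz.T + np.array([0.1, -0.05, 0.2])   # rotated/shifted copy
aligned = icp(model, scan)
```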
  • Meanwhile, the layer estimator 220 according to one embodiment of the present invention may estimate a change over time in the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face by comparatively analyzing the three-dimensional measurement data on the shape of the user's face, obtained from the measurement data acquisition unit 210, against at least one anatomical face model associated with the corresponding user information.
  • Specifically, the layer estimator 220 may estimate the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face by comparatively analyzing the three-dimensional measurement data on the shape of the user's face against at least one anatomical face model associated with the user information, and may then estimate a change over time in the estimated shape and position of that at least one type of anatomical layer by referring to information on the change over time of the at least one anatomical face model.
  • More specifically, the layer estimator 220 may estimate the shape or position of at least one type of anatomical layer included in the soft tissue of the user's face by comparing or matching the shape and position of at least one type of anatomical layer included in at least one anatomical face model associated with the corresponding user information against the user's three-dimensional measurement data, with reference to at least one feature related to the user's face extracted from that three-dimensional measurement data.
  • In addition, the layer estimator 220 may estimate the change over time in the shape and position of the at least one type of anatomical layer estimated above, with reference to information on the change over time of the at least one anatomical face model associated with the user information (for example, the shape and position, over time, of at least one type of anatomical layer included in the at least one anatomical face model associated with the user information).
  • the comparison information providing unit 230 may provide information about a comparison between the face of the user specified by the estimation of the layer estimator 220 and the at least one reference anatomical face model.
  • According to one embodiment of the present invention, the above-described reference anatomical face model may include at least one of (a) a model calculated by averaging, standardizing, or normalizing a plurality of anatomical face models, (b) a model calculated by averaging, standardizing, or normalizing face models in which at least one of a length between predetermined facial points, a distance between predetermined facial points, an angle between a plurality of predetermined facial points, a height between predetermined facial points, a size of a predetermined facial area, a volume of the face, and a ratio between predetermined facial points meets a predetermined criterion, and (c) a model calculated by averaging, standardizing, or normalizing predetermined supernormal anatomical face models. Here, a supernormal anatomical face model means an anatomical face model whose appearance is superior to that of an average face, and may include, for example, the anatomical face model of a winner of a beauty contest (e.g., Miss Universe, Miss World, Miss International, Miss Asia Pacific World, etc.).
  • Specifically, the comparison information providing unit 230 according to one embodiment of the present invention may provide information about a comparison between the user's face specified by the estimation of the layer estimator 220 and at least one reference anatomical face model obtained by aggregating and averaging the shapes and positions of the respective anatomical layers included in a plurality of anatomical face models. Meanwhile, according to one embodiment of the present invention, the comparison information providing unit 230 may provide information about a comparison between the user's face specified by the estimation of the layer estimator 220 and at least one reference anatomical face model obtained by averaging the shapes and positions of the respective anatomical layers included in a plurality of anatomical face models associated with the user information (for example, a plurality of anatomical face models associated with, or identical in, the user's age or gender).
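As a hedged sketch of the averaging idea (assuming the anatomical face models have already been brought into vertex-to-vertex correspondence, so that vertex i denotes the same anatomical location in every model, which the application does not spell out), a reference model can be taken as the per-layer, per-vertex mean of the stacked models:

```python
# Minimal sketch: average several corresponded anatomical face models into a
# reference model. Shapes and values are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(42)

# Suppose each model is a dict of layers, each layer an (N, 3) vertex array,
# with the same N and vertex ordering across models (i.e. already registered).
def random_model(n_vertices=1000):
    return {
        "skin":   rng.normal(size=(n_vertices, 3)),
        "muscle": rng.normal(size=(n_vertices, 3)),
        "fat":    rng.normal(size=(n_vertices, 3)),
    }

models = [random_model() for _ in range(20)]   # stand-in for a model database

def average_models(models):
    """Per-layer, per-vertex mean over a list of corresponded face models."""
    layers = models[0].keys()
    return {
        layer: np.mean(np.stack([m[layer] for m in models], axis=0), axis=0)
        for layer in layers
    }

reference_model = average_models(models)
print({k: v.shape for k, v in reference_model.items()})  # each (1000, 3)
```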
  • In addition, the comparison information providing unit 230 according to one embodiment of the present invention may select at least one face model in which at least one of a length between predetermined facial points, a distance between predetermined facial points, an angle between a plurality of predetermined facial points, a height between predetermined facial points, a size of a predetermined facial area, a volume of the face, and a ratio between predetermined facial points is close to a predetermined criterion (e.g., the golden ratio associated with the Fibonacci sequence), and may provide information about a comparison between at least one reference anatomical face model obtained by averaging the shapes and positions of the respective anatomical layers included in the selected models and the user's face specified by the estimation of the layer estimator 220.
  • Furthermore, the comparison information providing unit 230 may select, from among a plurality of anatomical face models associated with the user information (for example, a plurality of anatomical face models associated with, or identical in, the user's age or gender), at least one face model in which at least one of the above measurements is close to a predetermined criterion (for example, the golden ratio associated with the Fibonacci sequence), and may provide information about a comparison between at least one reference anatomical face model obtained by averaging the shapes and positions of the respective anatomical layers included in the selected models and the user's face specified by the estimation of the layer estimator 220.
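The selection step described above might be sketched as follows; this is an assumption-laden illustration rather than the patented criterion. One facial ratio is computed per candidate model, the models whose ratio is closest to the golden ratio are kept, and the selected models could then be averaged as in the previous sketch.

```python
# Minimal sketch: pick the k candidate face models whose measured facial ratio
# is closest to the golden ratio. The ratio values below are placeholders.
import numpy as np

GOLDEN_RATIO = (1 + 5 ** 0.5) / 2          # ~1.618

# Hypothetical ratio (e.g. face height / face width) measured on each model.
model_ratios = np.array([1.42, 1.61, 1.75, 1.58, 1.70, 1.63, 1.49, 1.66])

def select_closest_to_golden(ratios, k=3):
    """Return indices of the k models whose ratio is closest to the golden ratio."""
    order = np.argsort(np.abs(ratios - GOLDEN_RATIO))
    return order[:k]

selected = select_closest_to_golden(model_ratios)
print(selected, model_ratios[selected])    # indices 1, 5, 3 -> 1.61, 1.63, 1.58
```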
  • Meanwhile, the comparison information providing unit 230 according to one embodiment of the present invention may specify the user's face at a predetermined age from the estimation result of the change over time in the shape and position of the at least one type of anatomical layer, and may provide information regarding a comparison between the user's face at that predetermined age and at least one reference anatomical face model.
  • In addition, the comparison information providing unit 230 according to one embodiment of the present invention may provide, as information on the comparison, values obtained by comparing the user's face specified by the estimation and at least one reference anatomical face model with respect to at least one of a length between predetermined facial points, a distance between predetermined facial points, an angle between a plurality of predetermined facial points, a height between predetermined facial points, a size of a predetermined facial area, a volume of the face, and a ratio between predetermined facial points.
  • Next, the service providing unit 240 according to one embodiment of the present invention may provide a simulation service regarding at least one of plastic surgery and skin treatment, with reference to the estimation result of the shape and position of the at least one type of anatomical layer and the information about the comparison provided from the comparison information providing unit 230.
  • For example, the service providing unit 240 may perform at least one of virtual deformation, injection, and extraction on the user's face, for which the shape and position of at least one type of anatomical layer included in the soft tissue have been estimated by the layer estimator 220, according to a simulation service relating to at least one of plastic surgery and skin treatment, with reference to the information about the comparison provided from the comparison information providing unit 230.
  • Next, the communication unit 250 according to one embodiment of the present invention may perform a function of enabling data transmission and reception to and from the measurement data acquisition unit 210, the layer estimator 220, the comparison information providing unit 230, and the service providing unit 240.
  • Finally, the controller 260 according to one embodiment of the present invention may perform a function of controlling the flow of data among the measurement data acquisition unit 210, the layer estimator 220, the comparison information providing unit 230, the service providing unit 240, and the communication unit 250. That is, the controller 260 according to the present invention may control the data flow into and out of the information providing system 200, or the data flow between the respective components of the information providing system 200, so that the measurement data acquisition unit 210, the layer estimator 220, the comparison information providing unit 230, the service providing unit 240, and the communication unit 250 each perform their unique functions.
  • FIG. 3 is a diagram illustrating a process of providing information on a face using an anatomical layer according to an embodiment of the present invention.
  • FIGS. 4 to 6 exemplarily illustrate a process of generating information about a comparison between a user's face and a reference anatomical face model according to one embodiment of the present invention.
  • First, the information providing system 200 according to one embodiment of the present invention may acquire three-dimensional computed tomography data of the hard tissue of the user's face or three-dimensional scanning data of the user's face (301).
  • Next, the information providing system 200 may select at least one anatomical face model (A) associated with at least one of the user's race, gender, and age (302), and may estimate the shape or position of at least one type of anatomical layer included in the soft tissue of the user's face (303) by matching the shape or position of at least one type of anatomical layer included in the selected at least one anatomical face model (A) to the three-dimensional data on the shape of the user's face using a non-rigid registration algorithm.
  • Next, the information providing system 200 according to one embodiment of the present invention may obtain information about a comparison between the user's face specified by the above estimation and at least one reference anatomical face model (B) (304).
  • For example, the information providing system 200 may set feature points (410) on each of the user's face and the at least one reference anatomical face model, measure, based on those feature points, at least one of a length, distance, angle, height, size, volume, and ratio of predetermined points or portions of the user's face and of the reference anatomical face model (510; e.g., the nasofrontal angle, the labiomental fold angle, etc.), and then compare and analyze the respective measured values to obtain information about the comparison between the user's face and the reference anatomical face model (610).
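A minimal sketch of one such measurement, assuming three 3D feature points have already been set: an angle such as the nasofrontal or labiomental fold angle reduces to the angle at the middle point formed with the two adjacent points. The coordinates below are placeholders, not values from the application.

```python
# Minimal sketch: angle (degrees) at vertex b formed by points a-b-c, e.g. a
# nasofrontal-type angle measured on 3D feature points.
import numpy as np

def angle_at(a, b, c):
    """Angle a-b-c at b, in degrees."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Placeholder feature-point coordinates (mm).
glabella = [0.0, 55.0, 10.0]
nasion   = [0.0, 45.0,  5.0]
nose_tip = [0.0, 20.0, 25.0]
print(round(angle_at(glabella, nasion, nose_tip), 1))   # angle at the nasion
```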
  • More specifically, according to one embodiment of the present invention, information regarding the comparison between the user's face and the reference anatomical face model may be obtained based on diagnostic elements regarding the user's face, as follows.
  • Face width: According to one embodiment of the present invention, the value related to face width may be obtained by measuring the left-right width of each of the upper face 701, middle face 702, and lower face 703 of the user's face viewed from the front. For the middle face 702, the widest left-right width around the cheekbones may be measured, and for the lower face 703, the width may be measured at a position passing through both end points of the lips. In addition, according to one embodiment of the present invention, the information on face width may be used during facial contouring procedures (for example, cheekbone reduction or mandible reduction surgery) for Asians, whose face width tends to be somewhat larger than the ideal facial ratio.
  • Face asymmetry (volume asymmetry): According to one embodiment of the present invention, the value related to face asymmetry may be obtained by dividing the user's face into an upper part 801, a middle part 802, and a lower part 803, further dividing each part into left and right along the facial center line 804, and measuring the volumes of the six resulting areas. In addition, according to one embodiment of the present invention, the total volume of the user's face may be calculated by adding up the volumes of the six areas.
  • According to one embodiment of the present invention, the information on facial asymmetry may be divided into soft tissue asymmetry and hard tissue asymmetry, and may be used for facial asymmetry correction.
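A hedged sketch of the volume measurement, assuming each of the six areas has been extracted as a closed, consistently oriented triangle mesh (an assumption not stated in the application): the signed-tetrahedron formula gives the enclosed volume, and the left and right volumes of each facial third can then be compared.

```python
# Minimal sketch: volume of a closed, consistently oriented triangle mesh via
# the signed-tetrahedron formula, applied to a unit cube as a sanity check.
import numpy as np

def mesh_volume(vertices, faces):
    """Sum of signed tetrahedron volumes (origin, v0, v1, v2) over all faces."""
    v = np.asarray(vertices, float)
    f = np.asarray(faces, int)
    v0, v1, v2 = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
    return abs(np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum()) / 6.0

# Unit cube (8 vertices, 12 triangles) as stand-in geometry; a real system
# would pass the mesh of one left or right facial region instead.
verts = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
faces = np.array([
    [0, 1, 3], [0, 3, 2],      # x = 0 side
    [4, 6, 7], [4, 7, 5],      # x = 1 side
    [0, 4, 5], [0, 5, 1],      # y = 0 side
    [2, 3, 7], [2, 7, 6],      # y = 1 side
    [0, 2, 6], [0, 6, 4],      # z = 0 side
    [1, 5, 7], [1, 7, 3],      # z = 1 side
])
print(mesh_volume(verts, faces))   # 1.0 for the unit cube
```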
  • Eye width: According to one embodiment of the present invention, the value related to eye width may be obtained by measuring the ratio among the horizontal width 901 of the user's right eye, the distance 902 between both eyes, and the horizontal width 903 of the left eye. Here, the value corresponding to the distance 902 between both eyes may be set to 1.0. In addition, according to one embodiment of the present invention, the information on eye width may be used in surgery such as inner or outer eye-corner plastic surgery (medial epicanthoplasty or lateral canthoplasty) for Asians having an epicanthal (Mongolian) fold.
  • Face height: According to one embodiment of the present invention, the value related to face height may be obtained by measuring the ratio of the vertical distances of the upper face 1001, middle face 1002, and lower face 1003 of the user's face. Here, the value corresponding to the upper face 1001 may be set to 1.0. In addition, the ratio of the vertical distances into which the lower face 1003 is divided, above and below the line passing through both end points 1004 of the lips, may be further measured. According to one embodiment of the present invention, the information on face height may be used for facial contouring surgery, jaw surgery, and the like.
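Both the eye-width and face-height values above reduce to normalizing a set of measured distances by one reference distance. A minimal sketch under hypothetical measurements (in millimetres) follows.

```python
# Minimal sketch: express measured facial distances as ratios relative to a
# chosen reference measurement (set to 1.0), as in the eye-width and
# face-height diagnostics above. All lengths are placeholder values in mm.
def ratios(measurements, reference_key):
    """Divide every measurement by the reference so that reference == 1.0."""
    ref = measurements[reference_key]
    return {name: value / ref for name, value in measurements.items()}

# Eye width: right eye width / inter-eye distance / left eye width.
eye = {"right_eye": 30.5, "between_eyes": 32.0, "left_eye": 30.1}
print(ratios(eye, "between_eyes"))       # between_eyes -> 1.0

# Face height: upper / middle / lower facial thirds (upper set to 1.0).
face = {"upper": 60.0, "middle": 63.0, "lower": 68.0}
print(ratios(face, "upper"))             # upper -> 1.0
```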
  • Depth by contour: According to one embodiment of the present invention, the contour depths may be generated by sequentially arranging, at 0.25 mm intervals from the tip of the nose, the contour lines of cross sections perpendicular to the front-back direction of the user's face (1101). Using the depth by contour, the relative height of each part of the face may be easily determined from the contour lines, and the left-right asymmetry and the smoothness of the curved surface of the face may also be assessed.
  • According to one embodiment of the present invention, the information on depth by contour indicates what front-back contour the user's face has, and may be used to determine, for Asians and others whose front-to-back facial depth is shorter than that of Westerners, whether facial contouring surgery that reduces the left-right face width would be effective, or whether forehead plastic surgery, nose plastic surgery, or chin plastic surgery would be effective.
  • Depth by color map: According to one embodiment of the present invention, the depth-by-color map presents facial height information, such as the depth by contour described above, using a color map, and as shown at 1201, the relative height of each part of the user's face may be easily determined through color comparison.
  • According to one embodiment of the present invention, the depth-by-color map is an image that shows the degree of overall facial protrusion at a glance in color, and may be used to determine which operation, such as forehead plastic surgery, nose plastic surgery, or chin plastic surgery, would be most effective for a patient.
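A minimal sketch of the depth-by-contour and depth-by-color-map ideas, under assumptions not stated in the application: the face is available as a 3D point cloud whose front-back axis is z and whose forward-most point is the nose tip. Depths are binned at 0.25 mm intervals from the nose tip, and each point's depth is mapped to a colour.

```python
# Minimal sketch: bin front-back depth from the nose tip at 0.25 mm intervals
# (depth by contour) and map each point's depth to a colour (depth by color
# map). The point cloud is a random placeholder; z is the front-back axis.
import numpy as np
from matplotlib import cm

rng = np.random.default_rng(7)
points = rng.uniform(low=[-80, -110, -90], high=[80, 60, 0], size=(5000, 3))
points[0, 2] = 10.0                        # pretend this point is the nose tip

nose_tip_z = points[:, 2].max()            # forward-most point = nose tip
depth = nose_tip_z - points[:, 2]          # mm behind the nose tip (>= 0)

contour_index = np.floor(depth / 0.25).astype(int)   # 0.25 mm contour bands

# Colour map: normalise depth to [0, 1] and look up an RGBA colour per point.
normalized = depth / depth.max()
colors = cm.viridis(normalized)            # shape (N, 4), values in [0, 1]
print(contour_index.max(), colors.shape)
```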
  • Front-rear projection: According to one embodiment of the present invention, the value related to front-rear projection may be the front-back distance 1303 from a main feature point 1301 of the user's face to a reference plane 1302 behind the face; that is, it may be a value obtained by measuring the front-back distance from a designated point of the user's face to the reference plane 1302. Here, the reference plane 1302 may refer to a plane that passes through the average position of the left and right tragus points of the ears and is perpendicular to the front-back direction.
  • According to one embodiment of the present invention, the information on front-rear projection may be used when judging which operation on the forehead, nose, chin tip, or the like, such as forehead plastic surgery, nose plastic surgery, or chin-tip plastic surgery, would be effective for the patient, and may also be a value used to determine the amount to be operated on before the related operation or to analyze the results before and after the operation.
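A hedged sketch of the front-rear projection measurement, assuming the left and right tragus points and a unit front-back direction vector are known: the reference plane passes through their midpoint, and the projection of a feature point is its signed distance to that plane. All coordinates are placeholders.

```python
# Minimal sketch: front-back distance from a facial feature point to a
# reference plane through the mean of the left/right tragus points and
# perpendicular to the front-back direction. Coordinates are placeholders (mm).
import numpy as np

def front_rear_projection(feature_point, tragus_left, tragus_right, forward):
    """Signed distance from feature_point to the tragus reference plane."""
    n = np.asarray(forward, float)
    n = n / np.linalg.norm(n)                       # unit front-back normal
    plane_point = (np.asarray(tragus_left, float) +
                   np.asarray(tragus_right, float)) / 2.0
    return float(np.dot(np.asarray(feature_point, float) - plane_point, n))

# Placeholder anatomy: the z axis points forward, out of the face.
tragus_l, tragus_r = [-72.0, 0.0, -5.0], [71.0, 0.0, -4.0]
nose_tip = [0.0, -20.0, 95.0]
chin_tip = [2.0, -110.0, 70.0]
forward = [0.0, 0.0, 1.0]

print(front_rear_projection(nose_tip, tragus_l, tragus_r, forward))   # 99.5
print(front_rear_projection(chin_tip, tragus_l, tragus_r, forward))   # 74.5
```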
  • Eye distance: According to one embodiment of the present invention, the values related to eye distance may mean various measurements related to the eyes, such as the horizontal and vertical dimensions 1401 of the left and right eyes, the distance 1402 between both eyes, and the vertical dimension 1403 of the left and right eyelids. According to one embodiment of the present invention, the information on eye distance may be used to determine the amount of correction during surgery such as double eyelid surgery, eye-shape correction, or inner or outer eye-corner plastic surgery.
  • Nose and lips length: According to one embodiment of the present invention, the values related to the nose and lip distance may mean various measurements related to the nose and lips, such as the nasal bridge length 1501, the nasal bridge height 1502, and the aesthetic line. According to one embodiment of the present invention, the information on the nose and lip distance may indicate relative or absolute lengths associated with the user's nose and lips, and may be used during procedures such as nose plastic surgery, jaw correction surgery, or jaw surgery.
  • Nose and lips angle: According to one embodiment of the present invention, the information regarding the nose and lip angles may be used during nose plastic surgery and jaw surgery.
  • Curve length: According to one embodiment of the present invention, the curve lengths may be defined, for each of the left and right sides of the user's face, as the distances from the tragus point 1701 of the ear to the eye (Ex, 1702), the nose (Al, 1703), the mouth (Ch, 1704), and the jaw (Gn, 1705). According to one embodiment of the present invention, the information on curve length is a measurement that cannot be grasped in two dimensions and can only be grasped in three dimensions. In addition, the information on curve length may be utilized during surgery such as eye surgery, nose surgery, facial contouring surgery, or wrinkle surgery, and may be used to effectively analyze the results before and after such surgery.
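A minimal sketch of the curve-length idea, assuming a path along the facial surface has been sampled as an ordered sequence of 3D points (for example, from the tragus toward the eye point Ex): the curve length is the sum of the segment lengths, which exceeds the straight chord between the endpoints and therefore cannot be read off a flat 2D image.

```python
# Minimal sketch: length of a curve sampled as an ordered 3D polyline, e.g.
# a path on the facial surface from the tragus to the lateral eye corner (Ex).
# The sample points below are placeholders.
import numpy as np

def polyline_length(points):
    """Sum of Euclidean segment lengths along an ordered (N, 3) point path."""
    p = np.asarray(points, float)
    return float(np.linalg.norm(np.diff(p, axis=0), axis=1).sum())

# Placeholder surface path (mm): tragus to lateral eye corner (Ex).
path = np.array([
    [-72.0,  0.0,  -5.0],
    [-68.0,  8.0,  12.0],
    [-60.0, 14.0,  28.0],
    [-50.0, 18.0,  40.0],
    [-42.0, 20.0,  48.0],
])

curve_len = polyline_length(path)
chord_len = float(np.linalg.norm(path[-1] - path[0]))
print(round(curve_len, 1), round(chord_len, 1))   # curve length > straight chord
```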
  • Meanwhile, the diagnostic elements related to the user's face are not limited to those described above, and it will be apparent to those skilled in the art that information regarding the comparison between the user's face and the reference anatomical face model may be obtained by adding various other diagnostic elements or by changing some of the above diagnostic elements to others.
  • the information providing system 200 may provide the information regarding the obtained comparison through the device 300 (305).
  • Embodiments according to the present invention described above can be implemented in the form of program instructions that can be executed by various computer components and recorded in a computer-readable recording medium.
  • the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the computer-readable recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • The hardware device may be configured to operate as one or more software modules to perform the processing according to the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a method and system for providing information about a face using an anatomical layer, and to a non-transitory computer-readable recording medium. One aspect of the present invention relates to a method for providing information about a face using an anatomical layer, the method comprising the steps of: acquiring three-dimensional measurement data on the shape of a user's face; analyzing the acquired three-dimensional measurement data by comparing it with at least one anatomical face model associated with the user information, thereby estimating the shape and location of at least one type of anatomical layer included in the soft tissue of the user's face; and providing information regarding a comparison between the user's face specified by the estimation and at least one reference anatomical face model, wherein the anatomical face model includes modeled data on the shape and location of the at least one type of anatomical layer included in the soft tissue of the face.
PCT/KR2017/008508 2017-03-10 2017-08-07 Method and system for providing information about a face using an anatomical layer, and non-transitory computer-readable recording medium WO2018164329A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2017-0030745 2017-03-10
KR20170030745 2017-03-10
KR1020170094995A KR101959866B1 (ko) 2017-03-10 2017-07-26 Method, system, and non-transitory computer-readable recording medium for providing information about a face using an anatomical layer
KR10-2017-0094995 2017-07-26

Publications (1)

Publication Number Publication Date
WO2018164329A1 (fr) 2018-09-13

Family

ID=63448209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/008508 WO2018164329A1 (fr) 2017-03-10 2017-08-07 Method and system for providing information about a face using an anatomical layer, and non-transitory computer-readable recording medium

Country Status (1)

Country Link
WO (1) WO2018164329A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113177923A (zh) * 2021-05-07 2021-07-27 上海联影智能医疗科技有限公司 Medical image content recognition method, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040075672A (ko) * 2003-02-21 2004-08-30 정보통신연구진흥원 Hierarchical virtual plastic surgery simulation method
KR20120096238A (ko) * 2011-02-22 2012-08-30 주식회사 모르페우스 Method and system for providing facial correction images
KR20130063531A (ko) * 2011-03-31 2013-06-14 고쿠리츠다이가쿠호진 고베다이가쿠 Method for producing a three-dimensional molded model, and support tool for medical, research, and educational use
KR20150020323A (ko) * 2012-05-17 2015-02-25 디퍼이 신테스 프로덕츠, 엘엘씨 Method of surgical planning
KR20170025162A (ko) * 2015-08-27 2017-03-08 연세대학교 산학협력단 Method and apparatus for transforming facial age in a face image


Similar Documents

Publication Publication Date Title
KR101959866B1 (ko) Method, system, and non-transitory computer-readable recording medium for providing information about a face using an anatomical layer
EP3654239A1 (fr) Biométrie par contact et sans contact à base d'images utilisant des éléments physiologiques
US7760923B2 (en) Method and system for characterization of knee joint morphology
Naudi et al. The virtual human face: superimposing the simultaneously captured 3D photorealistic skin surface of the face on the untextured skin image of the CBCT scan
Deng et al. A novel skull registration based on global and local deformations for craniofacial reconstruction
JP4936491B2 (ja) 視線方向の推定装置、視線方向の推定方法およびコンピュータに当該視線方向の推定方法を実行させるためのプログラム
CN108352068A (zh) 用于估计测试对象的绝对尺寸大小的方法和设备
US20190371059A1 (en) Method for creating a three-dimensional virtual representation of a person
US11931166B2 (en) System and method of determining an accurate enhanced Lund and Browder chart and total body surface area burn score
US20160371569A1 (en) Systems and methods of analyzing images
JP2016085490A (ja) 顔形態の評価システム及び評価方法
Yin et al. Accurate estimation of body height from a single depth image via a four-stage developing network
KR101949152B1 (ko) 피부상태 진단 방법 및 장치와 이를 이용한 피부상태 적합 화장정보 제공 방법
WO2018164329A1 (fr) Method and system for providing information about a face using an anatomical layer, and non-transitory computer-readable recording medium
Imaizumi et al. Development of three-dimensional facial approximation system using head CT scans of Japanese living individuals
WO2018164394A1 (fr) Method and system for providing information on a procedure result, and non-transitory computer-readable recording medium
CN116807452A (zh) 一种脊柱侧弯3d检测方法、系统、设备及介质
Christensen et al. Automatic measurement of the labyrinth using image registration and a deformable inner ear atlas
Nurhudatiana A computer-aided diagnosis system for vitiligo assessment: A segmentation algorithm
Wu et al. Reconstruction of 4D-CT from a single free-breathing 3D-CT by spatial-temporal image registration
WO2018164328A1 (fr) Method and system for estimating a face by using an anatomical layer, and non-transitory computer-readable recording medium
WO2018164327A1 (fr) Method and system for estimating an anatomical layer of a face, and non-transitory computer-readable recording medium
TW202024634A (zh) Beauty promotion device, beauty promotion system, beauty promotion method, and beauty promotion program
Alcantara et al. Exploration of shape variation using localized components analysis
D'Alessio et al. Measure and comparison of facial attractiveness indices through photogrammetry and statistical analysis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17899242

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 05.12.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17899242

Country of ref document: EP

Kind code of ref document: A1