KR101567281B1 - Apparatus and Method for Estimating of Thickness of Subcutaneous Fat - Google Patents


Info

Publication number
KR101567281B1
KR101567281B1 (application KR1020150081946A)
Authority
KR
South Korea
Prior art keywords
light
subcutaneous fat
interference light
distribution information
wavelength distribution
Prior art date
Application number
KR1020150081946A
Other languages
Korean (ko)
Inventor
김태근
Original Assignee
세종대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 세종대학교산학협력단 filed Critical 세종대학교산학협력단
Priority to KR1020150081946A priority Critical patent/KR101567281B1/en
Application granted granted Critical
Publication of KR101567281B1 publication Critical patent/KR101567281B1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1075Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/47Scattering, i.e. diffuse reflection
    • G01N21/4738Diffuse reflection, e.g. also for testing fluids, fibrous materials

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Immunology (AREA)
  • General Physics & Mathematics (AREA)
  • Biochemistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Dentistry (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Engineering & Computer Science (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to an apparatus and a method for estimating subcutaneous fat layer thickness. The subcutaneous fat layer thickness estimating apparatus according to an embodiment of the present invention includes an optical detecting unit that separates incident light from a light source into reference light and measurement light, generates interference light by interfering the reference light reflected from a reference mirror with the measurement light reflected from the target body, and detects wavelength distribution information of the interference light; a modeling unit that generates an estimation model from the wavelength distribution information of the interference light using computational intelligence; and a computational intelligence estimation unit that estimates the subcutaneous fat thickness of the target body from the estimation model.
As described above, according to the present invention, the apparatus and method for estimating subcutaneous fat layer thickness make it possible to estimate subcutaneous fat layers of 7 mm or more, whose thickness could not be estimated by infrared scattering.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to an apparatus and method for estimating subcutaneous fat layer thickness.

The present invention relates to an apparatus and method for estimating subcutaneous fat layer thickness, and more particularly, to an apparatus and method for estimating the subcutaneous fat layer thickness of a target body by applying wavelength distribution information of light reflected from the target body and a reference mirror to a neural network.

The subcutaneous fat layer, mainly composed of adipocytes, is located between the dermis and the fascia. It stores nutrients, synthesizes fat, insulates the body against heat loss, and protects the body by absorbing shocks. Subcutaneous fat varies in thickness depending on body part, sex, and age; it is generally thicker in females than in males, and in middle age it is thickest at the waist.

Unlike visceral fat, subcutaneous fat does not cause major health problems, but it has become a cosmetic concern in a modern society that emphasizes appearance. One method of measuring subcutaneous fat is the skinfold thickness measurement, which measures the thickness of the subcutaneous fat using a caliper. The skinfold measurement estimates the body fat ratio of the whole body from the subcutaneous fat thickness at specific body sites; commonly, the thickness is measured at four points, such as the triceps, pubic bone, abdomen and thigh.

The skinfold thickness measurement is quick, easy to calculate, and is not distorted by changes in body water, making consistent measurements possible. However, accurate measurement requires a skilled technique, and non-experts may introduce errors. In addition, while subcutaneous fat can be measured accurately, visceral fat cannot, so errors may occur when estimating the body fat ratio.

Another widely used method is bioelectric impedance analysis (BIA), which estimates the amount of body fat by passing a weak current through the legs and arms and measuring the amount of water in the body from the electrical resistance.

BIA is highly reproducible, low in cost, and easy to use for measuring body fat, fat mass and water content. However, since body fat mass is calculated from a regression equation using sex, age and height, errors may occur. In addition, because the daily variation of body water is large, measurements must be taken at a fixed time of day to be reliable, which is inconvenient.

Next, the underwater density measurement method (hydrodensitometry) measures body fat directly. Using Archimedes' principle, the subject enters a water tank and the volume of the body is determined from the volume of displaced water. Using the densities of fat (0.9007 g/cc) and fat-free mass (1.100 g/cc), the average body density can be used to calculate the body fat ratio.

However, underwater density measurement requires a large apparatus, and the measurement itself is cumbersome. To improve on these disadvantages, an air-displacement method (BOD POD) that measures volume using gas instead of water, and a method that measures volume by scanning the whole body with a three-dimensional laser, have appeared.

In addition, the near-infrared method, which uses the principle that body tissue absorbs and re-emits near-infrared rays, dual-energy X-ray absorptiometry (DEXA), which uses a trace amount of X-rays, and computed tomography are also used as methods of measuring subcutaneous fat thickness. However, these conventional methods require much time for measurement and analysis, and require expensive equipment operated by an expert.

Optical coherence tomography (OCT) devices acquire tissue images by irradiating the living body with near-infrared rays, measuring the intensity of light reflected from each tissue at various angles, and processing the signals with a computer. However, optical coherence tomography can only measure subcutaneous fat thicknesses of less than 7 mm because of contamination by light reflected from the living body.

Therefore, there is a need for a device that can easily measure the thickness of the subcutaneous fat without the imaging-depth limitation, so that a clear result can be obtained even for thick fat layers.

The technology to be a background of the present invention is disclosed in Korean Patent No. 10-0716800 (published on May 13, 2007).

The present invention provides an apparatus and method for estimating subcutaneous fat layer thickness, and more particularly an apparatus and method that estimate the subcutaneous fat layer thickness of a target body by applying the wavelength distribution information of light reflected from the target body and a reference mirror to a neural network.

According to one aspect of the present invention, an apparatus for estimating subcutaneous fat layer thickness comprises: an optical detecting unit that separates incident light from a light source into reference light and measurement light, generates interference light by interfering the reference light reflected from a reference mirror with the measurement light reflected from the target body, and detects wavelength distribution information of the interference light; a modeling unit that generates an estimation model from the wavelength distribution information of the interference light using computational intelligence; and a computational intelligence estimation unit that estimates the subcutaneous fat thickness of the target body from the estimation model.

In addition, the reference mirror moves in a direction perpendicular to a direction in which the incident light is irradiated to reflect the reference light, and the optical detecting unit can detect the wavelength distribution information of the interference light corresponding to the position of the reference mirror.

The path length of the reference light and the measurement light may be the same.

The optical detecting unit may include a light splitter for separating the incident light into the reference light and the measurement light, an optical coupler for condensing, into an optical fiber, the reference light reflected from the reference mirror and the measurement light reflected from the target body to generate the interference light, a diffraction grating for separating the interference light according to wavelength, and a line scan camera (LSC) for detecting the wavelength distribution information of the interference light.

The modeling unit may generate learning data from the wavelength distribution information of the interference light collected through the scanning process of the reference mirror, and may build the estimation model by learning the correlation between the wavelength of the interference light and the subcutaneous fat layer thickness using the learning data.

The computational intelligence estimation unit may estimate the subcutaneous fat layer thickness of the subject by applying the wavelength distribution information of the subject's interference light to the estimation model.

Also, the computational intelligence may include a neural network or fuzzy logic having a data learning function.

According to another embodiment of the present invention, a method of estimating subcutaneous fat layer thickness performed by an apparatus for estimating subcutaneous fat layer thickness comprises: separating incident light from a light source into reference light and measurement light; generating interference light by interfering the reference light reflected from a reference mirror with the measurement light reflected from the target body, and detecting wavelength distribution information of the interference light; generating an estimation model from the wavelength distribution information of the interference light using computational intelligence; and estimating, from the estimation model, the thickness of the subcutaneous fat layer of the target body.

Therefore, according to the present invention, the apparatus and method for estimating subcutaneous fat layer thickness make it possible to estimate subcutaneous fat layers of 7 mm or more, which could not be measured by infrared scattering. In addition, a non-expert can quickly and easily estimate the subcutaneous fat layer thickness of a subject.

FIG. 1 is a block diagram showing the configuration of an apparatus for estimating subcutaneous fat layer thickness according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of estimating subcutaneous fat layer thickness according to an embodiment of the present invention.
FIG. 3 is a view for explaining a light detecting unit according to an embodiment of the present invention.
FIG. 4 is a diagram for explaining the structure of a back propagation neural network according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram showing the configuration of an apparatus for estimating subcutaneous fat layer thickness according to an embodiment of the present invention.

As shown in FIG. 1, the subcutaneous fat layer thickness estimating apparatus 100 includes a light detecting unit 110, a modeling unit 120, and a computational intelligence estimating unit 130.

First, the optical detector 110 separates incident light incident from a light source into reference light and measurement light. The photodetector 110 generates interference light by interfering the reference light reflected from the reference mirror with the measurement light reflected from the object, and detects the wavelength distribution information of the interference light.

At this time, the reference light is reflected from a reference mirror positioned orthogonally to the direction in which the incident light is irradiated, and the measurement light is reflected from the body of the subject whose subcutaneous fat layer thickness is to be estimated. Here, the path lengths of the reference light and the measurement light may be the same.

Also, the reference mirror moves in a direction orthogonal to the direction in which the incident light is irradiated and scans. The reference mirror reflects the incident light in the scanning process, and the optical detector 110 can detect the wavelength distribution information of the interference light according to the depth position corresponding to the position of the reference mirror. At this time, the reference mirror may be located outside the optical detection unit 110 or may be included in the optical detection unit 110.

The optical detector 110 may include a light splitter, a reference mirror, an optical coupler, a diffraction grating, and a line scan camera.

The light splitter separates the incident light into reference light and measurement light, and the reference mirror reflects the reference light. The optical coupler condenses the reference light reflected from the reference mirror and the measurement light reflected from the object into an optical fiber to generate interference light. The diffraction grating separates the interference light according to wavelength, and the line scan camera (LSC, line scan CCD) detects the wavelength distribution information of the interference light.

Next, the modeling unit 120 generates an estimation model from the wavelength distribution information of the interference light using computational intelligence, and includes a learning pattern configuration unit 121, a neural network modeling unit 122, and a fat layer thickness estimation unit 123. The modeling unit 120 also generates training data from the wavelength distribution information of the interference light collected through the scanning process over the depth positions of the reference mirror, and uses the generated learning data to model the correlation between the wavelength of the interference light and the subcutaneous fat layer thickness.

Here, computational intelligence means that the wavelength distribution information of the interference light at each depth position is divided into learning data and test data, the relationship between the wavelength distribution information and the subcutaneous fat layer thickness is learned from the learning data, and the resulting model is used to estimate the subcutaneous fat layer thickness of a subject using a neural network or fuzzy logic.

In the neural network, the input pattern and the output pattern are mapped to each other, and the network learns the relation between them through repetition of this process. An output pattern for a later input is then produced by finding the learned pattern group most similar to that input.

Fuzzy logic is a logic concept that expresses unclear or ambiguous states beyond binary logic; it can be modeled mathematically by expressing the degree to which each object belongs to a group as a membership function.

Finally, the computational intelligence estimator 130 estimates the subcutaneous fat layer thickness of the subject by applying the wavelength distribution information of the subject's interference light to the estimation model.

The computational intelligence estimator 130 may estimate the subcutaneous fat layer thickness of the target body by extracting the output pattern corresponding to the input wavelength distribution information of the target body's interference light. At this time, the depth position of the reference mirror and the wavelength at each depth position can be used as the wavelength distribution information.

Hereinafter, the method for estimating the subcutaneous fat layer thickness according to the embodiment of the present invention will be described in more detail with reference to FIGS. 2 through 4.

FIG. 2 is a flow chart for explaining a subcutaneous fat layer thickness estimation method according to an embodiment of the present invention, and FIG. 3 is a view for explaining a photodetector according to an embodiment of the present invention.

As shown in FIG. 2, steps S210 to S230 of the method constitute a learning step, and steps S240 to S260 constitute an estimation step in which the subcutaneous fat layer thickness of the subject is estimated using the learned estimation model. For convenience of explanation, the method is divided into these two stages.

First, the subcutaneous fat layer thickness estimation apparatus 100 generates an estimated model through steps S210 to S230, and performs learning.

The photodetector 110 separates incident light and transmits the separated light to the reference mirror 113 and the object 200 (S210).

As shown in FIG. 3, the optical detector 110 can separate the incident light into the reference light and the measurement light using the light splitter 111. The light splitter 111 transmits the reference light to the reference mirror 113, and transmits the measurement light to a part of the body of the subject 200 whose subcutaneous fat layer thickness is to be estimated.

Here, the object 200 may be a reference body whose subcutaneous fat layer thickness is known in advance; the apparatus 100 for estimating subcutaneous fat layer thickness according to an embodiment of the present invention learns the estimation model from the reference body and then estimates the subcutaneous fat layer thickness of a subject.

In addition, the optical detector 110 interferes the reference light with the measurement light, and detects the wavelength distribution information of the interference light (S220).

The optical detector 110 uses the optical coupler 115 to condense the reference light reflected from the reference mirror 113 and the measurement light reflected from the target object 200 into an optical fiber, generating interference light. The optical detector 110 then separates the interference light according to wavelength using the diffraction grating 117, and detects the wavelength distribution information of the interference light using the line scan camera (LSC) 119.

In Fig. 3, Path? Represents the path along which the reference light is generated from the light source. According to path (1), incident light incident from a light source is reflected by a light splitter (111) and transmitted to a reference mirror (113). The reference light reflected from the reference mirror 113 passes through the optical splitter 111 and is transmitted to the optical coupler 115.

Path (2) represents the path along which the measurement light is generated from the light source. Along path (2), the incident light from the light source passes through the light splitter 111 and is transmitted to a part of the body of the subject 200. The measurement light reflected from the subject 200 is reflected by the light splitter 111 and transmitted to the optical coupler 115.

The reference light produced along path (1) and the measurement light produced along path (2) interfere with each other at the optical coupler 115. The optical coupler 115 condenses the reference light and the measurement light into an optical fiber to generate interference light. The generated interference light is then separated according to wavelength by the diffraction grating 117. Next, the line scan camera 119 detects the wavelength distribution information of the separated interference light.
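The interference at the optical coupler can be sketched numerically. The following is an illustrative model, not part of the patent: for a single reflector at optical path difference Δz, with equal reference and measurement amplitudes and no dispersion, the detected spectrum is proportional to 1 + cos(2kΔz), where k is the wavenumber. All function and parameter names are hypothetical.

```python
import numpy as np

def interference_spectrum(delta_z_um, wavelengths_um):
    """Spectral intensity of interfering reference and measurement light
    for one reflector at optical path difference delta_z (micrometres),
    under the simplifying assumptions above: I(k) = 1 + cos(2*k*delta_z)."""
    k = 2.0 * np.pi / np.asarray(wavelengths_um)  # wavenumber per wavelength bin
    return 1.0 + np.cos(2.0 * k * delta_z_um)

# a hypothetical near-infrared band sampled over 1024 camera pixels
wavelengths = np.linspace(0.8, 0.9, 1024)
spectrum = interference_spectrum(delta_z_um=50.0, wavelengths_um=wavelengths)
```

Larger path differences produce faster spectral fringes, which is what lets the wavelength distribution encode depth information.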

Next, the modeling unit 120 generates an estimated model from the detected wavelength distribution information (S230).

The modeling unit 120 generates learning data from the wavelength distribution information of the interference light collected through the scanning process of the reference mirror 113, and models the estimation model by learning the correlation between the wavelength of the interference light and the subcutaneous fat layer thickness using the learning data. Here, the learning data are generated from the wavelength distribution information of the interference light at each depth position and are used for neural network learning.

The modeling unit 120 may include a learning pattern forming unit 121, a neural network modeling unit 122, and a fat layer thickness estimating unit 123.

The learning pattern constructing unit 121 generates, from the wavelength distribution information of the interference light at each depth position, the training data for neural network learning and the test data for evaluating the fitness of the learned model. A feature vector constituting each item of learning and test data may consist of input information and output information, or of input information only. That is, the input of the feature vector can be represented as one-, two- or three-dimensional information; when the feature vector is three-dimensional, it can be expressed as Equation (1):

[Equation (1), image not reproduced: feature vector of light intensities x indexed by depth position n and wavelength m]

Here, n denotes the depth position of the reference mirror 113, m denotes the wavelength of the interference light, and x denotes the intensity of light at the corresponding depth position. The intensity of the light may take a continuous value, and the continuous value may be quantized and expressed as a bit array of "0"s and "1"s.

A range of fat layer thicknesses can be specified by a combination of several outputs. For example, in the case of a neural network with three outputs, an output of (0,0,0) denotes the thinnest fat layer and (1,1,1) the thickest, so the thickness can be divided into eight steps.

The neural network modeling unit 122 receives the light intensity of the interference light at each depth position of the reference mirror 113 as input, and generates an estimation model whose output indicates whether the body part of the subject 200 that reflected the light at the corresponding depth position is fat or skin.

Further, the fat layer thickness estimating unit 123 learns the estimation model by mapping the learning data to the actual thickness range of the fat layer: for example, (0, 0, 0) when the thickness of the fat layer is 0 cm to 1 cm, (0, 0, 1) when it is 1 cm to 2 cm, and so on up to (1, 1, 1) for the thickest range. For convenience of explanation, a neural network with three outputs has been described as the basic model; however, the present invention is not limited thereto and can be applied with various combinations of outputs.
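The three-bit output encoding of thickness ranges described above can be illustrated with a short sketch. The helper names are hypothetical; the 1 cm step follows the example in the text.

```python
def thickness_to_bits(thickness_cm, step_cm=1.0, n_bits=3):
    """Quantise a fat-layer thickness into one of 2**n_bits ranges and
    encode the range index as a bit tuple, e.g. (0,0,0) .. (1,1,1)."""
    idx = min(int(thickness_cm // step_cm), 2 ** n_bits - 1)
    return tuple((idx >> (n_bits - 1 - b)) & 1 for b in range(n_bits))

def bits_to_range(bits, step_cm=1.0):
    """Decode a bit tuple back to its (lower, upper) thickness range in cm."""
    idx = int("".join(map(str, bits)), 2)
    return idx * step_cm, (idx + 1) * step_cm
```

For example, a 1.5 cm fat layer falls in the 1 cm to 2 cm range and encodes as (0, 0, 1), matching the mapping in the text.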

The feature vector of the wavelength distribution of the interference light according to each depth position for the neural network modeling can be expressed as Equation (2).

[Equation (2), image not reproduced: wavelength distribution feature vector of the interference light at each depth position, with input and output components]

When developing a neural network model using learning data and test data with these feature vectors, the total number of input neurons of the neural network is determined by l + m + n, and the output can be composed of combinations of 1 to l neurons. As described above, the wavelength distribution vector of the interference light at each depth position can also be composed of input information only; in this case, the vector used in developing the neural network model is expressed by Equation (3):

[Equation (3), image not reproduced: wavelength distribution vector composed of input information only]

Noise may be included in the wavelength distribution vector of the interference light at each depth position. In learning pattern filtering, a signal processing technique that reduces the noise of the feature vectors at the prepared depth positions can be applied.

For example, noise can be reduced by applying a wavelet technique; since such noise reduction techniques are well understood by those skilled in the art, a detailed description is omitted.
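As a minimal stand-in for the wavelet-based denoising mentioned above (not the patent's technique), a simple moving-average filter applied to a feature vector looks like this; the function name and window size are illustrative assumptions.

```python
import numpy as np

def smooth(signal, window=5):
    """Moving-average filter: an illustrative stand-in for the wavelet
    noise reduction mentioned in the text, not the patented method."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(signal, float), kernel, mode="same")
```

A real implementation would more likely use wavelet shrinkage, which preserves sharp spectral fringes better than a plain moving average.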

The learning pattern and the test pattern configured through the learning pattern configuration unit 121 are used for developing a neural network model.

Neural networks are divided into supervised and unsupervised types, and in some cases a combination of the two is used. In a supervised neural network, the feature vector is composed of an input pattern and an output pattern; in an unsupervised neural network, the feature vector consists of the input pattern only. The most widely used supervised structure is the back propagation neural network; others include the radial basis function network and the generalized regression neural network.

FIG. 4 is a diagram for explaining the structure of a back propagation neural network according to an embodiment of the present invention.

As shown in FIG. 4, the back propagation neural network consists of an input layer, a hidden layer, and an output layer. The number of hidden layers is usually one, but two or more can be used. The number of input layer neurons equals the number of elements constituting the feature vector, that is, l + m + n, and the number of output layer neurons is three. In some cases, a structure in which a bias neuron is connected to the neurons of the hidden and output layers can be used. The most typical learning algorithm for back propagation neural networks is the generalized delta rule, which is applied here for learning.

The back propagation neural network presents the input pattern to the input layer of the network. The network propagates the input pattern forward, layer by layer, until an output pattern is produced at the output layer. If the output pattern differs from the target pattern, the error is calculated and propagated backward through the network from the output layer to the input layer, and the weights are modified as the error propagates.
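The forward pass, backward error propagation, and weight update described above can be sketched as follows. This is a minimal one-hidden-layer network trained with the generalized delta rule (gradient descent on squared error) over toy stand-in data, not the patented system; all names, sizes, and the random data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BackpropNet:
    """One-hidden-layer back propagation network (generalized delta rule)."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.w1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        # propagate the input pattern layer by layer to the output layer
        self.h = sigmoid(x @ self.w1 + self.b1)
        self.y = sigmoid(self.h @ self.w2 + self.b2)
        return self.y

    def train_step(self, x, target):
        y = self.forward(x)
        # output-layer delta, then propagate the error back to the hidden layer
        d_out = (y - target) * y * (1.0 - y)
        d_hid = (d_out @ self.w2.T) * self.h * (1.0 - self.h)
        # modify weights as the error propagates backward
        self.w2 -= self.lr * np.outer(self.h, d_out)
        self.b2 -= self.lr * d_out
        self.w1 -= self.lr * np.outer(x, d_hid)
        self.b1 -= self.lr * d_hid
        return float(np.sum((y - target) ** 2))

# toy stand-in data: 4 "spectra" of 8 intensities mapped to 3-bit codes
X = rng.random((4, 8))
T = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1]], dtype=float)
net = BackpropNet(8, 6, 3)
losses = [sum(net.train_step(x, t) for x, t in zip(X, T)) for _ in range(2000)]
```

In the patent's setting the input size would be l + m + n spectral features rather than 8, and the three sigmoid outputs would be thresholded to the bit codes described above.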

The estimation performance of a back propagation neural network can be improved by optimizing its learning. Factors that affect learning include the learning rate, the number of hidden neurons, the initial weights, and the slope of the neuron activation function. To optimize learning performance, the value of each factor is varied, and an optimization technique such as a genetic algorithm can be applied.

The modeling process consists of learning and testing. In the learning process, the model is developed using the generalized delta rule and the learning data described above; after learning, the estimation performance of the model is evaluated using the test data. The root-mean-squared error (RMSE) is generally used for evaluation, and other evaluation measures based on the difference between the model's predicted value and the actual value can also be used.
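The RMSE criterion mentioned above can be computed as follows; the function is a standard definition, not code from the patent.

```python
import numpy as np

def rmse(predicted, actual):
    """Root-mean-squared error between model predictions and targets."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.sqrt(np.mean((predicted - actual) ** 2)))
```

A lower RMSE on the held-out test data indicates a better-fitting estimation model.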

For convenience of explanation, the model developed by the neural network modeling unit 122 is referred to as the estimation model. Since the feature vectors are composed of input and output patterns, the neural network modeling unit 122 applies a supervised neural network.

Next, in steps S240 through S260, the subcutaneous fat layer thickness estimation apparatus 100 estimates the subcutaneous fat layer thickness of the subject using the learned estimation model.

The photodetector 110 separates the incident light and transmits the separated light to the reference mirror 113 and the object 200 (S240).

Here, the object 200 means the subject whose subcutaneous fat layer thickness is to be estimated. Step S240 is substantially the same as step S210 of the learning stage, so redundant explanation is omitted.

The optical detector 110 interferes the reference light with the measurement light, and detects the wavelength distribution information of the interference light (S250).

In step S250, the process of interfering the reference light with the measurement light reflected from the subject 200 and detecting the wavelength distribution information of the interference light is substantially the same as in step S220, so a duplicate description is omitted.

Finally, the computational intelligence estimator 130 estimates the subcutaneous fat layer thickness of the subject 200 (S260).

The computational intelligence estimator 130 inputs test data into the estimation model, which has learned the correlation between the wavelength of the interference light and the subcutaneous fat layer thickness from the learning data, and estimates the subcutaneous fat layer thickness of the object 200 using the output pattern produced as a result. The test data include the wavelength distribution information of the interference light at each depth position of the reference mirror 113, where the interference light includes the measurement light reflected from the object 200 whose subcutaneous fat layer thickness is to be estimated.

At this time, the computational intelligence estimator 130 may input the wavelength distribution information of the received interference light into the estimation model separately for each depth position of the reference mirror 113, or may input the wavelength distribution information of the interference light received successively at each depth position of the reference mirror 113 into the estimation model at once as a two-dimensional array, thereby estimating the subcutaneous fat layer thickness of the target body 200.

First, in the case of inputting into the estimation model for each depth position of the reference mirror 113, the computational intelligence estimator 130 inputs the wavelength distribution information of the interference light received at each depth position of the reference mirror 113 into the estimation model and determines whether the body part of the object 200 corresponding to that depth position consists of fat or skin.

For example, if the estimation model outputs On when a region is judged to be fat and Off when it is judged to be skin, the computational intelligence estimator 130 can estimate the subcutaneous fat layer thickness of the subject 200 by extracting the section in which On is output and using the number of consecutive On outputs.
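The run-counting described above can be sketched as follows; `estimate_thickness_from_outputs` and the depth-step value are hypothetical names and figures introduced for illustration only, not part of the disclosed apparatus:

```python
def estimate_thickness_from_outputs(outputs, depth_step_cm):
    """Estimate subcutaneous fat layer thickness from per-depth On/Off outputs.

    outputs: model output at each reference-mirror depth position,
             True (On, judged fat) or False (Off, judged skin/other).
    depth_step_cm: depth increment between successive mirror positions.
    """
    longest = current = 0
    for on in outputs:
        # Count consecutive On outputs; reset the run on Off.
        current = current + 1 if on else 0
        longest = max(longest, current)
    # Thickness = length of the longest fat run times the depth step.
    return longest * depth_step_cm

# Skin for 2 positions, fat for 5 positions, then deeper tissue:
outputs = [False, False, True, True, True, True, True, False]
thickness = estimate_thickness_from_outputs(outputs, depth_step_cm=0.1)
```

Here five consecutive On outputs at a 0.1 cm mirror step correspond to an estimated fat layer of 0.5 cm.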

In addition, the computational intelligence estimator 130 can measure the thickness of visceral fat and retroperitoneal fat, which lie deeper than the subcutaneous fat layer, using the On and Off output patterns.

Next, when the wavelength distribution information of the interference light is input into the estimation model at once as a two-dimensional array, the computational intelligence estimator 130 can estimate the subcutaneous fat layer thickness of the object 200 using the value of the output pattern.

For example, suppose the modeling unit 120 generates an estimation model having three outputs as the basic model in step S230. If the output pattern in step S260 is (0, 0, 0), the computational intelligence estimator 130 determines the subcutaneous fat layer thickness to be 0 cm to 1 cm; an output pattern of (0, 0, 1) corresponds to 1 cm to 2 cm, (0, 1, 0) to 2 cm to 3 cm, and so on, so that when the output pattern is (1, 1, 1) the computational intelligence estimator 130 can estimate the subcutaneous fat layer thickness to be 7 cm to 8 cm.
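Read this way, the three-output pattern behaves like a binary index into thickness ranges. A minimal sketch of that decoding, assuming the mapping given in the example above (the function name is hypothetical):

```python
def decode_output_pattern(pattern):
    """Map a 3-output pattern to a thickness range in cm.

    Interprets the pattern as a binary number with the first output as the
    most significant bit: (0,0,0) -> 0-1 cm, (0,0,1) -> 1-2 cm,
    (0,1,0) -> 2-3 cm, ..., (1,1,1) -> 7-8 cm.
    """
    index = pattern[0] * 4 + pattern[1] * 2 + pattern[2]
    return (index, index + 1)  # (lower bound, upper bound) in cm
```

Three binary outputs thus distinguish eight thickness ranges covering 0 cm to 8 cm.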

The modeling unit 120 may generate estimation models having various numbers of outputs, and the computational intelligence estimator 130 may use them. For example, an estimation model with more outputs can be used to estimate a wider range of subcutaneous fat layer thicknesses, or to estimate the thickness more precisely.

As described above, according to the embodiment of the present invention, the apparatus and method for estimating subcutaneous fat layer thickness make it possible to estimate subcutaneous fat layers of 7 mm or more, whose thickness could not be estimated by infrared scattering. In addition, a non-expert can quickly and easily estimate the subcutaneous fat layer thickness of a subject.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments, and that various modifications may be made without departing from the scope of the appended claims.

100: Subcutaneous fat layer thickness estimation apparatus 110: photodetector
111: beam splitter 113: reference mirror
115: optical coupler 117: diffraction grating
119: line scan camera 120: modeling unit
130: Computational intelligence estimator 200: Object

Claims (14)

An optical detector for separating incident light from a light source into reference light and measurement light, generating interference light by interfering the reference light reflected from a reference mirror with the measurement light reflected from a target object, and detecting wavelength distribution information of the interference light,
A modeling unit for generating an estimation model using the computational intelligence from the wavelength distribution information of the interference light,
And a computational intelligence estimator for estimating the thickness of subcutaneous fat of the subject from the estimation model,
The photodetector unit includes:
A beam splitter for separating the incident light into the reference light and the measurement light,
An optical coupler for condensing the reference light reflected from the reference mirror and the measurement light reflected from the object with an optical fiber to generate the interference light,
A diffraction grating for separating the interference light according to the wavelength, and
And a line scan camera (LSC) for detecting wavelength distribution information of the interference light.
The apparatus according to claim 1,
The reference mirror moves in a direction perpendicular to the direction in which the incident light is irradiated and reflects the reference light,
The photodetector unit includes:
Wherein the wavelength distribution information of the interference light is detected corresponding to the position of the reference mirror.
3. The apparatus of claim 2,
Wherein the path lengths of the reference light and the measurement light are the same.
delete
The apparatus of claim 2,
The modeling unit,
Generates learning data from the wavelength distribution information of the interference light collected through the scanning of the reference mirror, and learns the correlation between the wavelength of the interference light and the subcutaneous fat layer thickness using the learning data.
The apparatus according to claim 1,
Wherein the computational intelligence estimator applies the wavelength distribution information of the interference light of the object to the estimation model to estimate the thickness of the subcutaneous fat layer of the subject.
The apparatus according to claim 1,
Wherein the computational intelligence includes a neural network or fuzzy logic having a data learning function.
A method for estimating subcutaneous fat layer thickness using an apparatus for estimating subcutaneous fat layer thickness,
Detecting wavelength distribution information of interference light by separating incident light from the light source into reference light and measurement light and generating the interference light by interfering the measurement light reflected from the object with the reference light reflected from the reference mirror,
Generating an estimation model using computational intelligence from the wavelength distribution information of the interference light, and
Estimating a thickness of subcutaneous fat of the subject from the estimation model,
Wherein the step of detecting the wavelength distribution information of the interference light comprises:
Separating the incident light into the reference light and the measurement light, generating the interference light by condensing, with an optical fiber, the reference light reflected from the reference mirror and the measurement light reflected from the object, separating the interference light according to wavelength, and detecting the wavelength distribution information of the interference light.
9. The method of claim 8,
The reference mirror moves in a direction perpendicular to the direction in which the incident light is irradiated and reflects the reference light,
Wherein the step of detecting the wavelength distribution information of the interference light comprises:
Wherein the wavelength distribution information of the interference light is detected corresponding to the position of the reference mirror.
10. The method of claim 9,
Wherein the path lengths of the reference light and the measurement light are the same.
delete
The method of claim 9,
Wherein the step of generating the estimation model comprises:
Generating learning data from the wavelength distribution information of the interference light collected through the scanning of the reference mirror, and learning the correlation between the wavelength of the interference light and the subcutaneous fat layer thickness using the learning data.
9. The method of claim 8,
Wherein the step of estimating the thickness of subcutaneous fat of the subject comprises:
And estimating a subcutaneous fat layer thickness of the subject by applying wavelength distribution information of the interference light of the subject to the estimation model.
9. The method of claim 8,
Wherein the computational intelligence comprises a neural network or fuzzy logic having a data learning function.
KR1020150081946A 2015-06-10 2015-06-10 Apparatus and Method for Estimating of Thickness of Subcutaneous Fat KR101567281B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150081946A KR101567281B1 (en) 2015-06-10 2015-06-10 Apparatus and Method for Estimating of Thickness of Subcutaneous Fat


Publications (1)

Publication Number Publication Date
KR101567281B1 true KR101567281B1 (en) 2015-11-10

Family

ID=54605373

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150081946A KR101567281B1 (en) 2015-06-10 2015-06-10 Apparatus and Method for Estimating of Thickness of Subcutaneous Fat

Country Status (1)

Country Link
KR (1) KR101567281B1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180067961A (en) * 2016-12-13 2018-06-21 세종대학교산학협력단 Apparatus for estimating of thickness of subcutaneous fat using swept source interferometry and method thereof
WO2020146490A1 (en) * 2019-01-08 2020-07-16 Inkbit, LLC Depth reconstruction in additive fabrication
US10830578B2 (en) 2018-10-19 2020-11-10 Inkbit, LLC High-speed metrology
US10926473B1 (en) 2020-02-20 2021-02-23 Inkbit, LLC Multi-material scanning for additive fabrication
US10974460B2 (en) 2019-01-08 2021-04-13 Inkbit, LLC Reconstruction of surfaces for additive manufacturing
US10994477B1 (en) 2019-11-01 2021-05-04 Inkbit, LLC Optical scanning for industrial metrology
US10994490B1 (en) 2020-07-31 2021-05-04 Inkbit, LLC Calibration for additive manufacturing by compensating for geometric misalignments and distortions between components of a 3D printer
US11347908B2 (en) 2018-11-02 2022-05-31 Inkbit, LLC Intelligent additive manufacturing
US11354466B1 (en) 2018-11-02 2022-06-07 Inkbit, LLC Machine learning for additive manufacturing
US11667071B2 (en) 2018-11-16 2023-06-06 Inkbit, LLC Inkjet 3D printing of multi-component resins
KR20230102449A (en) 2021-12-30 2023-07-07 (주)성일이노텍 Smart magnetic stimulation apparatus and operating method of the same
US11712837B2 (en) 2019-11-01 2023-08-01 Inkbit, LLC Optical scanning for industrial metrology
EP4142873A4 (en) * 2020-04-27 2024-05-15 Profound Medical Inc. Automated magnetic resonance image segmentation for ultrasound thermal therapy control


Similar Documents

Publication Publication Date Title
KR101567281B1 (en) Apparatus and Method for Estimating of Thickness of Subcutaneous Fat
KR101891075B1 (en) Apparatus for estimating of thickness of subcutaneous fat using swept source interferometry and method thereof
EP2651309B1 (en) Medical imaging devices, methods, and systems
US8094149B2 (en) Multi-spectral reconstruction method based on adaptive finite element
AU2019333924A1 (en) Apparatus and process for medical imaging
IT202100027281A1 (en) A COMPUTER-IMPLEMENTED METHOD AND A SYSTEM FOR ESTIMATING A PITH LOCATION WITH REGARD TO A TIMBER BOARD
KR102091568B1 (en) Method and system for detecting wrinkle of skin using optical coherence tomography
Olefir et al. A Bayesian approach to eigenspectra optoacoustic tomography
CN115581436B (en) High-resolution near-infrared brain function tomography algorithm fused with deep learning
CN103815929B (en) Subject information acquisition device
Ullah et al. A machine learning-based classification method for monitoring Alzheimer’s disease using electromagnetic radar data
US20220245430A1 (en) Method and system for confidence estimation of a trained deep learning model
Aguiam et al. Estimation of X-mode reflectometry first fringe frequency using neural networks
CN117333569A (en) Light scattering mammary gland diagnostic system
Avanaki et al. De-noising speckled optical coherence tomography images using an algorithm based on artificial neural network
CN117694885B (en) Method, device, system and medium for detecting blood oxygen related parameters of muscle tissue
CN117607155B (en) Strain gauge appearance defect detection method and system
Gao et al. PI-VEGAN: Physics Informed Variational Embedding Generative Adversarial Networks for Stochastic Differential Equations
Benedetti et al. An adaptive multiscaling imaging technique based on a fuzzy-logic strategy for dealing with the uncertainty of noisy scattering data
KR102397104B1 (en) Method of Detecting and Localizing People inside Vehicle Using Impulse Radio Ultra-Wideband Radar
CN111316087B (en) Photoacoustic method for determining characteristics of inhomogeneous samples using measurement light having a predetermined wavelength range
Tambe 3D PHOTOACOUSTIC TOMOGRAPHY SETUP CALIBRATION
CN118121196A (en) HCT prediction method, system, equipment and medium
Mao Reconstruction of the spatial structure of cell with Brillouin ultrasonic imaging
CN114820858A (en) Algorithm for calculating skin tissue optical parameter distribution

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20181023

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20191014

Year of fee payment: 5