CN113362267A - Inspection and judgment system and method combining optical image and thermal image

Info

Publication number
CN113362267A
Authority
CN
China
Prior art keywords
features
thermal
optical image
field
model
Prior art date
Legal status
Pending
Application number
CN202010104231.2A
Other languages
Chinese (zh)
Inventor
黄彦铭
Current Assignee
Aike Technology Co ltd
Original Assignee
Aike Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Aike Technology Co ltd filed Critical Aike Technology Co ltd
Priority to CN202010104231.2A
Publication of CN113362267A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing

Abstract

An inspection and determination system combining optical images and thermal images comprises: an optical receiving unit that receives optical image information of a field; a thermal energy receiving unit that receives thermal image information of the field; an operation unit that processes the optical image information according to a first model to extract optical image features, processes the thermal image information according to a second model to extract thermal image features, combines the optical image features and the thermal image features according to a third model to obtain coupling features, evaluates the likelihood of different conditions of the field, and derives the most probable inspection judgment result; and a display unit that displays the inspection judgment result.

Description

Inspection and judgment system and method combining optical image and thermal image
Technical Field
The present invention relates to an inspection and determination system, and more particularly, to an inspection and determination system and method combining optical images and thermal images.
Background
In the prior art, when caring for a patient's wound, medical staff often observe the wound with the naked eye and describe its condition in written records. However, naked-eye identification of a wound is not only prone to misjudgment but may also be distorted by the caregiver's subjective perception. Furthermore, although visual observation can roughly indicate the size or depth of a wound, conditions of the wound such as tissue necrosis or inflammation are easily misjudged and can only be assessed empirically by the medical practitioner.
Therefore, if a tool capable of automatically analyzing the state of a wound could be developed, the wound could be assessed accurately and objectively, medical staff would have a sound basis for wound care, the chance of medical negligence would be greatly reduced, and the quality of medical care would be improved.
Disclosure of Invention
To solve the problems of the prior art, the present invention provides an inspection and determination system combining optical images and thermal images, comprising: an optical receiving unit that receives optical image information of a field; a thermal energy receiving unit that receives thermal image information of the field; an operation unit that processes the optical image information according to a first model to extract optical image features, processes the thermal image information according to a second model to extract thermal image features, combines the optical image features and the thermal image features according to a third model to obtain coupling features, evaluates the likelihood of different conditions of the field, and derives the most probable inspection judgment result; and a display unit that displays the inspection judgment result.
In an embodiment, the inspection and determination system further includes a wireless transmission unit for transmitting the optical image information received by the optical receiving unit, or the thermal image information of the field received by the thermal energy receiving unit, to the operation unit.
In an embodiment, the first model, the second model and the third model are built into the operation unit.
In an embodiment, the first model comprises a convolutional neural network, and the extracted optical image features comprise one-dimensional results and two-dimensional results, wherein the one-dimensional results comprise a preliminary grading and a determination of whether the image is valid, and the two-dimensional results comprise a field of interest (ROI) and surrounding environment features.
In an embodiment, the second model comprises a convolutional neural network, and the extracted thermal image features comprise one-dimensional results and two-dimensional results, wherein the one-dimensional results comprise a preliminary grading and a determination of whether the image is valid, and the two-dimensional results comprise a field of interest (ROI) and surrounding environment features.
In an embodiment, the third model includes a convolutional neural network and a multilayer perceptron. The third model generates new features by using the field of interest as a convolution mask, adjusts feature weights according to the preliminary grading, and repeats, arranges and combines the one-dimensional features to generate new features. The coupling performed by the third model includes shifting, scaling and adjusting the field of interest, and an integrated judgment of the feature data produces the judgment result.
The invention also provides an inspection and determination method combining an optical image and a thermal image, comprising the following steps: receiving optical image information of a field; receiving thermal image information of the field; processing the optical image information by using a convolutional neural network to extract optical image features; processing the thermal image information by using a convolutional neural network to extract thermal image features; and combining the optical image features and the thermal image features by using a convolutional neural network and a multilayer perceptron to obtain coupling features.
In an embodiment, the extracted optical image features include one-dimensional results and two-dimensional results, wherein the one-dimensional results include a preliminary grading and a determination of whether the image is valid, and the two-dimensional results include a field of interest (ROI) and surrounding environment features.
In an embodiment, the extracted thermal image features include one-dimensional results and two-dimensional results, wherein the one-dimensional results include a preliminary grading and a determination of whether the image is valid, and the two-dimensional results include a field of interest (ROI) and surrounding environment features.
In an embodiment, the step of combining the optical image features and the thermal image features further includes: generating new features by using the field of interest as a convolution mask; adjusting feature weights according to the preliminary grading of the one-dimensional results; and repeating, arranging and combining the one-dimensional features to generate new features.
By using the optical receiving unit and the thermal energy receiving unit together, when the optical image alone cannot be judged because of texture, interference or similar conditions, the thermal image assists the optical image in locating the affected part, and the two images are combined to form the current inspection judgment, so that the state of the affected part can be judged accurately.
Drawings
FIG. 1 is a block diagram of an inspection and determination system according to the present invention.
FIG. 2 shows the flow executed by the operation unit.
FIG. 3 shows the operation flow of the inspection and determination system in the standalone or networked version.
FIG. 4 is a flowchart illustrating an inspection and determination method according to the present invention.
FIG. 5 is a schematic diagram of a more specific embodiment of the inspection and determination system and method of the present invention.
Description of the symbols:
inspection and judgment system 1
Optical receiving unit 11
Thermal energy receiving unit 12
Operation unit 13
Display unit 14
Wireless transmission unit 15
Power management unit 16
First model 21
Optical image feature 210
Second model 22
Thermal image feature 220
Third model 23
Checking the determination result 230
Area A
Central heat source areas a1-a8
Attention areas a9-a16
Other area a17
Heat source blocks H1, H2
Image blocks V1, V2
Steps S01-S05
Steps S11-S15
Detailed Description
The following description of the embodiments of the present invention is provided for illustrative purposes, and other advantages and effects of the present invention will become apparent to those skilled in the art from the present disclosure. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways.
It should be understood that the structures, ratios and sizes shown in the drawings and described in this specification are provided only to aid understanding and reading by those familiar with the art, and are not intended to limit the conditions under which the invention may be practiced; modifications of structure, changes of ratio or adjustments of size that do not affect the efficacy or the objectives attainable by the invention shall still fall within the scope of the disclosed technical content. Likewise, terms such as "above", "inside", "outside", "bottom" and "one" used in this specification are for clarity of description only and are not intended to limit the scope of the invention; changes or adjustments of their relative relationships, without substantial changes to the technical content, are also to be regarded as falling within the scope of the invention.
Referring to fig. 1, an architecture diagram of an inspection and determination system 1 combining optical images and thermal images according to the present invention is shown. The inspection and determination system 1 can be applied, for example, to the examination of an affected part on the body surface of a warm-blooded animal, such as a wound, but is not limited thereto. In one embodiment, the inspection and determination system 1 mainly includes an optical receiving unit 11, a thermal energy receiving unit 12, an operation unit 13 and a display unit 14.
The optical receiving unit 11 receives optical image information of a field, and the thermal energy receiving unit 12 receives thermal image information of the same field. In one embodiment, the optical receiving unit 11 includes an optical image sensor whose measuring band may cover visible light or near-infrared light, and the thermal energy receiving unit 12 includes a thermal sensor. When the inspection and determination system 1 is applied to the examination of an affected part on the body surface of a warm-blooded animal, the field may include a wound and the surrounding skin or tissue.
Referring to fig. 2, fig. 2 shows the flow executed by the operation unit 13. First, the optical image information is processed according to the first model 21 to extract the optical image features 210. Meanwhile, the thermal image information is processed according to the second model 22 to extract the thermal image features 220. Then, the optical image features and the thermal image features are combined according to the third model 23 to obtain the coupling features. In other words, after the features extracted by the first model 21 and the second model 22 are integrated, the third model 23 analyzes the coupling features to obtain the likelihood of the various conditions of the sensed field and finally deduces the most probable inspection judgment result 230.
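The flow of fig. 2 can be summarized as three successive model calls followed by a selection of the most probable condition. The following is a minimal sketch of that flow; the function and variable names are illustrative placeholders and do not appear in the patent itself.

```python
# Minimal sketch of the flow in fig. 2. All names are illustrative placeholders;
# the concrete models are described in the embodiments below.

def run_inspection(optical_image, thermal_image, first_model, second_model, third_model):
    """Return the most probable inspection judgment for the sensed field."""
    optical_features = first_model(optical_image)    # first model 21 -> optical image features 210
    thermal_features = second_model(thermal_image)   # second model 22 -> thermal image features 220
    # third model 23 couples the two feature sets and yields a mapping from
    # each candidate condition of the field to its probability
    probabilities = third_model(optical_features, thermal_features)
    # the inspection judgment result 230 is the most probable condition
    judgment = max(probabilities, key=probabilities.get)
    return judgment, probabilities
```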
In one embodiment, the first model 21 is a convolutional neural network, and the extracted optical image features 210 include one-dimensional results and two-dimensional results, wherein the one-dimensional results include a preliminary grading and a determination of whether the image is valid, and the two-dimensional results include a field of interest (ROI) and surrounding environment features.
In one embodiment, the second model 22 is likewise a convolutional neural network, and the extracted thermal image features 220 include one-dimensional results, namely a preliminary grading and a determination of whether the image is valid, and two-dimensional results, namely a field of interest (ROI) and surrounding environment features.
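A convolutional network with separate one-dimensional and two-dimensional output heads of this kind could be sketched as below. This is only an illustration under assumed layer sizes; the class name, channel counts and number of grading classes are not specified by the patent. The same structure can serve as the first model 21 (optical input) or the second model 22 (thermal input) by changing the number of input channels.

```python
import torch
import torch.nn as nn

class DualHeadFeatureExtractor(nn.Module):
    """Illustrative sketch of the first model 21 / second model 22: a CNN that
    outputs the one-dimensional results (preliminary grading, image validity)
    and the two-dimensional results (ROI map, surrounding-environment map)."""

    def __init__(self, in_channels=3, num_grades=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.grade_head = nn.Linear(32, num_grades)       # 1-D result: preliminary grading
        self.valid_head = nn.Linear(32, 1)                # 1-D result: is the image valid?
        self.roi_head = nn.Conv2d(32, 1, kernel_size=1)   # 2-D result: field of interest (ROI)
        self.env_head = nn.Conv2d(32, 8, kernel_size=1)   # 2-D result: surrounding environment

    def forward(self, x):
        feat = self.backbone(x)
        vec = self.pool(feat).flatten(1)
        return {
            "grade": self.grade_head(vec),
            "valid": torch.sigmoid(self.valid_head(vec)),
            "roi": torch.sigmoid(self.roi_head(feat)),
            "env": self.env_head(feat),
        }

# e.g. first_model = DualHeadFeatureExtractor(in_channels=3)   # visible/NIR image
#      second_model = DualHeadFeatureExtractor(in_channels=1)  # thermal image
```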
In one embodiment, the third model 23 includes a convolutional neural network and a multilayer perceptron. The third model 23 generates new features by using the field of interest as a convolution mask, adjusts feature weights according to the preliminary grading, and repeats, arranges and combines the one-dimensional features to generate new features. In this way the third model 23 jointly considers the optical image features 210 and the thermal image features 220, effectively couples the two to help locate the field of interest (ROI) to be examined in the image, and forms the coupling result directly inside the third model 23 to finally determine the probability distribution over the different conditions. It should be noted that, because the hardware characteristics of the devices collecting the optical image information and the thermal image information differ, the coupling performed by the third model 23 includes shifting, scaling and adjusting the field of interest, followed by an integrated judgment of the feature data to generate the judgment result.
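One possible realization of such a coupling model is sketched below, assuming feature dictionaries with the layout of the previous sketch. The bilinear resampling used here to shift and scale the thermal maps onto the optical grid, as well as the layer sizes, are simplifying assumptions; a real device would calibrate the geometric relation between the two sensors.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CouplingModel(nn.Module):
    """Illustrative sketch of the third model 23: a CNN plus a multilayer
    perceptron that fuses the optical and thermal feature dictionaries."""

    def __init__(self, env_channels=8, num_grades=4, num_conditions=5):
        super().__init__()
        self.fuse_conv = nn.Conv2d(2 * env_channels, 16, kernel_size=3, padding=1)
        self.mlp = nn.Sequential(
            nn.Linear(16 + 2 * num_grades + 2, 32), nn.ReLU(),
            nn.Linear(32, num_conditions),
        )

    def forward(self, opt, thm):
        # shift/scale the thermal maps onto the optical grid (placeholder:
        # plain bilinear resampling instead of a calibrated transform)
        size = opt["roi"].shape[-2:]
        thm_roi = F.interpolate(thm["roi"], size=size, mode="bilinear", align_corners=False)
        thm_env = F.interpolate(thm["env"], size=size, mode="bilinear", align_corners=False)
        # use the fields of interest as convolution masks to generate new 2-D features
        masked = torch.cat([opt["env"] * opt["roi"], thm_env * thm_roi], dim=1)
        fused_2d = F.relu(self.fuse_conv(masked)).mean(dim=(-2, -1))
        # arrange the 1-D features; the softmaxed preliminary gradings act as weights
        one_d = torch.cat([opt["grade"].softmax(-1), thm["grade"].softmax(-1),
                           opt["valid"], thm["valid"]], dim=1)
        # integrated judgment: probability of each candidate condition of the field
        return self.mlp(torch.cat([fused_2d, one_d], dim=1)).softmax(-1)
```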
After the operation unit 13 has computed, according to the models, the probability distribution over the different conditions of the sensed field, the inspection judgment result 230 can be displayed on the display unit 14 shown in fig. 1.
In one embodiment, as shown in fig. 1, the checking and determining system 1 further includes a wireless transmission unit 15 and a power management unit 16. The wireless transmission unit 15 is an optional component for transmitting the optical image information received by the optical receiving unit 11 or the thermal image information of the field received by the thermal receiving unit 12 to the operation unit 13. The power management unit 16 can provide the power required by the operation of the optical receiving unit 11, the thermal energy receiving unit 12, the computing unit 13, the display unit 14 or the wireless transmission unit 15 according to the actual requirement.
It should be noted that the inspection and determination system 1 provided by the present invention can be designed as a standalone version or a networked version. In the standalone version, the first model 21, the second model 22 and the third model 23 are built into the operation unit 13, and the user obtains the inspection judgment result directly through local computation. In the networked version, after the optical image information and the thermal image information are received, they are transmitted through the wireless transmission unit 15 to an external operation unit 13 to obtain the inspection judgment result.
Referring to fig. 3, it shows the operation flow of the checking and determining system 1 in the standalone version or the networking version according to the present invention, the related steps are as follows:
S01: the operation unit 13 obtains the thermal image information and the optical image information of the current field from the thermal energy receiving unit 12 and the optical receiving unit 11;
S02: determine whether a standalone computing capability is available;
S03: if yes, the operation unit 13 processes the optical image information and the thermal image information through the first model 21 and the second model 22, respectively, and couples the two through the third model 23 to compute the inspection judgment result;
S04: if not, the thermal image information and the optical image information are transmitted to the external operation unit 13 through the wireless transmission unit 15, and the computed inspection judgment result is returned;
S05: finally, the result is presented via the display unit 14.
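Steps S01-S05 amount to a short control flow that chooses between local and remote computation. A minimal sketch is given below; the `device` object and its method names are assumptions made only for illustration and do not come from the patent.

```python
def inspect(optical_image, thermal_image, device):
    """Sketch of steps S01-S05. `device` is an assumed object bundling the
    operation unit 13, the wireless transmission unit 15 and the display
    unit 14 of fig. 1; its method names are illustrative only."""
    # S01: optical and thermal information of the current field already acquired
    if device.has_local_models():                      # S02: standalone capability?
        # S03: local processing with the first/second models, coupling by the third model
        result = device.run_models(optical_image, thermal_image)
    else:
        # S04: upload both images through the wireless transmission unit and
        # receive the judgment computed by the external operation unit
        result = device.upload_and_wait(optical_image, thermal_image)
    device.display(result)                             # S05: present the result
    return result
```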
It should be noted that the inspection and judgment system 1 provided by the present invention is not limited to be constructed in a single device, but may be constructed in a plurality of different devices. For example, the computing unit 13 may be disposed on the cloud or on a server, and the user receives the optical image information and the thermal image information through the optical receiving unit 11 and the thermal receiving unit 12 on the handheld device, and uploads the optical image information and the thermal image information to the computing unit 13 through the wireless transmission unit 15. Alternatively, the optical receiving unit 11 and the thermal receiving unit 12 may be disposed on different devices, for example, the thermal receiving unit 12 may be disposed in a separate thermal image sensing device, and transmit the received thermal image information through a transmission module of the thermal image sensing device.
Referring to fig. 4, the present invention also provides a method for checking and determining a combination of optical images and thermal images, comprising the following steps:
S11: receiving optical image information of a field;
S12: receiving thermal image information of the field;
S13: processing the optical image information by using a convolutional neural network to extract optical image features;
S14: processing the thermal image information by using a convolutional neural network to extract thermal image features; and
S15: combining the optical image features and the thermal image features by using a convolutional neural network and a multilayer perceptron to obtain the coupling features.
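Putting steps S11-S15 together with the two sketches given earlier, a hypothetical end-to-end call could look as follows; the tensor shapes and class names are again illustrative assumptions only.

```python
import torch

# Assumed classes from the earlier sketches (DualHeadFeatureExtractor, CouplingModel).
optical_model = DualHeadFeatureExtractor(in_channels=3)   # S13: optical feature extraction
thermal_model = DualHeadFeatureExtractor(in_channels=1)   # S14: thermal feature extraction
coupling_model = CouplingModel()                          # S15: feature coupling

optical_image = torch.rand(1, 3, 224, 224)   # S11: received optical image (dummy tensor)
thermal_image = torch.rand(1, 1, 120, 160)   # S12: received thermal image (dummy tensor)

with torch.no_grad():
    optical_features = optical_model(optical_image)
    thermal_features = thermal_model(thermal_image)
    condition_probabilities = coupling_model(optical_features, thermal_features)

print(condition_probabilities)   # probability of each candidate condition of the field
```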
In one embodiment, the optical image features extracted in step S13 include one-dimensional results and two-dimensional results, wherein the one-dimensional results include a preliminary grading and a determination of whether the image is valid, and the two-dimensional results include a field of interest (ROI) and surrounding environment features.
In one embodiment, the thermal image features extracted in step S14 include one-dimensional results and two-dimensional results, wherein the one-dimensional results include a preliminary grading and a determination of whether the image is valid, and the two-dimensional results include a field of interest (ROI) and surrounding environment features.
In one embodiment, the step of combining the optical image features and the thermal image features in step S15 further includes: generating new features by using the field of interest as a convolution mask; adjusting feature weights according to the preliminary grading of the one-dimensional results; and repeating, arranging and combining the one-dimensional features to generate new features.
Please refer to fig. 5, which is a schematic diagram of a combined optical image and thermal image inspection and determination system and method according to the present invention. As shown in fig. 5, the first model 21 processes the optical image information to extract optical image features.
The second model 22 processes the thermal image information to extract thermal image features. For example, the area A sensed by the thermal energy receiving unit 12 is divided by software into central heat source areas a1-a8, attention areas a9-a16 and a remaining area a17. Preferably, in this embodiment, the length and width of at least one of the attention areas a9-a16 are a fixed multiple of those of at least one of the central heat source areas a1-a8, but the invention is not limited thereto. In this embodiment, the heat source block H1 sensed by the thermal energy receiving unit 12 within the central heat source areas a1-a6 can be determined to be a grade-two wound of the patient, while the heat source block H2 sensed by the thermal energy receiving unit 12 lies within the attention area a15.
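The partition of area A described above could be set up as in the sketch below. The 4x2 grid layout, the cell sizes and the value of the fixed multiple are assumptions made for illustration; the patent only requires that the attention areas be a fixed multiple of the central heat source areas in length and width.

```python
def build_regions(origin_x, origin_y, cell_w, cell_h, scale=2.0):
    """Illustrative sketch of the partition of area A: for each central heat
    source area (a1-a8) return a matching attention area (a9-a16) enlarged by
    the fixed multiple `scale` in both length and width, sharing the same
    centre. Everything not covered belongs to the remaining area a17."""
    central, attention = [], []
    for i in range(8):                       # a1-a8 laid out in an assumed 4x2 grid
        col, row = i % 4, i // 4
        x = origin_x + col * cell_w
        y = origin_y + row * cell_h
        central.append((x, y, cell_w, cell_h))
        dx = cell_w * (scale - 1) / 2        # grow symmetrically around the same centre
        dy = cell_h * (scale - 1) / 2
        attention.append((x - dx, y - dy, cell_w * scale, cell_h * scale))
    return central, attention
```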
The third model 23 then combines the optical image features extracted by the first model 21 with the thermal image features extracted by the second model 22 to obtain the coupling features, evaluates the likelihood of different conditions of the field, and derives the most probable inspection judgment result. Continuing the example above, in this embodiment the image block V1 located in the central heat source areas a1-a6 can be determined to be a grade-two wound of the patient, whereas the image block V2 located in the attention area a15 can be determined to be a birthmark of the patient rather than a wound.
The optical receiving unit and the thermal energy receiving unit capture an optical image and a thermal image of the same field. When the optical image alone cannot be judged because of texture, interference or similar conditions, the thermal image assists the optical image in locating the affected part, and the two images are combined to form the current inspection judgment, so that the state of the affected part can be judged accurately and objectively. Some examples: a human abrasion wound is inspected with a near-infrared image and a thermal image; the region of higher temperature is taken from the thermal image as the ROI, the water content within that region is judged from the near-infrared image, and the wound is finally judged to be an acute-phase wound with inflammation and suppuration. As another example, a long-haired cat is inspected with an optical image and a thermal image; the thermal image reveals a specific spot with less fur and a higher surface temperature whose optical view is blocked by the cat's fur, and that spot is flagged for inspection by personnel. As a further example, a human pressure-sore wound is inspected with an optical image and a thermal image; the ROI obtained from the thermal image is small and the optical image shows a few black areas, so the wound is judged to be a grade-three pressure sore. Clearly, the inspection and determination system provided by the invention can help medical staff pinpoint the most probable state of the affected part.
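The first of these examples can be read as a simple rule applied on top of the coupled features. The sketch below is purely illustrative: the threshold values and variable names are invented for demonstration and carry no clinical meaning.

```python
def judge_abrasion(roi_temperature_rise_c, nir_water_content):
    """Toy sketch of the near-infrared + thermal abrasion example: a hot ROI
    combined with high water content is read as an acute-phase wound with
    inflammation and suppuration. Thresholds are arbitrary illustrative values."""
    if roi_temperature_rise_c > 1.5 and nir_water_content > 0.6:
        return "acute-phase wound with inflammation and suppuration"
    if roi_temperature_rise_c > 1.5:
        return "inflamed area, low exudate"
    return "no acute finding in the ROI"
```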
The features and spirit of the present invention will become more apparent to those skilled in the art from the description of the preferred embodiments given above, which are given by way of illustration only, and not by way of limitation, of the principles and functions of the present invention. Thus, any modifications and variations may be made to the above-described embodiments without departing from the spirit of the invention, and the scope of the invention is to be determined by the appended claims.

Claims (10)

1. An inspection and determination system combining optical images and thermal images, the inspection and determination system comprising:
an optical receiving unit that receives optical image information of a field;
a thermal energy receiving unit that receives thermal image information of the field;
an operation unit that processes the optical image information according to a first model to extract optical image features, processes the thermal image information according to a second model to extract thermal image features, and combines the optical image features and the thermal image features according to a third model to obtain coupling features, so as to evaluate the likelihood of different conditions of the field and obtain the most probable inspection judgment result; and
a display unit that displays the inspection judgment result.
2. The inspection and determination system according to claim 1, further comprising a wireless transmission unit for transmitting the optical image information received by the optical receiving unit or the thermal image information received by the thermal energy receiving unit to the operation unit.
3. The inspection and determination system according to claim 1, wherein the first model, the second model and the third model are built into the operation unit.
4. The inspection and determination system according to claim 1, wherein the first model comprises a convolutional neural network, the extracted optical image features comprise one-dimensional results and two-dimensional results, the one-dimensional results comprise a preliminary grading and a determination of whether the image is valid, and the two-dimensional results comprise a field of interest and surrounding environment features.
5. The inspection and determination system according to claim 1, wherein the second model comprises a convolutional neural network, the extracted thermal image features comprise one-dimensional results and two-dimensional results, the one-dimensional results comprise a preliminary grading and a determination of whether the image is valid, and the two-dimensional results comprise a field of interest and surrounding environment features.
6. The inspection and determination system according to claim 4 or 5, wherein the third model comprises a convolutional neural network and a multilayer perceptron; the third model generates new features by using the field of interest as a convolution mask, adjusts feature weights according to the preliminary grading, and repeats, arranges and combines the one-dimensional features to generate new features; and the coupling performed by the third model includes shifting, scaling and adjusting the field of interest, followed by an integrated judgment of the feature data to generate the judgment result.
7. An inspection and determination method combining an optical image and a thermal image, comprising the following steps:
receiving optical image information of a field;
receiving thermal image information of the field;
processing the optical image information by using a convolutional neural network to extract optical image features;
processing the thermal image information by using a convolutional neural network to extract thermal image features; and
combining the optical image features and the thermal image features by using a convolutional neural network and a multilayer perceptron to obtain coupling features, so as to obtain the most probable inspection judgment result.
8. The method according to claim 7, wherein the extracted optical image features comprise one-dimensional results and two-dimensional results, the one-dimensional results comprise a preliminary grading and a determination of whether the image is valid, and the two-dimensional results comprise a field of interest and surrounding environment features.
9. The method according to claim 7, wherein the extracted thermal image features comprise one-dimensional results and two-dimensional results, the one-dimensional results comprise a preliminary grading and a determination of whether the image is valid, and the two-dimensional results comprise a field of interest and surrounding environment features.
10. The method according to claim 8 or 9, wherein the step of combining the optical image features and the thermal image features further comprises: generating new features by using the field of interest as a convolution mask; adjusting feature weights according to the preliminary grading of the one-dimensional results; and repeating, arranging and combining the one-dimensional features to generate new features.
CN202010104231.2A (filed 2020-02-20, priority 2020-02-20): Inspection and judgment system and method combining optical image and thermal image. Status: Pending. Publication: CN113362267A.

Priority Applications (1)

CN202010104231.2A: priority date 2020-02-20, filing date 2020-02-20, title "Inspection and judgment system and method combining optical image and thermal image"


Publications (1)

CN113362267A, published 2021-09-07

Family

ID=77523057




Patent Citations (5)

* Cited by examiner, † Cited by third party
CN104434031A * (priority 2013-09-17, published 2015-03-25, 汉唐集成股份有限公司): Thermal infrared image system and method for analyzing surface temperature influence factors of free flaps
TW201723928A * (priority 2015-12-17, published 2017-07-01, Nat Chung-Shan Inst Of Science And Tech): Thermal image region segmentation method utilizing temperature information in the thermal image plus contour information and region smoothness information of a visible image having the same image pickup range
CN108294728A * (priority 2017-01-12, published 2018-07-20, 财团法人工业技术研究院): Wound state analysis method and system
CN108470342A * (priority 2017-02-23, published 2018-08-31, 南宁市富久信息技术有限公司): Edge detection of infrared images
CN109974891A * (priority 2017-12-27, published 2019-07-05, 宇博先进电子工业有限公司): Method for indicating object temperature


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination