CN113367638B - Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal - Google Patents

Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal

Info

Publication number
CN113367638B
CN113367638B CN202110525987.9A
Authority
CN
China
Prior art keywords
dimensional
image
fluorescence image
depth
pixel unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110525987.9A
Other languages
Chinese (zh)
Other versions
CN113367638A (en)
Inventor
梁江荣
伍思樾
黄泽鑫
顾兆泰
任均宇
安昕
张浠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oupu Mandi Technology Co ltd
Original Assignee
Guangdong Optomedic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Optomedic Technology Co Ltd filed Critical Guangdong Optomedic Technology Co Ltd
Priority to CN202110525987.9A priority Critical patent/CN113367638B/en
Publication of CN113367638A publication Critical patent/CN113367638A/en
Application granted granted Critical
Publication of CN113367638B publication Critical patent/CN113367638B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/043 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances for fluorescence imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026 Measuring blood flow
    • A61B5/0261 Measuring blood flow using optical means, e.g. infrared light
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2057 Details of tracking cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10068 Endoscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Abstract

The invention discloses a method, a device, a storage medium and a terminal for acquiring a high-precision three-dimensional fluorescence image. A two-dimensional fluorescence image and a three-dimensional depth image are acquired simultaneously; the depth information of each pixel unit of the three-dimensional depth image is copied to the corresponding pixel unit of the two-dimensional fluorescence image to form a three-dimensional fluorescence image; a fluorescence intensity compensation coefficient for each pixel unit of the three-dimensional fluorescence image is calculated from the depth information of the three-dimensional depth image; and the three-dimensional fluorescence image is compensated according to the compensation coefficients to obtain a high-precision three-dimensional fluorescence image. In particular, when a fluorescence navigation system acquires a non-planar three-dimensional fluorescence image, this solves the problem that significant depth differences between different regions of the non-planar surface markedly affect the measurement accuracy of tissue fluorescence intensity.

Description

Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal
Technical Field
The invention relates to computer software/image processing, in particular to a method, a device, a storage medium and a terminal for acquiring a high-precision three-dimensional fluorescence image.
Background
In recent years, fluorescence navigation systems based on indocyanine green have been widely used in surgical operations. In particular, in orthopedic surgery, a near-infrared fluorescence navigation endoscope system with a quantitative analysis function can quantitatively evaluate tissue blood supply during the operation: if the fluorescence intensity reaches a certain standard value, the tissue blood supply is normal; if it is lower than the standard value, the tissue blood supply is poor and the risk of tissue necrosis is high. The doctor can then treat the tissue wound again in time according to this evaluation, effectively reducing the risk of postoperative problems for the patient.
At present, fluorescence navigation systems are mainly based on traditional two-dimensional image sensors. The visible-light and fluorescence images acquired by such a system capture only the brightness and color information of the tissue in the x-y plane, and in clinical application it is difficult to keep the acquisition distance consistent, which affects the fluorescence intensity on the image to a certain extent, so the fluorescence intensity cannot be acquired accurately.
In addition, some navigation systems incorporate a point laser sensor to obtain depth information for estimating the approximate distance from the navigation system to the tissue. However, a point laser sensor covers only a single spot on the tissue and cannot cover the whole surface. In orthopedic applications in particular, it is often necessary to image non-planar, highly curved tissue, such as skin flaps on the feet, legs, back and breast; the regions of a flap may differ significantly in depth, which significantly affects the measurement accuracy of the tissue fluorescence intensity.
Therefore, the prior art still needs to be improved and developed.
Disclosure of Invention
The invention aims to provide a method, a device, a storage medium and a terminal for acquiring a high-precision three-dimensional fluorescence image, and aims to solve one or more problems in the prior art.
The technical scheme of the invention is as follows: a method for acquiring a high-precision three-dimensional fluorescence image, which specifically comprises the following steps:
acquiring a two-dimensional fluorescence image and a three-dimensional depth image, wherein pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one;
obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image, and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image;
fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image with the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image;
obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information;
performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image;
and outputting the compensated three-dimensional fluorescence image.
In this technical scheme, the two-dimensional fluorescence image and the three-dimensional depth image are acquired simultaneously; the depth information of each pixel unit of the three-dimensional depth image is copied to the corresponding pixel unit of the two-dimensional fluorescence image to form a three-dimensional fluorescence image; the fluorescence intensity compensation coefficient of each pixel unit of the three-dimensional fluorescence image is calculated from the depth information of the three-dimensional depth image; and the three-dimensional fluorescence image is compensated according to the compensation coefficients to obtain a high-precision three-dimensional fluorescence image. In particular, when a fluorescence navigation system acquires a non-planar three-dimensional fluorescence image, this solves the problem that significant depth differences between different regions of the image markedly affect the measurement accuracy of tissue fluorescence intensity.
Further, the two-dimensional fluorescence image is acquired by a fluorescence camera of a two-dimensional fluorescence navigation system; the three-dimensional depth image is acquired by a depth camera.
Further, the fluorescence camera and depth camera have matched field of view adjustments within the system prior to use.
Further, the fluorescence information refers to RGB information of each pixel unit in the two-dimensional fluorescence image; the depth information refers to depth information of each pixel unit in the three-dimensional depth image.
Further, fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image and the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image, specifically comprising the following processes: traversing the three-dimensional depth image and the two-dimensional fluorescence image, and copying the depth information of each pixel unit on the three-dimensional depth image to the RGB information of the corresponding pixel unit on the two-dimensional fluorescence image one by one to obtain point cloud data of an RGBD structure, namely forming the three-dimensional fluorescence image.
Further, the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image is derived from the depth information either through a relational expression linking depth to compensation weight, or by a table lookup.
Further, performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image, wherein the specific process is as follows: and performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight, and correspondingly updating RGB information of point cloud data of the three-dimensional fluorescence image to obtain a compensated three-dimensional fluorescence image.
The technical scheme also provides a device for acquiring a high-precision three-dimensional fluorescence image, which specifically comprises the following modules:
the image acquisition module is used for acquiring a two-dimensional fluorescence image and a three-dimensional depth image, and pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one;
the information acquisition module is used for obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image;
the information fusion module is used for fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image and the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image;
the compensation weight acquisition module is used for obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information;
the compensation processing module is used for performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image;
and the image output module outputs the compensated three-dimensional fluorescence image.
The present invention also provides a storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute any one of the methods described above.
The present technical solution further provides a terminal, including a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the method according to any one of the foregoing methods by calling the computer program stored in the memory.
According to the technical scheme, the two-dimensional fluorescence image and the three-dimensional depth image are acquired simultaneously; the depth information of each pixel unit of the three-dimensional depth image is copied to the corresponding pixel unit of the two-dimensional fluorescence image to form a three-dimensional fluorescence image; the fluorescence intensity compensation coefficient of each pixel unit is calculated from the depth information; and the three-dimensional fluorescence image is compensated according to the compensation coefficients to obtain a high-precision three-dimensional fluorescence image. In particular, when a fluorescence navigation system acquires a non-planar three-dimensional fluorescence image, this solves the problem that significant depth differences between regions of the non-planar surface markedly affect the measurement accuracy of tissue fluorescence intensity. With this technical scheme, the depth information of every position of the whole measured object can be acquired completely, the accuracy of the fluorescence intensity is further improved according to the acquired depth information, and the three-dimensional fluorescence image is output synchronously, so that the three-dimensional form of the target tissue can be presented in a more intuitive way and the precision of the output three-dimensional fluorescence image is improved.
Drawings
FIG. 1 is a flow chart of the steps of the method of obtaining a high-precision three-dimensional fluorescence image according to the present invention.
Fig. 2 is a schematic diagram of an apparatus for acquiring a high-precision three-dimensional fluorescence image according to the present invention.
FIG. 3 is a schematic diagram of the three-dimensional fluorescence navigation system of the present invention.
Fig. 4 is a schematic diagram of a terminal in the present invention.
FIG. 5 is a schematic view of the light source emitting light to illuminate an object plane in the present invention.
FIG. 6 is a schematic diagram of the image areas formed on the sensor when the same object plane is imaged from points at different distances in the present invention.
Fig. 7 is a schematic diagram of the object plane emitting light to the image sensor according to the present invention.
FIG. 8 is a schematic diagram of a fluorescence navigation endoscope system according to the present invention, in which a light source irradiates an object plane and the object plane is imaged on an image sensor.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
As shown in fig. 1, a method for acquiring a high-precision three-dimensional fluorescence image is suitable for acquiring a high-precision three-dimensional fluorescence image, and is particularly suitable for acquiring a non-planar high-precision three-dimensional fluorescence image, and specifically includes the following steps:
s1: and acquiring a two-dimensional fluorescence image and a three-dimensional depth image, wherein pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one.
Wherein the two-dimensional fluorescence image is acquired by a fluorescence camera of a two-dimensional fluorescence navigation system.
Wherein the three-dimensional depth image is obtained by a depth camera (compared with a traditional camera, a depth camera adds depth measurement to its functions, so the surroundings and their changes can be perceived more conveniently and accurately). In this embodiment, the depth camera may adopt three-dimensional imaging modes such as binocular vision, a TOF (time-of-flight) depth camera, laser spot projection, or structured light, according to actual needs; the goal in each case is to obtain three-dimensional imaging information of the photographed object. By introducing the depth camera module, depth information is provided for the fluorescence image, which effectively solves the problem that inconsistent measurement distances affect measurement accuracy.
Before use, the fluorescence camera and the depth camera have their fields of view matched inside the system: if the light paths to the fluorescence camera and the depth camera are made coaxial based on the optical beam-splitting principle, a consistent field of view is obtained directly; otherwise, processing can be performed with image algorithms such as pattern matching and affine transformation so that the fields of view coincide (a rough sketch of this path is given below). Since the depth camera and the fluorescence camera share the same field of view and the fluorescence information of each position is compensated, the fluorescence intensity accuracy for non-planar tissue is significantly improved.
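The patent does not give an implementation of the image-algorithm path. As an illustration only, the following Python sketch (using OpenCV, with a hypothetical 2x3 affine matrix assumed to come from an offline calibration, e.g. pattern matching on a shared target) shows how a depth map could be warped into the fluorescence camera's pixel grid:

```python
# Illustrative sketch only: warp the depth map into the fluorescence camera's
# field of view using a hypothetical affine matrix obtained from calibration.
import cv2
import numpy as np

def align_depth_to_fluorescence(depth_map, affine_2x3, out_size):
    """Warp depth_map (H, W) into the fluorescence camera's pixel grid.
    out_size is (width, height). Nearest-neighbour interpolation keeps
    depth values unmixed across object boundaries."""
    return cv2.warpAffine(depth_map, affine_2x3, out_size,
                          flags=cv2.INTER_NEAREST)

# Placeholder calibration result (hypothetical values).
A = np.array([[1.0, 0.0, 4.0],
              [0.0, 1.0, -2.0]], dtype=np.float32)
aligned = align_depth_to_fluorescence(np.zeros((480, 640), np.float32),
                                      A, (640, 480))
```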
S2: Obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image, and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image.
Wherein the fluorescence information refers to the RGB information of each pixel unit in the two-dimensional fluorescence image (RGB denoting the red, green and blue color channels); the depth information refers to the depth of each pixel unit in the three-dimensional depth image, i.e. the distance of each pixel unit relative to a reference plane, which may be set as required, for example the plane through the lowest point in the three-dimensional depth image, or the plane through the highest point.
The acquired two-dimensional fluorescence image and the three-dimensional depth image are synchronously transmitted to a processing unit in the multi-video signal acquisition and processing system to be processed, and fluorescence information of each pixel unit in the two-dimensional fluorescence image and depth information of each pixel unit in the three-dimensional depth image are obtained.
S3: Fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image with the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image.
The processing unit in the multi-video signal acquisition and processing system traverses the three-dimensional Depth image and the two-dimensional fluorescence image, and copies the Depth information Depth of each pixel unit (x, y) on the three-dimensional Depth image to the color information RGB of the corresponding pixel unit (x, y) on the two-dimensional fluorescence image one by one to obtain point cloud data of an RGBD structure, namely the three-dimensional fluorescence image is formed. Where the x-y plane is still the RGB information of the two-dimensional fluorescence image and the z-direction is the Depth information of the three-dimensional Depth image.
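As a minimal sketch of this fusion step (assuming the two images are already field-of-view matched; the array shapes and names are illustrative, not taken from the patent), the per-pixel copy can be written as:

```python
# Minimal sketch of step S3: traverse the aligned images and copy each depth
# value next to the RGB values of the corresponding pixel unit, producing
# RGBD point cloud data.
import numpy as np

def fuse_rgbd(fluorescence_rgb, depth):
    """fluorescence_rgb: (H, W, 3); depth: (H, W).
    Returns (H*W, 6) points of [x, y, R, G, B, D]."""
    h, w, _ = fluorescence_rgb.shape
    ys, xs = np.mgrid[0:h, 0:w]                 # pixel coordinates
    return np.column_stack([xs.ravel(), ys.ravel(),
                            fluorescence_rgb.reshape(-1, 3),
                            depth.ravel()]).astype(np.float32)

cloud = fuse_rgbd(np.zeros((480, 640, 3), np.uint8),
                  np.ones((480, 640), np.float32))
```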
S4: Obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information.
According to the "fluorescence intensity compensation versus distance" relation

α = F(d)

where d is the depth (i.e. distance information) of each pixel unit on the three-dimensional depth image and F is a functional relation, the fluorescence intensity compensation coefficient α corresponding to each different depth d can be obtained. Note that the functional relation can be obtained from the results of multiple independent experiments on different fluorescence navigation systems, and may be a linear relation, a nonlinear relation, or a lookup table (a rough fitting sketch follows).
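As an illustration of both options, the sketch below fits the ideal-system form α = 1 + θ·d² (introduced just below) to some calibration data and also shows a lookup-table alternative. The sample (d, α) pairs are invented for illustration, not measured values:

```python
# Hedged sketch: recovering the compensation relation from experiments.
import numpy as np
from scipy.optimize import curve_fit

d_cm  = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical calibration distances
alpha = np.array([1.0, 1.5, 3.0, 5.5])   # hypothetical measured coefficients

def model(d, theta):
    # Ideal-system form alpha = 1 + theta * d^2 (see the relation below).
    return 1.0 + theta * d ** 2

(theta_hat,), _ = curve_fit(model, d_cm, alpha)
print(theta_hat)                           # ~0.5 for this sample data

# Lookup-table alternative: piecewise-linear interpolation between
# calibrated points instead of a closed-form relation.
alpha_lut = lambda d: np.interp(d, d_cm, alpha)
```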
For example, the relation for an ideal three-dimensional fluorescence navigation system (obtained by fitting experimental results; in such a system the aperture, shutter and gain of the camera are fixed and automatic exposure is disabled) is:

α = 1 + θ·d²

The derivation of this formula from experiment is summarized as follows:
1. When the light from the source reaches the object, the relation between energy and distance is as follows:

As shown in fig. 5, the light source illuminates the object plane S from position 1 and position 2 respectively. The distance between position 1 and the object plane S is d1, and the divergence angle of the outgoing light is φ1; the distance between position 2 and the object plane S is d2, and the divergence angle is φ2. Assuming the total energy emitted by the light source is Q, the energy reaching the object plane from position 1 is Q1 and from position 2 is Q2, so that

Q1/Q ∝ φ1,  Q2/Q ∝ φ2

where φ is taken as the solid angle subtended by the object plane at the source: the shorter the distance, the larger φ and the more energy reaches the object plane. Calculated from the solid angle, φ is inversely proportional to the square of the distance d, so:

Q1/Q2 = d2²/d1²
2. When imaging onto an image sensor, the relation between energy and distance is as follows:

Unlike point 1, consider an object that emits light itself and is imaged onto the image sensor through an optical lens. Let the emitted energy again be Q, the total energy reaching the image sensor from position 1 be Q1', and from position 2 be Q2'. Because the imaging size changes with distance, the energy of each pixel on the target surface of the image sensor also changes: the more widely the image is spread, the more the energy is divided, so although the collection angle at position 1 is larger, the energy is shared among more pixels and the energy of a single pixel decreases. As shown in fig. 6, assume the object is imaged over n×n pixels at position 1 and over m×m pixels at position 2, and, as shown in fig. 7, let the energy per pixel on the image sensor be Q1'' and Q2'' respectively. Then:

Q1'' = Q1'/n²,  Q2'' = Q2'/m²

The collected energy falls off with the square of the distance (Q1'/Q2' = d2²/d1²), but so does the imaged area (n²/m² = d2²/d1²); therefore, on the camera system, the ratio of the per-pixel energies of light emitted at different distances is:

Q1''/Q2'' = 1

That is, for an object that actively radiates light, the per-pixel energy captured by the camera system is not lost with distance.
3. The energy loss of the fluorescence endoscope system is as follows:

As can be seen from fig. 8, this situation differs from the "active radiative emission" case: the object plane S passively receives the light emitted from the optical fiber, and the light leaving the object plane S then returns to the camera system, so the emission-angle loss occurs twice: 1) the light emitted from the optical fiber diffuses to the object plane S with divergence angle φ; 2) the reflected light of the object plane S returns to the camera system, whose diaphragm limits the collection angle φ' of the camera, which behaves like a divergence angle. The angular loss φ' is offset by the imaging system, as demonstrated in point 2 above, but the divergence-angle loss φ of the fiber light is not cancelled. Therefore, in this fluorescence endoscope system, assuming the energy of each pixel at position 1 and position 2 is Q1''' and Q2''' respectively, combining points 1 and 2 gives:

Q1'''/Q2''' = d2²/d1²

From the above it can be confirmed that, in the fluorescence endoscope system, the per-pixel energy is in an inverse-square relationship with distance, so the energy compensation coefficient is in a square relationship with distance (the energy lost according to the inverse-square law is restored by a coefficient that grows with the square of the distance). Assuming α is the compensation coefficient, d is the distance from the light emitted from the optical fiber to the object plane, and θ is a constant coefficient, then:

α = 1 + θ·d²
For example, take θ = 0.5. (θ is a weight coefficient related to the attenuation speed of the three-dimensional fluorescence navigation system, which in turn depends on the quantum efficiency of the image sensor chip of the camera system in the specific near-infrared waveband: the faster the attenuation, the larger θ; the slower the attenuation, the smaller θ.) With d in centimeters: when d = 0, α = 1; when d = 1, α = 1.5; when d = 2, α = 3.
The above formula for the compensation coefficient α is suitable for compensating the fluorescence intensity value of each pixel unit in a three-dimensional fluorescence image obtained by an ideal three-dimensional fluorescence navigation system; for a general three-dimensional fluorescence navigation system, the influence of the gain and exposure time of the imaging system also needs to be considered.
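A minimal sketch of this step under the ideal-system relation, applied to a whole depth map (the value of θ, the centimeter units and the array shapes are assumptions carried over from the worked example above):

```python
# Minimal sketch of step S4: per-pixel compensation coefficients from depth.
import numpy as np

def compensation_weights(depth_cm, theta=0.5):
    """Per-pixel compensation coefficients alpha = 1 + theta * d^2."""
    return 1.0 + theta * depth_cm ** 2

# Spot check against the worked example: d = 0, 1, 2 cm -> 1.0, 1.5, 3.0
print(compensation_weights(np.array([0.0, 1.0, 2.0])))
```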
S5: Performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image.
For the three-dimensional fluorescence image, the compensation coefficient of the fluorescence intensity value of each pixel is obtained from that pixel's depth information. The compensated three-dimensional fluorescence image is

IMAGE' = IMAGE · α

where IMAGE' is the compensated three-dimensional fluorescence image, IMAGE is the three-dimensional fluorescence image before compensation, and α is the set of compensation coefficients, one for each pixel at the corresponding image resolution, with the product taken pixel by pixel. Meanwhile, the RGB information of the point cloud data of the three-dimensional fluorescence image is updated according to IMAGE' to obtain the new three-dimensional form.
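A minimal sketch of the compensation step, assuming 8-bit RGB data and per-pixel coefficients computed as above (the clipping to the valid 8-bit range is an added assumption, not stated in the patent):

```python
# Minimal sketch of step S5: IMAGE' = IMAGE * alpha applied pixel by pixel,
# then written back into the RGB channels of the point cloud.
import numpy as np

def compensate(image_rgb, alpha):
    """image_rgb: (H, W, 3) uint8; alpha: (H, W) per-pixel coefficients."""
    out = image_rgb.astype(np.float32) * alpha[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)
```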
S6: Outputting the compensated three-dimensional fluorescence image.
The compensated three-dimensional fluorescence image may, for example, be displayed on a display, or output for other quantitative measurement processing.
As shown in fig. 2, a device for acquiring a high-precision three-dimensional fluorescence image specifically comprises the following modules:
the image acquisition module 101 is used for acquiring a two-dimensional fluorescence image and a three-dimensional depth image, wherein pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one by one;
the information acquisition module 102 is used for obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image;
the information fusion module 103 is used for fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image and the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image;
a compensation weight obtaining module 104, configured to obtain a compensation weight of a fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information;
the compensation processing module 105 is used for performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image;
and an image output module 106 for outputting the compensated three-dimensional fluorescence image.
In this technical scheme, as shown in fig. 3, the device for acquiring a high-precision three-dimensional fluorescence image can be implemented with a three-dimensional fluorescence navigation system comprising a two-dimensional fluorescence navigation system 3 (which includes a fluorescence camera 2), a depth camera 1 and a multi-video signal acquisition and processing system 4; the two-dimensional fluorescence navigation system 3 and the depth camera 1 are both built into, and belong to, the multi-video signal acquisition and processing system 4.
When the two-dimensional fluorescence navigation system 3 acquires an image signal, the depth camera 1 synchronously acquires an image signal, for example of the foot 6. In particular, the depth camera 1 and the fluorescence camera 2 of the two-dimensional fluorescence navigation system 3 have had their fields of view matched inside the system. The two image signals are synchronously acquired by the multi-video signal acquisition and processing system 4 and transmitted to its processing unit for signal processing, so that the depth image information can be fused into the fluorescence image to form a three-dimensional fluorescence image. Meanwhile, the processing unit in the multi-video signal acquisition and processing system 4 calculates the compensation weight of each position in the three-dimensional fluorescence image and completes the fluorescence intensity compensation of the three-dimensional fluorescence image. Finally, the system outputs the compensated three-dimensional fluorescence image with three-dimensional information in real time, and displays it on the display 5 or performs other quantitative measurement processing.
Referring to fig. 4, an embodiment of the present invention further provides a terminal. As shown, the terminal 300 includes a processor 301 and a memory 302. The processor 301 is electrically connected to the memory 302. The processor 301 is a control center of the terminal 300, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by running or calling a computer program stored in the memory 302 and calling data stored in the memory 302, thereby performing overall monitoring of the terminal 300.
In this embodiment, the processor 301 in the terminal 300 loads instructions corresponding to the processes of one or more computer programs into the memory 302, and runs the computer programs stored in the memory 302, thereby implementing the following functions: acquiring a two-dimensional fluorescence image and a three-dimensional depth image, wherein pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one; obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image, and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image; fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image with the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image; obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information; performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image; and outputting the compensated three-dimensional fluorescence image.
Memory 302 may be used to store computer programs and data. The memory 302 stores computer programs containing instructions executable in the processor. The computer program may constitute various functional modules. The processor 301 executes various functional applications and data processing by calling a computer program stored in the memory 302.
An embodiment of the present application provides a storage medium storing a computer program which, when executed by a processor, performs the method in any optional implementation manner of the foregoing embodiment to implement the following functions: acquiring a two-dimensional fluorescence image and a three-dimensional depth image, wherein pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one; obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image, and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image; fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image with the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image; obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information; performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image; and outputting the compensated three-dimensional fluorescence image. The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be electrical, mechanical or in another form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. A method for acquiring a high-precision three-dimensional fluorescence image is characterized by comprising the following steps:
acquiring a two-dimensional fluorescence image and a three-dimensional depth image, wherein pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one;
obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image, and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image;
fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image with the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image;
obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information;
the compensation weight being obtained specifically by the relation:

α = 1 + θ·d²

wherein α is the compensation coefficient, namely the compensation weight; d is the distance from the light emitted by the optical fiber to the object plane; and θ is a constant coefficient;
performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image;
and outputting the compensated three-dimensional fluorescence image.
2. The method for acquiring a high-precision three-dimensional fluorescence image according to claim 1, wherein the two-dimensional fluorescence image is acquired by a fluorescence camera of a two-dimensional fluorescence navigation system; and the three-dimensional depth image is acquired by a depth camera.
3. The method for acquiring a high-precision three-dimensional fluorescence image according to claim 2, wherein the fluorescence camera and the depth camera have their fields of view matched inside the system before use.
4. The method of claim 1, wherein the fluorescence information is RGB information of each pixel unit in the two-dimensional fluorescence image; the depth information refers to depth information of each pixel unit in the three-dimensional depth image.
5. The method for acquiring a high-precision three-dimensional fluorescence image according to claim 1, wherein the fluorescence information of each pixel unit in the two-dimensional fluorescence image and the depth information of each pixel unit in the three-dimensional depth image are fused to form a three-dimensional fluorescence image, and the method specifically comprises the following steps: traversing the three-dimensional depth image and the two-dimensional fluorescence image, and copying the depth information of each pixel unit on the three-dimensional depth image to the RGB information of the corresponding pixel unit on the two-dimensional fluorescence image one by one to obtain point cloud data of an RGBD structure, namely forming the three-dimensional fluorescence image.
6. The method according to claim 5, wherein the compensation processing of the fluorescence intensity value is performed on each pixel unit in the three-dimensional fluorescence image according to the compensation weight, so as to obtain a compensated three-dimensional fluorescence image, and the specific process is as follows: and performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight, and correspondingly updating RGB information of point cloud data of the three-dimensional fluorescence image to obtain a compensated three-dimensional fluorescence image.
7. A device for acquiring a high-precision three-dimensional fluorescence image, characterized by comprising the following modules:
the image acquisition module is used for acquiring a two-dimensional fluorescence image and a three-dimensional depth image, and pixel units of the two-dimensional fluorescence image correspond to pixel units of the three-dimensional depth image one to one;
the information acquisition module is used for obtaining the fluorescence information of each pixel unit in the two-dimensional fluorescence image according to the two-dimensional fluorescence image and obtaining the depth information of each pixel unit in the three-dimensional depth image according to the three-dimensional depth image;
the information fusion module is used for fusing the fluorescence information of each pixel unit in the two-dimensional fluorescence image and the depth information of each pixel unit in the three-dimensional depth image to form a three-dimensional fluorescence image;
the compensation weight acquisition module is used for obtaining the compensation weight of the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the depth information;
the compensation weight being obtained specifically by the relation:

α = 1 + θ·d²

wherein α is the compensation coefficient, namely the compensation weight; d is the distance from the light emitted by the optical fiber to the object plane; and θ is a constant coefficient;
the compensation processing module is used for performing compensation processing on the fluorescence intensity value of each pixel unit in the three-dimensional fluorescence image according to the compensation weight to obtain a compensated three-dimensional fluorescence image;
and the image output module outputs the compensated three-dimensional fluorescence image.
8. A storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 6.
9. A terminal, characterized in that it comprises a processor and a memory, in which a computer program is stored, the processor being adapted to carry out the method of any one of claims 1 to 6 by calling the computer program stored in the memory.
CN202110525987.9A 2021-05-14 2021-05-14 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal Active CN113367638B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110525987.9A CN113367638B (en) 2021-05-14 2021-05-14 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110525987.9A CN113367638B (en) 2021-05-14 2021-05-14 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN113367638A CN113367638A (en) 2021-09-10
CN113367638B true CN113367638B (en) 2023-01-03

Family

ID=77571083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110525987.9A Active CN113367638B (en) 2021-05-14 2021-05-14 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN113367638B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114052913B (en) * 2022-01-17 2022-05-17 广东欧谱曼迪科技有限公司 AR (augmented reality) fluorescence surgery navigation system and method
CN114266817B (en) * 2022-03-02 2022-06-07 广东欧谱曼迪科技有限公司 Fluorescent depth image synthesis method and device, electronic equipment and storage medium
CN115393348B (en) * 2022-10-25 2023-03-24 绵阳富临医院有限公司 Burn detection method and system based on image recognition and storage medium
CN117281451A (en) * 2023-11-14 2023-12-26 杭州显微智能科技有限公司 3D endoscope fluorescence imaging system and imaging method thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108420405A (en) * 2018-03-20 2018-08-21 苏州大学 Fluorescent molecule tomography rebuilding method based on depth-volume hybrid compensation strategy
KR20180131092A (en) * 2017-05-31 2018-12-10 부산대학교 산학협력단 Apparatus for acquiring three-dimensional fluorescence image for intraoperative fluorescence imaging system and its method
CN109758094A (en) * 2019-01-31 2019-05-17 广东欧谱曼迪科技有限公司 A kind of focusing feedback-type fluorescence navigation endoscopic system and image are from processing method
CN110619617A (en) * 2019-09-27 2019-12-27 中国科学院长春光学精密机械与物理研究所 Three-dimensional imaging method, device, equipment and computer readable storage medium
WO2020148721A1 (en) * 2019-01-17 2020-07-23 University Health Network Systems, methods, and devices for three-dimensional imaging, measurement, and display of wounds and tissue specimens
CN111685711A (en) * 2020-05-25 2020-09-22 中国科学院苏州生物医学工程技术研究所 Medical endoscope three-dimensional imaging system based on 3D camera
CN111803013A (en) * 2020-07-21 2020-10-23 深圳市博盛医疗科技有限公司 Endoscope imaging method and endoscope imaging system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180131092A (en) * 2017-05-31 2018-12-10 부산대학교 산학협력단 Apparatus for acquiring three-dimensional fluorescence image for intraoperative fluorescence imaging system and its method
CN108420405A (en) * 2018-03-20 2018-08-21 苏州大学 Fluorescent molecule tomography rebuilding method based on depth-volume hybrid compensation strategy
WO2020148721A1 (en) * 2019-01-17 2020-07-23 University Health Network Systems, methods, and devices for three-dimensional imaging, measurement, and display of wounds and tissue specimens
CN109758094A (en) * 2019-01-31 2019-05-17 广东欧谱曼迪科技有限公司 A kind of focusing feedback-type fluorescence navigation endoscopic system and image are from processing method
CN110619617A (en) * 2019-09-27 2019-12-27 中国科学院长春光学精密机械与物理研究所 Three-dimensional imaging method, device, equipment and computer readable storage medium
CN111685711A (en) * 2020-05-25 2020-09-22 中国科学院苏州生物医学工程技术研究所 Medical endoscope three-dimensional imaging system based on 3D camera
CN111803013A (en) * 2020-07-21 2020-10-23 深圳市博盛医疗科技有限公司 Endoscope imaging method and endoscope imaging system

Also Published As

Publication number Publication date
CN113367638A (en) 2021-09-10

Similar Documents

Publication Publication Date Title
CN113367638B (en) Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal
CA2931529C (en) 3d corrected imaging
US6587183B1 (en) Range finder and camera
CN110192390A (en) The light-field capture of head-mounted display and rendering
JP6438216B2 (en) Image generating apparatus and image generating method
WO2015023990A1 (en) Method and apparatus for quantitative and depth resolved hyperspectral fluorescence and reflectance imaging for surgical guidance
LU500127B1 (en) Enhanced augmented reality headset for medical imaging
CN104380066A (en) System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light
KR102009558B1 (en) Goggle system for image guide surgery
US20130023732A1 (en) Endoscope and endoscope system
CN106560163A (en) Surgical navigation system and registration method of surgical navigation system
JP2001137173A (en) Fluorescent image measurement method and equipment
CN113208567A (en) Multispectral imaging system, imaging method and storage medium
JP2019141578A (en) Image processing method and apparatus using elastic mapping of vascular plexus structures
CN111508068A (en) Three-dimensional reconstruction method and system applied to binocular endoscope image
CN113436129B (en) Image fusion system, method, device, equipment and storage medium
CN109068035A (en) A kind of micro- camera array endoscopic imaging system of intelligence
WO2021029277A1 (en) Endoscope system and method for operating same
CN114882096B (en) Method and device for measuring distance under fluorescent endoscope, electronic equipment and storage medium
CN114882093B (en) Intraoperative three-dimensional point cloud generation method and device and intraoperative local structure navigation method
JP2020130634A (en) Endoscope apparatus
CN112294453B (en) Microsurgery surgical field three-dimensional reconstruction system and method
CN105213032B (en) Location of operation system
CN115316919B (en) Dual-camera 3D optical fluorescence endoscope imaging system, method and electronic equipment
WO2022272002A1 (en) Systems and methods for time of flight imaging

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: 528253 Room 503, Floor 5, Building A, Jingu Zhichuang Industrial Community, No. 2, Yong'an North Road, Dawu Community, Guicheng Street, Nanhai District, Foshan City, Guangdong Province (residence declaration)

Patentee after: Guangdong Oupu Mandi Technology Co.,Ltd.

Address before: Room B, room 504-2, floor 5, block a, Jingu photoelectric community, No. 1, Yongan North Road, Pingzhou, Guicheng Street, Nanhai District, Foshan City, Guangdong Province, 528251

Patentee before: GUANGDONG OPTOMEDIC TECHNOLOGY CO.,LTD.