CN111727013A - Imaging method and imaging system - Google Patents

Imaging method and imaging system

Info

Publication number
CN111727013A
Authority
CN
China
Prior art keywords
target tissue
photoacoustic
volume data
image
ultrasound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201880055971.2A
Other languages
Chinese (zh)
Other versions
CN111727013B (en)
Inventor
姜玉新
李建初
杨萌
杨芳
朱磊
苏娜
王铭
唐鹤文
张睿
唐天虹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd, Peking Union Medical College Hospital Chinese Academy of Medical Sciences filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN111727013A
Application granted
Publication of CN111727013B
Legal status: Active


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An imaging method and an imaging system. The imaging method comprises the following steps: emitting laser light toward a target body and receiving a photoacoustic signal returned from the target body (301); transmitting ultrasonic waves to the target body and receiving the ultrasonic echoes returned from the target body to obtain ultrasonic echo signals (302); obtaining photoacoustic volume data from the photoacoustic signal and ultrasound volume data from the ultrasonic echo signals (303); determining a boundary of a target tissue in the ultrasound volume data (304); rendering the ultrasound volume data according to the boundary of the target tissue to obtain an ultrasound volume image of the target tissue (305); rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue (306); and fusing the ultrasound volume image with the photoacoustic volume image to obtain a fused image of the target tissue (307). The imaging method and system make the displayed image more intuitive.

Description

Imaging method and imaging system

Technical Field
The present application relates to the field of medical devices, and in particular, to an imaging method and an imaging system.
Background
Photoacoustic imaging (PAI) is an emerging biomedical imaging technology based on the photoacoustic effect: when biological tissue is irradiated with short laser pulses, for example on the order of nanoseconds (ns), substances with strong optical absorption, such as blood, absorb the light energy, heat up locally and expand thermally, thereby generating a photoacoustic signal that propagates outward. This signal can be detected by an ultrasound probe, and a suitable reconstruction algorithm can recover the position and shape of the absorber, that is, of the strongly absorbing substance. Photoacoustic imaging combines the advantages of optics and ultrasound; it offers unique benefits for early diagnosis and prognosis evaluation of some serious diseases, and is a novel imaging technology with great clinical and industrial prospects. Because light penetrates biological tissue only to a limited depth, photoacoustic imaging is mainly applied to superficial organs. Photoacoustic imaging reflects functional information of the body, while conventional ultrasound imaging reflects its structural information; combining the two effectively, that is, photoacoustic-ultrasound dual-mode imaging, overcomes the shortcomings of single-mode imaging and can provide more comprehensive structural and functional information about the tissue.
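For reference, the standard relation from the photoacoustic literature (background knowledge, not part of the patent text) expresses the initial acoustic pressure generated by a short pulse as

$$ p_0(\mathbf{r}) = \Gamma(\mathbf{r})\,\mu_a(\mathbf{r})\,F(\mathbf{r}), $$

where $p_0$ is the initial pressure at position $\mathbf{r}$, $\Gamma$ the dimensionless Grüneisen parameter describing thermoelastic conversion efficiency, $\mu_a$ the optical absorption coefficient, and $F$ the local light fluence. Strong absorbers such as blood have a large $\mu_a$ and therefore yield a strong photoacoustic signal.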
Following the conventional three-dimensional ultrasound scanning approach, three-dimensional photoacoustic-ultrasound imaging uses a mechanical device to drive a combined photoacoustic-ultrasound probe along a given direction, so that three-dimensional (3D) data can be acquired; the 3D data are then rendered and displayed, allowing the operator to observe tissue structure and function in three dimensions.
In the existing scheme, however, when a photoacoustic image and an ultrasound image are obtained by three-dimensional photoacoustic-ultrasound imaging, the target tissue, that is, the biological tissue of interest, is displayed by adjusting the viewing angle and the transparency. Since the ultrasound image is only a grayscale image, it cannot display the target tissue intuitively, which makes the target tissue inconvenient to observe. For example, the gray values of a lesion and of normal tissue may be very close, in which case an image obtained by three-dimensional photoacoustic-ultrasound imaging neither shows the lesion clearly nor displays it effectively.
Disclosure of Invention
The present application provides an imaging method and an imaging system that make the displayed image more intuitive.
A first aspect of embodiments of the present application provides an imaging method, including: emitting laser light toward a target body and receiving a photoacoustic signal returned from the target body; transmitting ultrasonic waves to the target body and receiving the ultrasonic echoes returned from the target body to obtain ultrasonic echo signals; obtaining photoacoustic volume data from the photoacoustic signal and ultrasound volume data from the ultrasonic echo signals; determining a boundary of a target tissue in the ultrasound volume data; rendering the ultrasound volume data according to the boundary of the target tissue to obtain an ultrasound volume image of the target tissue; rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue; and fusing the ultrasound volume image with the photoacoustic volume image to obtain a fused image of the target tissue.
A second aspect of embodiments of the present application provides an imaging system, comprising a laser, a probe, a transmitting circuit, a receiving circuit and a processor. The laser is used for generating laser light for irradiating a target body; it is coupled to the probe through a fiber bundle and emits the laser light toward the target body through the fiber bundle. The receiving circuit is used for controlling the probe to receive the photoacoustic signal returned from the target body. The transmitting circuit is used for controlling the probe to transmit ultrasonic waves to the target body, and the receiving circuit is used for controlling the probe to receive the ultrasonic echo signal returned from the target body. The processor is used for generating a control signal and sending it to the laser so as to control the laser to generate the laser light. The processor is further used for obtaining photoacoustic volume data from the photoacoustic signal and ultrasound volume data from the ultrasonic echo signal; determining a boundary of a target tissue in the ultrasound volume data and rendering the ultrasound volume data according to that boundary to obtain an ultrasound volume image of the target tissue; rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue; and fusing the ultrasound volume image with the photoacoustic volume image to obtain a fused image of the target tissue.
A third aspect of embodiments of the present application provides a computer-readable storage medium having stored therein instructions, which, when run on a computer, cause the computer to perform the imaging method provided by the first aspect described above.
In the embodiments of the present application, laser light and ultrasonic waves are first emitted toward the target body to obtain photoacoustic volume data and ultrasound volume data, respectively. In general, the ultrasound volume data can be displayed as a grayscale image showing the shape of the target tissue, while the photoacoustic volume data typically capture the distribution of blood vessels, blood oxygen and the like inside the target tissue. The boundary of the target tissue can therefore be segmented from the ultrasound volume data, and the ultrasound volume data rendered to obtain an ultrasound volume image of the target tissue. The photoacoustic volume data are rendered to obtain a photoacoustic volume image, that is, an image showing the distribution of blood vessels, blood oxygen and the like inside or around the target tissue. Fusing the ultrasound volume image with the photoacoustic volume image yields a fused image of the target tissue in which a 3D view of the structure of the target tissue and of the distribution inside or around it can be displayed simultaneously. The fused image thus gives a more comprehensive three-dimensional display of the target tissue, so that the operator can observe it more completely and more intuitively.
Drawings
Fig. 1 is a schematic structural block diagram of a possible imaging system provided in an embodiment of the present application;
Fig. 2 is a schematic view of an application scenario of a possible ultrasound imaging method provided in an embodiment of the present application;
Fig. 3 is a flow chart of a possible imaging method provided in an embodiment of the present application;
Fig. 4 is a schematic diagram of a possible mechanical scanner provided in an embodiment of the present application;
Fig. 5 is a schematic diagram of a possible probe provided in an embodiment of the present application.
Detailed Description
The present application provides an imaging method and an imaging system for improving the intuitiveness of the displayed image.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 shows a schematic structural block diagram of an imaging system 10 according to an embodiment of the present application. The imaging system 10 may include a probe 110, a laser 120 and a mechanical scanner 130, as well as a transmitting circuit 101, a transmit/receive selection switch 102, a receiving circuit 103, a processor 105, a display 106 and a memory 107. Of course, the imaging system 10 may also include other components or devices not shown in the figure.
The transmitting circuit 101 may excite the probe 110 to transmit ultrasonic waves toward the target body. After the probe 110 transmits the ultrasonic waves, the receiving circuit 103 may receive, through the probe 110, the ultrasonic echoes returned from the target body, thereby obtaining ultrasonic echo signals/data. The ultrasonic echo signals/data are sent to the processor 105 either directly or after being processed by the beamforming circuit. The processor 105 processes the ultrasonic echo signals/data to obtain ultrasound volume data of the target body. The ultrasound volume data obtained by the processor 105 may be stored in the memory 107. The laser 120 may generate laser light and emit it toward the target body through a fiber bundle. After the laser 120 emits the laser light, the receiving circuit 103 may also receive, through the probe 110, the photoacoustic signals/data returned by the target body under the excitation of the laser light. The photoacoustic signals/data are sent to the processor 105 directly or after being processed, and the processor processes them to obtain photoacoustic volume data of the target body. The mechanical scanner 130 can move the probe 110. The ultrasound volume data and the photoacoustic volume data can be displayed on the display 106, that is, the ultrasound image and the photoacoustic image can be shown there.
By means of the mechanical scanner 130, the probe 110 can receive ultrasonic echo signals/data or photoacoustic signals/data from different orientations, and the processor 105 can process the received signals/data to obtain ultrasound volume data or photoacoustic volume data.
The mechanical scanner 130 is an optional device. In some embodiments the mechanical scanner 130 is disposed inside the probe 110, i.e., its functionality is integrated into the probe 110.
In an embodiment of the present application, the mechanical scanner 130 may further include a motor and a motor controller, the motor controller controlling the motion trajectory, stroke or speed of the motor according to a control signal sent by the processor.
In one embodiment of the present application, the probe 110 may be independent, or it may be disposed on the mechanical scanner 130, which then drives the probe 110 to move.
In an embodiment of the present application, the laser 120 may be connected to the transmit/receive selection switch 102, which controls the emission of the laser light; alternatively, the laser 120 may be connected directly to the probe 110 through a light-conducting component, with a fiber bundle coupled to the probe 110 that conducts the laser light to both sides of the acoustic head of the probe 110, irradiating the target body in a back-illumination manner.
In an embodiment of the present application, the probe 110 may specifically include an ultrasound transducer that both transmits and receives signals and supports several imaging modes, such as grayscale imaging and Doppler blood flow imaging. In addition, in some implementations, the fiber bundle and the ultrasound transducer are coupled together and enclosed by a housing, forming a probe that integrates the photoacoustic and ultrasound imaging functions: with such a probe, the laser emits laser light, the light irradiates the target body via the probe, and the photoacoustic signal generated under the excitation of the laser light and returned from the target body is received through the probe. Of course, the probe can also be used for conventional ultrasound imaging, i.e., transmitting ultrasonic waves toward the target body and receiving the ultrasonic echoes returned from it. The laser can likewise be directly coupled with the ultrasound transducer and completely or partially enclosed by the housing to form a probe integrating both functions, usable for photoacoustic as well as ultrasound imaging.
In an embodiment of the application, the display 106 may be a touch screen or liquid crystal screen built into the imaging system; it may equally be an independent display device, such as a liquid crystal display or a television, separate from the imaging system, or the display screen of an electronic device such as a mobile phone or a tablet computer.
In an embodiment of the present application, the memory 107 can be a flash memory card, a solid-state memory, a hard disk, or the like.
In an embodiment of the present application, a computer-readable storage medium is further provided, in which a plurality of program instructions are stored; when called and executed by the processor 105, these instructions can perform some or all of the steps of the imaging method in the embodiments of the present application, or any combination of those steps.
In one embodiment of the present application, the computer readable storage medium may be the memory 107, which may be a non-volatile storage medium such as a flash memory card, a solid state memory, a hard disk, and the like.
In one embodiment of the present application, the aforementioned processor 105 may be implemented in software, hardware, firmware or a combination thereof, and may use a circuit, one or more application-specific integrated circuits (ASICs), one or more general-purpose integrated circuits, one or more microprocessors, one or more programmable logic devices, a combination of the aforementioned circuits or devices, or other suitable circuits or devices, so that the processor 105 can execute the corresponding steps of the imaging method in the various embodiments of the present application.
The imaging method in the present application is described in detail below based on the aforementioned imaging system.
It should be noted that, with reference to the schematic structural block diagram of the imaging system shown in fig. 1, the imaging method provided in the embodiments of the present application may be applied in scenarios such as the one shown in fig. 2. The operator scans the target body 201 with the probe 110: the laser emits laser light that irradiates the target body through the fiber bundle, and the probe receives the photoacoustic signals returned from the target body; the probe also transmits ultrasonic waves to the target body, and the ultrasonic echo signals returned from it are received through the probe. The operator can view the tissue structure and so on via the display 106.
Based on this, referring to fig. 3, an imaging method provided in an embodiment of the present application may be applied to the imaging system shown in fig. 1, where the imaging method includes:
301. laser light is emitted toward the target body, and a photoacoustic signal returned from the target body is received.
After the target body in which the target tissue is located has been determined, the laser 120 emits laser light toward it through the fiber bundle, and the probe 110 then receives the photoacoustic signal generated by the target body under the excitation of the laser light. The received photoacoustic signal differs depending on the target tissue.
Specifically, the laser is coupled to the probe through a fiber bundle; the laser emits laser light, which the fiber bundle then directs toward the target body. When tissue in the target body absorbs the light energy, its temperature rises and it expands thermally, generating a photoacoustic signal that propagates outward and is detected by the probe 110.
In one embodiment of the present application, the probe 110 may be disposed on the mechanical scanner 130, and the processor 105 may send a control signal to the mechanical scanner 130 to control the motor within it, thereby setting the scanning speed, trajectory and so on. After the laser light is emitted toward the target body, the probe 110 can then receive the photoacoustic signals returned from different angles around the target body, so that the target body is imaged photoacoustically from different angles.
Illustratively, the mechanical scanner 130 may be as shown in fig. 4.
In one embodiment of the present application, the laser 120 may receive a control signal sent by the processor 105; the control signal may include the frequency, timing and so on of the laser light to be generated, and the laser 120 generates the laser light accordingly and emits it toward the target body through the fiber bundle coupled to the probe 110.
In one embodiment of the present application, after the laser 120 generates the laser light, it may send a feedback signal to the processor 105. The feedback signal may include the actual emission time of the laser light; the processor 105 determines, according to a preset algorithm, the interval after which the photoacoustic signal is to be received, and controls the probe 110 to receive it.
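As an illustration only (the patent does not disclose its preset algorithm, so the constant and function below are assumptions), one plausible way to bound the receive window uses the fact that photoacoustic signals travel one way, from the absorber to the probe:

```python
# Assumed sketch of a receive-window computation; not the patent's algorithm.
SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue value

def pa_receive_window(t_fire_s: float, max_depth_m: float) -> tuple[float, float]:
    """Photoacoustic propagation is one-way (absorber -> probe), so the latest
    arrival from depth d occurs at t_fire + d / c."""
    return t_fire_s, t_fire_s + max_depth_m / SPEED_OF_SOUND

# Example: a 5 cm imaging depth gives a window of about 32.5 microseconds.
start_s, end_s = pa_receive_window(t_fire_s=0.0, max_depth_m=0.05)
```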
302. Ultrasonic waves are transmitted to a target body, and ultrasonic echoes returned from the target body are received, so that ultrasonic echo signals are obtained.
Ultrasonic waves may be transmitted to the target body through the probe 110, and the ultrasonic echoes returned from the target body are received through the probe 110 and converted into ultrasonic echo signals. The received ultrasonic echo signals also differ depending on the target tissue. These are the ultrasonic echo signals/data referred to above.
It should be noted that the laser light and the ultrasonic waves are not transmitted simultaneously; either may be transmitted first, that is, step 301 or step 302 may be executed first. The specific order can be adjusted according to the actual application scenario and is not limited here.
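A minimal sketch of one possible interleaving, firing the laser first; all object and method names here are illustrative assumptions, not an API disclosed by the patent:

```python
# Hypothetical acquisition of one photoacoustic + ultrasound frame.
def acquire_frame(laser, probe):
    laser.fire()               # step 301: laser pulse delivered via the fiber bundle
    pa_raw = probe.receive()   # photoacoustic signal excited by the pulse
    probe.transmit()           # step 302: conventional ultrasound transmit
    us_raw = probe.receive()   # ultrasonic echoes from the target body
    return pa_raw, us_raw      # raw channel data for step 303
```

Swapping the two halves of the function body gives the ultrasound-first order.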
In one embodiment of the present application, the ultrasonic waves are transmitted through the probe 110, which may be disposed on the mechanical scanner 130. The processor 105 may then send control signals to the mechanical scanner 130 to control its internal motor, and thereby the scanning speed, trajectory and so on, so that the probe 110 can transmit ultrasonic waves from different angles around the target body and receive the corresponding echoes, imaging the target body ultrasonically from different angles.
In an embodiment of the present application, as shown in fig. 5, the processor 105 may specifically turn on the transmit/receive selection switch 102 and control the transmitting circuit 101 to transmit ultrasonic waves to the target body through the probe 110; the ultrasonic echoes are received through the probe 110 and passed to the receiving circuit 103. In other words, the receiving circuit 103 receives, through the probe 110, the ultrasonic echoes returned from the target body, thereby obtaining ultrasonic echo signals.
In one embodiment of the present application, a fiber bundle is coupled to the ultrasound array probe and conducts the laser light to both sides of the probe 110, so that the target body is irradiated in a back-illumination manner. The probe 110 includes an ultrasound transducer capable of both transmitting and receiving signals; while supporting conventional ultrasound imaging and Doppler blood flow imaging, the transducer has a large frequency bandwidth and high sensitivity, which improves the detection of photoacoustic signals, so that even weak signals can be detected.
303. Photoacoustic volume data is obtained from the photoacoustic signals, and ultrasound volume data is obtained from the ultrasound echo signals.
After the photoacoustic signal and the ultrasonic echo signal are received, the photoacoustic signal can be converted into photoacoustic volume data and the ultrasonic echo signal into ultrasound volume data.
Specifically, after the ultrasonic echo signal is received, noise in the signal can be removed. The ultrasonic echo signal is beamformed by the beamforming circuit and then passed to the processor 105, which processes it to obtain ultrasound volume data of the target body. After the photoacoustic signal is acquired, noise in it may likewise be removed, and image reconstruction such as beam synthesis may then be performed to obtain photoacoustic volume data of the target body. In general, the ultrasound volume data can be displayed as a grayscale image and conveys structural information about the target tissue within the target body, while the photoacoustic volume data conveys functional information about that tissue.
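To illustrate the kind of beam synthesis involved, the following is a minimal delay-and-sum sketch for photoacoustic reconstruction; the uniform linear-array geometry and all names are assumptions, and a practical beamformer would add apodization, interpolation and filtering:

```python
import numpy as np

def delay_and_sum_pa(rf, element_x, pixel_x, pixel_z, fs, c=1540.0):
    """rf: (n_elements, n_samples) channel data with t = 0 at the laser pulse.
    element_x: lateral element positions (m). Returns a (n_z, n_x) image.
    Photoacoustic delays are one-way (pixel -> element)."""
    n_elem, n_samp = rf.shape
    image = np.zeros((len(pixel_z), len(pixel_x)))
    for iz, z in enumerate(pixel_z):
        for ix, x in enumerate(pixel_x):
            dist = np.sqrt((element_x - x) ** 2 + z ** 2)  # one-way path length
            idx = np.round(dist / c * fs).astype(int)      # sample index per element
            valid = idx < n_samp
            image[iz, ix] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
    return image
```

For the ultrasonic echo data the same structure applies with two-way (transmit plus receive) delays.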
In an embodiment of the present application, if the probe 110 is moved by the mechanical scanner 130, ultrasonic echo signals and photoacoustic signals can be acquired at multiple different angles, yielding multiple frames of corresponding ultrasound volume data and photoacoustic volume data. In general, the 3D structure of the target tissue can be displayed comprehensively by changing the light-projection direction and angle or by adjusting the display transparency, so that the operator can already make certain observations from the ultrasound volume data and the photoacoustic volume data.
In general, Doppler flow imaging exploits the Doppler shift and can image blood flowing above a certain velocity. However, Doppler blood flow imaging is very sensitive to motion, including tissue motion and probe motion, so three-dimensional Doppler imaging with a mechanical scanner is difficult: motion artifacts are introduced while the scanner drives the probe. Photoacoustic imaging, by contrast, relies on the photoacoustic signal generated when tissue absorbs laser light of a specified wavelength and is therefore insensitive to the motion of the tissue or probe. Hence a mechanical scanner can be used to acquire both photoacoustic volume data and ultrasound volume data of the target body: the functional information of the target body is displayed through the photoacoustic volume data and the structural information through the ultrasound volume data, so that 3D acquisition of both functional and structural tissue information is achieved without Doppler blood flow imaging.
In one embodiment of the present application, after the photoacoustic volume data and the ultrasound volume data are acquired, either or both may be displayed on the display 106, and the operator may also choose to display any single frame of the photoacoustic or ultrasound volume data.
It should be noted that, in the embodiments of the present application, the order in which the photoacoustic volume data and the ultrasound volume data are acquired is not limited: either may be acquired first, and the order can be adjusted according to the actual application scenario. It is not limited here.
304. A boundary of the target tissue is determined in the ultrasound volume data.
After the ultrasound volume data are acquired, the boundary of the target tissue is determined in them. The boundary may be determined according to a preset algorithm, or the operator may input it after inspecting the displayed ultrasound volume data.
In one embodiment of the present application, the boundary of the target tissue may be determined by comparing parameter values of the target tissue with those of the surrounding tissue in the ultrasound volume data. The parameter value may include at least one of a gray value, a brightness value, a pixel value or a gradient value in the ultrasound volume data. Of course, besides these, the parameter value may be any other value suitable for image comparison; it may be adjusted according to the actual application and is not limited here.
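A minimal sketch of the gradient-value variant of this comparison; the function and threshold are assumptions, since the patent does not fix a particular algorithm:

```python
import numpy as np

def boundary_mask(us_volume: np.ndarray, grad_threshold: float) -> np.ndarray:
    """Mark voxels whose local gray-level gradient is large, i.e. where the
    target tissue differs sharply from the surrounding tissue."""
    g0, g1, g2 = np.gradient(us_volume.astype(float))  # gradients along the 3 axes
    grad_mag = np.sqrt(g0 ** 2 + g1 ** 2 + g2 ** 2)
    return grad_mag > grad_threshold
```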
In one embodiment of the present application, the boundary of the target tissue in the ultrasound volume data may also be selected manually by the operator. The processor 105 receives input parameters for the ultrasound volume data and determines the boundary of the target tissue from them. For example, the ultrasound volume data may be displayed on the display 106, and the operator may select the boundary of the target tissue via an input device, thereby generating the input parameters. Thus, even when the contrast between the target tissue and the surrounding normal tissue is weak, the operator can delineate the boundary manually, making the subsequently obtained ultrasound volume image of the target tissue more accurate.
In one embodiment of the present application, when there are multiple frames of ultrasound volume data, they may be fused and displayed as 3D ultrasound volume data, in which the operator then manually selects the boundary of the target tissue.
305. Rendering the ultrasound volume data according to the boundary of the target tissue to obtain an ultrasound volume image of the target tissue.
After the boundary of the target tissue is determined in the ultrasound volume data, the ultrasound volume data are rendered, which includes adjusting the color, brightness or gray values at the boundary of the target tissue, to obtain a three-dimensional ultrasound volume image of the target tissue.
In an embodiment of the application, when rendering the ultrasound volume data, multiple frames of ultrasound volume data may be used, and the data may be rendered with any of several three-dimensional rendering methods, such as volume rendering or surface rendering, to obtain the ultrasound volume image of the target tissue; that is, the ultrasound volume image is a three-dimensional ultrasound image.
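As one concrete possibility for the surface-rendering option, the target-tissue surface can be extracted with the marching-cubes algorithm; the iso-level, the masking step and the use of scikit-image are assumptions, not choices stated in the patent:

```python
import numpy as np
from skimage import measure

def target_tissue_mesh(us_volume, tissue_mask, level=0.5):
    """Extract a triangle mesh of the target-tissue surface from the ultrasound
    volume, restricted to the segmented boundary region."""
    masked = np.where(tissue_mask, us_volume, 0.0)  # zero out everything outside the tissue
    verts, faces, normals, values = measure.marching_cubes(masked, level=level)
    return verts, faces
```

The resulting mesh can then be lit and shaded by any 3D renderer to produce the stereoscopic ultrasound volume image.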
It can be understood that, after the boundary contour of the target tissue is determined in the ultrasound volume data, the target tissue is rendered according to that contour to obtain a stereoscopic ultrasound volume image of the target tissue.
306. And rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue.
After the photoacoustic volume data are acquired, they are rendered, with lighting, color and the like adjusted, to obtain a stereoscopic photoacoustic volume image of the target tissue.
In an embodiment of the present application, the photoacoustic volume image of the target tissue can be obtained in various ways: the photoacoustic volume data may be rendered three-dimensionally, with specific methods including volume rendering, surface rendering and the like; that is, the photoacoustic volume image is a three-dimensional photoacoustic image.
It should be noted that the present application does not limit the order in which the ultrasound volume image and the photoacoustic volume image are obtained: either may be obtained first, that is, step 305 or step 306 may be executed first, and the order may be adjusted according to the actual application scenario. It is not limited here.
In one embodiment of the present application, the region in which the ultrasound volume data are rendered may be greater than, equal to, or smaller than the region in which the photoacoustic volume data are rendered. The ultrasound rendering region may cover all or part of the target tissue, and so may the photoacoustic rendering region. Suppose the ultrasound volume data are rendered over region A and the photoacoustic volume data over region B; then A may be greater than, equal to, or smaller than B. For example, if during fusion the photoacoustic volume image lies below the ultrasound volume image, B may be made larger than A: A is rendered only over the region where the target tissue lies, while B covers not only the target tissue but also the area around it. Because the two regions are rendered differently, the characteristics of the target tissue stand out more clearly in the fused image, which improves its intuitiveness. Of course, if only a particular part of the target tissue is to be analyzed closely, it suffices to render the region containing that part rather than the entire target tissue; this is not specifically limited here.
307. And fusing the ultrasonic volume image and the photoacoustic volume image to obtain a fused image of the target tissue.
After the ultrasound volume image and the photoacoustic volume image are obtained, they are fused to obtain a fused image of the target tissue.
Specifically, the photoacoustic volume image may be superimposed on the ultrasound volume image, or the ultrasound volume image on the photoacoustic volume image; the operator may also choose which of the two arrangements to use. This can be adjusted according to the actual application scenario.
Further, the pixel values, brightness values, gray values and the like of the ultrasound volume image may be superimposed taking the photoacoustic volume image as the base, or those of the photoacoustic volume image taking the ultrasound volume image as the base; the operator may also choose which image serves as the base. This too can be adjusted according to the actual application scenario.
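One common way to realize such a superposition, shown here as an assumed sketch: keep the rendered ultrasound view in grayscale, map the photoacoustic view through a warm color scale, and let the overlay opacity follow the photoacoustic intensity. The colors and weighting are illustrative choices, not requirements of the patent:

```python
import numpy as np

def fuse_views(us_view, pa_view, alpha=0.7):
    """us_view, pa_view: 2D rendered views scaled to [0, 1].
    Returns an RGB image with the photoacoustic signal overlaid in red-orange."""
    gray = np.stack([us_view] * 3, axis=-1)                          # grayscale base
    warm = np.stack([pa_view, 0.4 * pa_view, np.zeros_like(pa_view)], axis=-1)
    weight = alpha * pa_view[..., None]                              # opacity follows PA intensity
    return (1.0 - weight) * gray + weight * warm
```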
In one embodiment of the present application, the fused image may be displayed on the display 106 after it is obtained.
In one embodiment of the present application, the ultrasound volume image and the photoacoustic volume image may be assigned different colors during fusion, so as to better distinguish the target tissue from the structural information within it.
In one embodiment of the present application, after the fused image is obtained, the target tissue may be analyzed according to parameters of the fused image, and the analysis result displayed on the display 106. For example, the distribution of blood vessels or blood oxygen may be obtained from the fused image; the target tissue is then analyzed according to that distribution inside or around it, its state is evaluated, and the evaluation result is displayed on the display 106, so that the operator can use the evaluation as a reference for a more comprehensive observation of the target tissue.
Thus, in the embodiments of the present application, laser light is emitted toward the target body to obtain photoacoustic volume data, and ultrasonic waves are emitted toward it to obtain ultrasound volume data. In general, the ultrasound volume data can be displayed as a grayscale image showing the approximate shape of the target tissue, while the photoacoustic volume data typically show the distribution of blood vessels, blood oxygen and the like inside the target tissue. The boundary of the target tissue can then be segmented from the ultrasound volume data, and the ultrasound volume data rendered to obtain an ultrasound volume image of the target tissue. Rendering the photoacoustic volume data yields a photoacoustic volume image, that is, an image of the distribution of blood vessels, blood oxygen and the like inside or around the target tissue. Superimposing the ultrasound volume image and the photoacoustic volume image produces a fused image of the target tissue in which a 3D view of the target-tissue boundary and of the distribution inside or around the tissue can be displayed simultaneously. The fused image therefore gives a more comprehensive three-dimensional display of the target tissue, allowing the operator to observe it more completely and more intuitively.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division into units is only a logical division, and other divisions are possible in practice: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices or units, and may be electrical, mechanical or of other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence (that is, the part that contributes beyond the prior art), or the whole or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In practical applications, the target body may be a human body, an animal, or the like. The target tissue may be a face, a spine, a heart, a uterus, or a pelvic floor, or another part of the body such as a brain, a bone, a liver, or a kidney; it is not limited here.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

  1. An imaging method, comprising:
    emitting laser light to a target body and receiving a photoacoustic signal returned from the target body;
    transmitting ultrasonic waves to the target body, and receiving ultrasonic echoes returned from the target body to obtain ultrasonic echo signals;
    obtaining photoacoustic volume data from the photoacoustic signal and ultrasonic volume data from the ultrasonic echo signal;
    determining a boundary of a target tissue in the ultrasound volume data;
    rendering the ultrasound volume data according to the boundary of the target tissue to obtain an ultrasound volume image of the target tissue;
    rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue;
    and fusing the ultrasonic volume image and the photoacoustic volume image to obtain a fused image of the target tissue.
  2. The method of claim 1, wherein said determining a boundary of a target tissue in said ultrasound volume data comprises:
    determining a boundary of the target tissue by comparing parameter values of the target tissue with other tissues surrounding the target tissue, wherein the parameter values include at least one of gray values, brightness values, pixel values and gradient values in the ultrasound volume data.
  3. The method of claim 1, wherein said determining a boundary of a target tissue in said ultrasound volume data comprises:
    receiving input parameters for the ultrasound volume data;
    and determining the boundary of the target tissue according to the input parameters.
  4. The method according to any one of claims 1-3, wherein said rendering the ultrasound volume data according to the boundary of the target tissue to obtain an ultrasound volume image of the target tissue comprises:
    rendering the ultrasonic volume data according to the boundary of the target tissue in a three-dimensional rendering mode to obtain the ultrasonic volume image.
  5. The method of any of claims 1-4, wherein the rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue comprises:
    and rendering the photoacoustic volume data in a three-dimensional rendering mode to obtain the photoacoustic volume image.
  6. The method according to any one of claims 1-5, further comprising:
    and displaying the fused image.
  7. The method according to any one of claims 1-6, further comprising:
    and analyzing the target tissue according to the fused image to obtain tissue distribution data of the target tissue.
  8. The method according to any one of claims 1 to 7,
    the rendering region of the ultrasound volume data is greater than, equal to, or smaller than the rendering region of the photoacoustic volume data, wherein the rendering region of the ultrasound volume data includes all or part of the target tissue, and the rendering region of the photoacoustic volume data includes all or part of the target tissue.
  9. The method according to any of claims 1-8, wherein said fusing the ultrasound volume image with the photoacoustic volume image to obtain a fused image of the target tissue comprises:
    superimposing the ultrasound volume image into the photoacoustic volume image;
    alternatively, the photoacoustic volume image is superimposed into the ultrasound volume image.
  10. An imaging system, comprising: the device comprises a laser, a probe, a transmitting circuit, a receiving circuit and a processor;
    the laser is used for generating laser for irradiating a target body, the laser is coupled to the probe through a fiber bundle and emits the laser to the target body through the fiber bundle;
    the receiving circuit is used for controlling the probe to receive the photoacoustic signal returned from the target body;
    the transmitting circuit is used for controlling the probe to transmit ultrasonic waves to the target body;
    the receiving circuit is used for controlling the probe to receive the ultrasonic echo returned from the target body to obtain an ultrasonic echo signal;
    the processor is used for generating a control signal and sending the control signal to the laser so as to control the laser to generate the laser;
    the processor is further used for obtaining photoacoustic volume data according to the photoacoustic signals and obtaining ultrasonic volume data according to the ultrasonic echo signals; determining a boundary of a target tissue in the ultrasonic volume data, and rendering the ultrasonic volume data according to the boundary of the target tissue to obtain an ultrasonic volume image of the target tissue; rendering the photoacoustic volume data to obtain a photoacoustic volume image of the target tissue; and fusing the ultrasonic volume image and the photoacoustic volume image to obtain a fused image of the target tissue.
  11. The imaging system of claim 10, wherein the processor is specifically configured to:
    determining a boundary of the target tissue by comparing parameter values of the target tissue with other tissues surrounding the target tissue, wherein the parameter values include at least one of gray values, brightness values, pixel values and gradient values in the ultrasound volume data.
  12. The imaging system of claim 10, wherein the processor is specifically configured to:
    receiving input parameters for the ultrasound volume data;
    and determining the boundary of the target tissue according to the input parameters.
  13. The imaging system of any of claims 10-12, wherein the processor is specifically configured to:
    rendering the ultrasound volume data according to the boundary of the target tissue in a three-dimensional rendering mode to obtain the ultrasound volume image.
  14. The imaging system of any of claims 10-13, wherein the processor is specifically configured to:
    and rendering the photoacoustic volume data in a three-dimensional rendering mode to obtain the photoacoustic volume image.
  15. The imaging system of any of claims 10-14, further comprising: a display;
    the display is used for displaying the fusion image.
  16. The imaging system of any of claims 10-15,
    the processor is further configured to analyze the target tissue according to the fused image to obtain tissue distribution data of the target tissue.
  17. The imaging system of any of claims 10-16,
    the rendering region of the ultrasound volume data is greater than, equal to, or smaller than the rendering region of the photoacoustic volume data, wherein the rendering region of the ultrasound volume data includes all or part of the target tissue, and the rendering region of the photoacoustic volume data includes all or part of the target tissue.
  18. The imaging system of any of claims 10-17, wherein the processor is specifically configured to superimpose the ultrasound volume image into the photoacoustic volume image; alternatively, the photoacoustic volume image is superimposed into the ultrasound volume image.
CN201880055971.2A 2018-10-24 2018-10-24 Imaging method and imaging system Active CN111727013B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/111679 WO2020082269A1 (en) 2018-10-24 2018-10-24 Imaging method and imaging system

Publications (2)

Publication Number Publication Date
CN111727013A 2020-09-29
CN111727013B CN111727013B (en) 2023-12-22

Family

ID=70330893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880055971.2A Active CN111727013B (en) 2018-10-24 2018-10-24 Imaging method and imaging system

Country Status (2)

Country Link
CN (1) CN111727013B (en)
WO (1) WO2020082269A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022228461A1 (en) * 2021-04-28 2022-11-03 中慧医学成像有限公司 Three-dimensional ultrasonic imaging method and system based on laser radar

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113243889B (en) * 2020-08-10 2022-05-10 北京航空航天大学 Method and apparatus for obtaining information of biological tissue

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101053521A (en) * 2006-04-12 2007-10-17 株式会社东芝 Medical image display apparatus
CN101336844A (en) * 2007-07-05 2009-01-07 株式会社东芝 Medical image processing apparatus and medical image diagnosis apparatus
US20090187099A1 (en) * 2006-06-23 2009-07-23 Koninklijke Philips Electronics N.V. Timing controller for combined photoacoustic and ultrasound imager
US20100113931A1 (en) * 2008-11-03 2010-05-06 Medison Co., Ltd. Ultrasound System And Method For Providing Three-Dimensional Ultrasound Images
US20130021336A1 (en) * 2011-07-19 2013-01-24 Toshiba Medical Systems Corporation Image processing system, image processing device, image processing method, and medical image diagnostic device
US20140024918A1 (en) * 2011-03-29 2014-01-23 Fujifilm Corporation Photoacoustic imaging method and photoacoustic imaging apparatus
CN106214130A (en) * 2016-08-31 2016-12-14 北京数字精准医疗科技有限公司 A kind of hand-held optical imaging and ultra sonic imaging multi-modal fusion imaging system and method
CN107174208A (en) * 2017-05-24 2017-09-19 哈尔滨工业大学(威海) A kind of photoacoustic imaging system and method suitable for peripheral vascular imaging
CN107223035A (en) * 2017-01-23 2017-09-29 深圳迈瑞生物医疗电子股份有限公司 A kind of imaging system, method and ultrasonic image-forming system
CN108403082A (en) * 2018-01-24 2018-08-17 苏州中科先进技术研究院有限公司 A kind of imaging in biological tissues system and imaging method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6109512B2 (en) * 2012-09-20 2017-04-05 東芝メディカルシステムズ株式会社 Image processing apparatus, X-ray diagnostic apparatus and program
US20160192840A1 (en) * 2013-08-01 2016-07-07 Sogang University Research Foundation Device and method for acquiring fusion image
KR20150046637A (en) * 2013-10-22 2015-04-30 삼성전자주식회사 Wideband ultrasonic probe for photoacoustic image and ultrasound image
JP5990834B2 (en) * 2014-03-28 2016-09-14 株式会社日立製作所 Diagnostic image generating apparatus and diagnostic image generating method
US20170209119A1 (en) * 2016-01-27 2017-07-27 Canon Kabushiki Kaisha Photoacoustic ultrasonic imaging apparatus



Also Published As

Publication number Publication date
WO2020082269A1 (en) 2020-04-30
CN111727013B (en) 2023-12-22

Similar Documents

Publication Publication Date Title
US11323625B2 (en) Subject information obtaining apparatus, display method, program, and processing apparatus
EP1614387B1 (en) Ultrasonic diagnostic apparatus, image processing apparatus and image processing method
KR100932472B1 (en) Ultrasound Diagnostic System for Detecting Lesions
US20130338478A1 (en) Photoacoustic imaging apparatus and photoacoustic imaging method
EP3266378A1 (en) Apparatus, method, and program for obtaining information derived from ultrasonic waves and photoacoustic waves
WO2012157221A1 (en) Tomographic image generating device, method, and program
US20160324423A1 (en) Photoacoustic measurement apparatus and signal processing device and signal processing method for use therein
WO2018008439A1 (en) Apparatus, method and program for displaying ultrasound image and photoacoustic image
US9448100B2 (en) Signal processing apparatus
CN111727013B (en) Imaging method and imaging system
WO2007072490A1 (en) An operating mode for ultrasound imaging systems
CN110338754B (en) Photoacoustic imaging system and method, storage medium, and processor
CN111432730A (en) Imaging method and imaging system
US20210330226A1 (en) Imaging method and imaging system
CN112773403A (en) Ultrasonic imaging method and system
EP3329843B1 (en) Display control apparatus, display control method, and program
US20180368696A1 (en) Object information acquiring apparatus and object information acquiring method
JP4909132B2 (en) Optical tomography equipment
JP7077384B2 (en) Subject information acquisition device
JP4909131B2 (en) Optically assisted ultrasonic velocity change imaging apparatus and optically assisted ultrasonic velocity change image display method
WO2019111552A1 (en) Ultrasonic diagnostic device and method of controlling ultrasonic diagnostic device
JP2020018468A (en) Information processing device, information processing method, and program
JP2020018466A (en) Information processing device, information processing method, and program
JP2020036981A (en) Subject information acquisition device and control method thereof
JP2020018467A (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant