CN111432730A - Imaging method and imaging system - Google Patents

Imaging method and imaging system

Info

Publication number
CN111432730A
CN111432730A CN201880055953.4A
Authority
CN
China
Prior art keywords
image
pixel
target
photoacoustic
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880055953.4A
Other languages
Chinese (zh)
Inventor
杨萌
李建初
姜玉新
杨芳
陈志杰
朱磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd, Peking Union Medical College Hospital Chinese Academy of Medical Sciences filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN111432730A publication Critical patent/CN111432730A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An imaging method and an imaging system (10) for clearly displaying photoacoustic-ultrasound bimodal imaging. The imaging method comprises the following steps: transmitting ultrasonic waves to a target tissue (201) and receiving ultrasonic echoes returned from the target tissue (201) to obtain ultrasonic echo signals (301); emitting laser light to the target tissue (201) and receiving photoacoustic signals returned from the target tissue (201) (302); acquiring a grayscale image and a color Doppler image of the target tissue (201) according to the ultrasonic echo signals (303); acquiring a photoacoustic image of the target tissue (201) according to the photoacoustic signals (304); and fusing the photoacoustic image and the color Doppler image with the grayscale image to obtain a fused image of the target tissue (201) (305).

Description

Imaging method and imaging system

Technical Field
The present application relates to the field of medical devices, and in particular, to an imaging method and an imaging system.
Background
Photoacoustic Imaging (PAI) is an emerging biomedical imaging technology based on the photoacoustic effect: when biological tissue is irradiated with short-pulse laser light, for example on the nanosecond (ns) scale, substances in the tissue with strong optical absorption, such as blood, absorb the light energy and undergo local temperature rise and thermal expansion, thereby generating a photoacoustic signal that propagates outward. This photoacoustic signal can be detected by an ultrasonic probe, and a corresponding reconstruction algorithm can recover the position and shape of the absorber, i.e., the substance with the strong optical absorption characteristic. Photoacoustic imaging combines the advantages of optics and ultrasound, offers unique benefits for early diagnosis and prognosis evaluation of some serious diseases, and is a novel imaging technology with great clinical and industrial prospects. Because the ability of light to penetrate biological tissue is limited, photoacoustic imaging applications focus on superficial organs. Photoacoustic imaging reflects functional information of the body, whereas conventional ultrasound imaging reflects structural information; effectively combining the two, i.e., photoacoustic-ultrasound dual-mode imaging, overcomes the shortcomings of single-mode imaging and can provide more comprehensive tissue structure and function information.
Generally, photoacoustic-ultrasound dual-mode imaging is displayed in a dual-image or multi-image layout, in which the grayscale image provides structural information of the tissue for the other imaging modes, plays a positioning and guiding role, and is essential to the display. For example, the displayed images may include an ultrasound grayscale image fused with a Color Doppler Flow Image (CDFI), together with the result of fusing an ultrasound grayscale image with a photoacoustic image. However, due to the limited size of the ultrasound display screen, displaying two or more images reduces the clarity of each image. Therefore, how to clearly display photoacoustic-ultrasound bimodal imaging has become an urgent problem to be solved.
Disclosure of Invention
The application provides an imaging method and an imaging system for clearly displaying photoacoustic-ultrasound bimodal imaging.
A first aspect of embodiments of the present application provides an imaging method, including: transmitting ultrasonic waves to a target tissue and receiving ultrasonic echoes returned from the target tissue to obtain ultrasonic echo signals; emitting laser light to the target tissue and receiving photoacoustic signals returned from the target tissue; acquiring a grayscale image and a color Doppler image of the target tissue according to the ultrasonic echo signals; acquiring a photoacoustic image of the target tissue according to the photoacoustic signals; and fusing the photoacoustic image and the color Doppler image with the grayscale image to obtain a fused image of the target tissue.
A second aspect of embodiments of the present application provides an imaging system, comprising a laser, a probe, a transmitting circuit, a receiving circuit, and a processor.
The laser is used for generating laser light for irradiating a target tissue; the laser light is coupled to the probe through a fiber bundle and emitted to the target tissue through the probe.
The receiving circuit is used for controlling the probe to receive the photoacoustic signals returned from the target tissue.
The transmitting circuit is used for controlling the probe to transmit the ultrasonic waves to the target tissue.
The receiving circuit is also used for controlling the probe to receive the ultrasonic echoes returned from the target tissue.
The processor is used for generating a control signal and sending it to the laser so as to control the laser to generate the laser light.
The processor is also used for acquiring a grayscale image and a color Doppler image of the target tissue according to the ultrasonic echo signals, acquiring a photoacoustic image of the target tissue according to the photoacoustic signals, and fusing the photoacoustic image and the color Doppler image with the grayscale image to obtain a fused image of the target tissue.
A third aspect of embodiments of the present application provides a computer-readable storage medium having stored therein instructions, which, when run on a computer, cause the computer to perform the imaging method provided by the first aspect described above.
In the present application, ultrasonic waves are transmitted to the target tissue and ultrasonic echo signals are received so as to obtain a grayscale image and a color Doppler image; laser light is also emitted to the target tissue and photoacoustic signals are received so as to obtain a photoacoustic image. The color Doppler image and the photoacoustic image are then superimposed on the grayscale image to obtain a fused image. The resulting fused image can therefore display the contents of the grayscale image, the photoacoustic image, and the color Doppler image in a single image. Compared with displaying multiple images on the same ultrasound display screen, this improves the clarity of the displayed image, displays photoacoustic-ultrasound bimodal imaging more clearly, and improves the accuracy with which the operator observes the tissue in the fused image.
Drawings
Fig. 1 is a schematic structural block diagram of a possible imaging system provided in an embodiment of the present application;
fig. 2 is a schematic view of an application scenario of a possible ultrasound imaging method provided in an embodiment of the present application;
FIG. 3 is a flow chart of one possible imaging method provided by an embodiment of the present application;
fig. 4 is a schematic diagram of a possible probe according to an embodiment of the present disclosure.
Detailed Description
The application provides an imaging method and an imaging system for clearly displaying photoacoustic-ultrasound bimodal imaging.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a block diagram illustrating an imaging system 10 according to an embodiment of the present disclosure. The imaging system 10 may include transmit circuitry 101, a transmit/receive selection switch 102, receive circuitry 103, a processor 105, a display 106, a memory 107, a probe 110, and a laser 120. Of course, the imaging system 10 may also include other components or devices not shown in the figure.
The transmit circuitry 101 may excite the probe 110 to transmit ultrasonic waves to the target tissue. After the probe 110 transmits the ultrasonic waves, the receive circuitry 103 may receive, through the probe 110, the ultrasonic echoes returned from the target tissue, thereby obtaining ultrasonic echo signals/data. The ultrasonic echo signals/data are sent to the processor 105 directly, or after being processed by a beamforming circuit. The processor 105 processes the ultrasonic echo signals/data to obtain an ultrasound image of the target tissue, which may be stored in the memory 107. The laser 120 may generate laser light and emit it through the probe 110 to the target tissue. After the probe 110 emits the laser light, the receive circuitry 103 may also receive, through the probe 110, the photoacoustic signals/data returned by the target tissue under the excitation of the laser light. The photoacoustic signals/data are fed to the processor 105, either directly or after processing, and the processor 105 processes them to obtain a photoacoustic image of the target tissue. The ultrasound image and the photoacoustic image may be displayed on the display 106.
In an embodiment of the present application, the laser 120 may be connected to the transmit/receive selection switch 102, which controls the emission of the laser light; alternatively, the laser 120 may be coupled directly to the probe 110 through a light-guiding component: a fiber bundle is coupled to the probe 110 and conducts the laser light to both sides of the acoustic head of the probe 110, so that the target tissue is illuminated from both sides in a back-illumination arrangement.
In an embodiment of the present application, the probe 110 may specifically include an ultrasonic transducer that can both transmit and receive signals, which supports grayscale imaging and Doppler blood flow imaging.
In an embodiment of the application, the display 106 of the imaging system may be a touch display screen, a liquid crystal display screen, or the like; it may also be an independent display device, such as a liquid crystal display or a television, that is separate from the imaging system, or a display screen on an electronic device such as a mobile phone or a tablet computer.
In one embodiment of the present application, the memory 107 of the imaging system can be a flash memory card, a solid state memory, a hard disk, or the like.
In an embodiment of the present application, a computer-readable storage medium is further provided, in which a plurality of program instructions are stored. When the program instructions are called and executed by the processor 105, some or all of the steps of the imaging method in the embodiments of the present application, or any combination thereof, may be performed.
In one embodiment of the present application, the computer readable storage medium may be the memory 107, which may be a non-volatile storage medium such as a flash memory card, a solid state memory, a hard disk, and the like.
In an embodiment of the present application, the processor 105 of the aforementioned imaging system may be implemented by software, hardware, firmware or a combination thereof, and may use a circuit, a single or multiple Application Specific Integrated Circuits (ASICs), a single or multiple general purpose integrated circuits, a single or multiple microprocessors, a single or multiple programmable logic devices, or a combination of the aforementioned circuits or devices, or other suitable circuits or devices, so that the processor 105 may execute the corresponding steps of the imaging method in the various embodiments of the present application.
The imaging method in the present application is described in detail below based on the aforementioned imaging system.
It should be noted that, with reference to the schematic structural block diagram of the imaging system shown in fig. 1, the imaging method provided in the embodiment of the present application may be applied in scenarios such as the one shown in fig. 2. The operator scans the probe 110 over the target tissue 201; laser light is emitted from the probe and the returned photoacoustic signals are received, and ultrasonic waves are emitted from the probe and the ultrasonic echo signals are received. The operator may view the tissue structure and the like through the display 106.
Based on this, referring to fig. 3, an imaging method provided in an embodiment of the present application may be applied to the imaging system shown in fig. 1, where the imaging method includes:
301. ultrasonic waves are transmitted to the target tissue, and ultrasonic echoes returned from the target tissue are received to obtain ultrasonic echo signals.
Ultrasonic waves are transmitted to the target tissue through the probe 110, and ultrasonic echoes returned from the target tissue are received and converted into ultrasonic echo signals. The received ultrasound echo signals may also differ depending on the target tissue. The ultrasound echo signals may be understood as the aforementioned ultrasound echo signals/data.
In an embodiment of the present application, as shown in fig. 4, the processor 105 may turn on the transmit/receive selection switch 102 and control the transmit circuitry 101 to transmit ultrasonic waves to the target tissue through the probe 110. The ultrasonic echoes are received through the probe 110 and passed to the receive circuitry 103; that is, the receive circuitry 103 receives, through the probe 110, the ultrasonic echoes returned from the target tissue, thereby obtaining an ultrasonic echo signal.
302. Laser light is emitted toward the target tissue and photoacoustic signals returned from the target tissue are received.
Laser light may be emitted to the target tissue through the probe 110 and then photoacoustic signals generated by the target tissue under the excitation of the laser light are received. The received photoacoustic signal may also differ depending on the target tissue.
Specifically, the laser light is coupled to the probe through a fiber bundle and then emitted by the probe 110 toward the target tissue. When absorbing structures in the target tissue absorb the light energy, they undergo temperature rise and thermal expansion, thereby generating a photoacoustic signal that propagates outward and is detected by the probe 110.
In one embodiment of the present application, the laser 120 may receive a control signal sent by the processor 105; the control signal may specify the frequency, timing, and the like of the laser light to be generated. The laser 120 generates the laser light according to the control signal, and the laser light is coupled to the probe 110 through the fiber bundle and emitted to the target tissue.
In one embodiment of the present application, after the laser 120 generates the laser light, the laser may transmit a feedback signal to the processor 105, where the feedback signal may include an actual transmission time of the laser light, and the processor 105 determines an interval duration for receiving the photoacoustic signal according to a preset algorithm and controls the probe 110 to receive the photoacoustic signal.
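Because the laser pulse reaches the tissue almost instantaneously, a photoacoustic source at depth d arrives at the probe after a one-way travel time of d / c. As a hedged illustration of how such a receive interval could be derived (the function name, the depth range, and the 1540 m/s soft-tissue sound speed are assumptions for this sketch, not details taken from the patent):

```python
SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue value (assumed)

def pa_receive_window(min_depth_m, max_depth_m, c=SPEED_OF_SOUND):
    """Return (start, end) receive times in seconds after the laser pulse.

    Photoacoustic propagation is one-way: the laser reaches the tissue
    almost instantly, so a source at depth d arrives at the probe
    d / c seconds later.
    """
    return min_depth_m / c, max_depth_m / c

# Receive window for imaging depths between 5 mm and 40 mm.
start, end = pa_receive_window(0.005, 0.04)
```

For a 40 mm maximum depth this gives a window closing roughly 26 microseconds after the pulse, which is why the actual emission time fed back by the laser matters for gating the receive interval.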
It should be noted that the laser light and the ultrasonic waves are not transmitted simultaneously: the laser light may be transmitted first, or the ultrasonic waves may be transmitted first; that is, step 301 or step 302 may be executed first. The specific order may be adjusted according to the actual application scenario and is not limited here.
In one embodiment of the present application, a fiber bundle is coupled to the ultrasound array probe and conducts the laser light to both sides of the probe 110, so that the target tissue is illuminated from both sides in a back-illumination arrangement. The probe 110 includes an ultrasonic transducer capable of both transmitting and receiving signals. On the basis of supporting conventional ultrasound imaging and Doppler blood flow imaging, the transducer has a large frequency bandwidth and high sensitivity, which improves the detection capability for photoacoustic signals so that even weak signals can be detected.
303. And acquiring a gray scale image and a color Doppler image of the target tissue according to the ultrasonic echo signal.
And after receiving the ultrasonic echo signal returned from the target tissue, acquiring a gray-scale image and a color Doppler image of the target tissue according to the ultrasonic echo signal.
Specifically, after the ultrasonic echo signal is received, the noise in the signal can be removed. The ultrasonic echo signal is processed by the beamforming circuit and then transmitted to the processor 105, which processes it to obtain an ultrasound image of the target tissue. Generally, this ultrasound image is a grayscale image, which represents structural information of the target tissue.
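The final step of turning a beamformed echo signal into grayscale brightness is typically logarithmic compression of the echo envelope. A minimal sketch of that mapping (the function name, the 60 dB dynamic range, and the 8-bit output are illustrative assumptions, not specifics from the patent):

```python
import numpy as np

def log_compress(envelope, dynamic_range_db=60.0):
    """Map echo envelope amplitudes to 8-bit grayscale pixel values."""
    env = np.asarray(envelope, dtype=float)
    env = env / env.max()                        # normalise to [0, 1]
    db = 20.0 * np.log10(np.maximum(env, 1e-12)) # amplitude in decibels
    db = np.clip(db, -dynamic_range_db, 0.0)     # keep the top 60 dB
    # Map [-60 dB, 0 dB] linearly onto [0, 255].
    return np.uint8(np.round((db / dynamic_range_db + 1.0) * 255))
```

The strongest echo maps to 255; anything 60 dB or more below it maps to 0, discarding noise-floor detail that would otherwise dominate the display.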
In one embodiment of the present application, the gray-scale image and the color Doppler image are obtained and then displayed in real time on the display 106.
It should be noted that the embodiment of the present application does not limit the order in which the grayscale image and the color Doppler image are obtained: either may be obtained first. The order may be adjusted according to the actual application scene and is not limited here.
304. And acquiring a photoacoustic image of the target tissue according to the photoacoustic signal.
After the photoacoustic signal is acquired, noise in the photoacoustic signal may also be removed, and image reconstruction processing, such as beamforming, may then be performed to obtain a photoacoustic image of the target tissue. Generally, the ultrasound image is a grayscale image representing structural information of the target tissue, while the photoacoustic image represents functional information of the tissue within the target tissue.
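A common choice for the reconstruction step on photoacoustic channel data is delay-and-sum beamforming: for each image pixel, the one-way time of flight to every transducer element is computed and the corresponding recorded samples are summed. The following is a hedged sketch, assuming a linear array and a uniform sound speed (the function name, the 1540 m/s sound speed, and the 40 MHz sampling rate are assumptions; the patent does not specify the reconstruction algorithm):

```python
import numpy as np

def delay_and_sum(rf, element_x, pixel_x, pixel_z, c=1540.0, fs=40e6):
    """Naive delay-and-sum reconstruction of a photoacoustic image.

    rf:        (n_elements, n_samples) channel data after the laser pulse
    element_x: lateral positions of the transducer elements (m)
    pixel_x, pixel_z: 1-D grids of image pixel coordinates (m)
    """
    n_elem, n_samp = rf.shape
    image = np.zeros((len(pixel_z), len(pixel_x)))
    for iz, z in enumerate(pixel_z):
        for ix, x in enumerate(pixel_x):
            # One-way distance from this pixel to every element.
            dist = np.sqrt((element_x - x) ** 2 + z ** 2)
            idx = np.round(dist / c * fs).astype(int)
            valid = idx < n_samp
            image[iz, ix] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
    return image
```

Note the one-way delay (dist / c) rather than the two-way delay used in pulse-echo ultrasound, since the acoustic wave originates inside the tissue.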
It should be noted that the embodiment of the present application does not limit the order in which the photoacoustic image and the ultrasound images are obtained: the photoacoustic image may be obtained first, or the ultrasound images, that is, the grayscale image and the color Doppler image, may be obtained first. The order may be adjusted according to the actual application scenario and is not limited here.
305. And fusing the photoacoustic image and the color Doppler image with the gray scale image to obtain a fused image of the target tissue.
And after the photoacoustic image, the color Doppler image and the gray scale image are obtained, fusing the photoacoustic image and the color Doppler image into the gray scale image on the basis of the gray scale image to obtain a fused image of the target tissue.
Therefore, in the embodiment of the present application, the photoacoustic image and the color Doppler image may be fused into the grayscale image, resulting in a fused image of the target tissue. The fused image can display the contents of the grayscale image, the photoacoustic image, and the color Doppler image in a single image. Compared with simultaneously displaying multiple frames of images, namely a grayscale image, a photoacoustic image, a color Doppler image, and the like, on the same display, the embodiment of the present application conveys the information contained in all of these images through one frame of image. At the same display resolution, the fused image can therefore reflect the functional and structural information of the target tissue more clearly, display photoacoustic-ultrasound bimodal imaging more clearly, and improve the accuracy with which an operator observes the tissue in the fused image.
In one embodiment of the application, before the photoacoustic image and the color Doppler image are fused with the grayscale image, the numbers of pixels of the photoacoustic image, the color Doppler image, and the grayscale image are determined, and at least one of them is adjusted to a preset number of pixels by interpolation. The adjustment may be applied to all or part of the photoacoustic image, all or part of the color Doppler image, and all or part of the grayscale image, which is not specifically limited here. Alternatively, when the grayscale image and the color Doppler image of the target tissue are acquired according to the ultrasonic echo signal, and when the photoacoustic image of the target tissue is acquired according to the photoacoustic signal, at least one of the grayscale image, the color Doppler image, and the photoacoustic image may be generated directly with the preset number of pixels; for example, the color Doppler image and the grayscale image are generated from the ultrasonic echo signal with the preset number of pixels, and the photoacoustic image is generated from the photoacoustic signal with the preset number of pixels. For example, suppose the grayscale image has W_B × H_B pixels, the color Doppler image has W_C × H_C pixels, and the photoacoustic image has W_PA × H_PA pixels. If the number of pixels of the fused image is determined to be W_F × H_F, the numbers of pixels of the photoacoustic image, the color Doppler image, and the grayscale image can all be adjusted to W_F × H_F. Of course, the preset number of pixels may be applied to all or part of each image, which is not specifically limited here. Specifically, the numbers of pixels of the photoacoustic image, the color Doppler image, and the grayscale image may be adjusted by interpolation: new pixel values are calculated according to a chosen operation rule and inserted into the gaps between adjacent pixels, thereby increasing the number of pixels.
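The interpolation step described above, computing new pixel values and inserting them between adjacent pixels, can be sketched with bilinear interpolation. This is an illustrative assumption: the patent does not specify the interpolation kernel, and the function name is invented for this sketch.

```python
import numpy as np

def resize_bilinear(img, out_h, out_w):
    """Resize a 2-D image to (out_h, out_w) by bilinear interpolation,
    generating new pixel values between existing neighbours."""
    in_h, in_w = img.shape
    # Map each output pixel back to fractional input coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    img = img.astype(float)
    # Blend the four surrounding input pixels for every output pixel.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

Applying it to each of the three images with out_h = H_F and out_w = W_F brings them to a common grid before the per-pixel fusion.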
Specifically, fusing the photoacoustic image and the color Doppler image with the grayscale image may consist of superimposing the pixel value of each pixel point in the photoacoustic image and the color Doppler image onto the corresponding pixel point of the grayscale image according to a preset rule, to obtain the fused image.
Further, in an embodiment of the present application, let the first target pixel point be any pixel point in the photoacoustic image, the second target pixel point be the corresponding pixel point in the color Doppler image, the third target pixel point be the corresponding pixel point in the grayscale image, and the fourth target pixel point be the corresponding pixel point in the fused image. A first threshold and a second threshold may be set as needed. When the pixel value of the first target pixel point is smaller than the first threshold and the pixel value of the second target pixel point is smaller than the second threshold, the pixel value of the corresponding third target pixel point in the grayscale image is taken as the pixel value of the fourth target pixel point. When the pixel value of the first target pixel point is smaller than the first threshold and the pixel value of the second target pixel point is not smaller than the second threshold, the pixel value of the second target pixel point is taken as the pixel value of the fourth target pixel point. When the pixel value of the first target pixel point is not smaller than the first threshold and the pixel value of the second target pixel point is smaller than the second threshold, the pixel value of the first target pixel point is taken as the pixel value of the fourth target pixel point.
When the pixel value of the first target pixel point is not less than the first threshold value and the pixel value of the second target pixel point is not less than the second threshold value, the pixel value of one of the first target pixel point or the second target pixel point can be selected by default by the system as the pixel value of the fourth target pixel point, or the pixel value of one of the first target pixel point or the second target pixel point can be selected by the operator as the pixel value of the fourth target pixel point.
Generally, the photoacoustic image reflects functional information of the target tissue; for example, it can display the position and form of a blood vessel, and the pixel value of each pixel point reflects the strength of that functional information. If the pixel value of a pixel point is lower than the first threshold, the pixel point does not reflect functional information. The color Doppler image reflects the direction and velocity of blood flow in the target tissue, and the pixel value of each pixel point reflects the blood flow velocity. If the pixel value of a pixel point is lower than the second threshold, the blood flow velocity represented by that pixel point is too low, which can be understood as there being no detectable blood flow at that pixel point.
Therefore, when the pixel value of the first target pixel point is smaller than the first threshold and the pixel value of the second target pixel point is smaller than the second threshold, the intensity at the first target pixel point in the photoacoustic image is too low, indicating no blood vessel there, and the second target pixel point indicates no blood flow. Since the pixel point shows neither vascular function nor blood flow, the pixel value of the corresponding third target pixel point in the grayscale image can be used as the pixel value of the corresponding fourth target pixel point of the fused image. That is, when a pixel point contains neither blood vessel structure nor blood flow, the corresponding pixel value in the fused image is taken from the grayscale image.
When the pixel value of the first target pixel point is smaller than the first threshold and the pixel value of the second target pixel point is not smaller than the second threshold, the intensity at the first target pixel point in the photoacoustic image is too low, indicating no blood vessel there, while the pixel value of the second target pixel point reflects the blood flow velocity; therefore, the pixel value of the second target pixel point can be used as the pixel value of the fourth target pixel point.
When the pixel value of the first target pixel point is not smaller than the first threshold and the pixel value of the second target pixel point is smaller than the second threshold, the intensity reflected by the first target pixel point is high, indicating a blood vessel there, while the second target pixel point reflects no blood flow velocity; therefore, the pixel value of the first target pixel point can be used as the pixel value of the fourth target pixel point.
When the pixel value of the first target pixel point is not smaller than the first threshold and the pixel value of the second target pixel point is not smaller than the second threshold, the first target pixel point indicates a blood vessel and the second target pixel point reflects the blood flow velocity; therefore, the pixel value of either the first or the second target pixel point can be used as the pixel value of the fourth target pixel point, or the operator can select which of the two to use. Specifically, the processor may receive selection information input by the operator, the selection information indicating one of the first target pixel point or the second target pixel point.
It should be noted that the pixel value may also be a gray value, a brightness value, or any other quantity that reflects the amplitude at the target pixel point.
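The four threshold cases described above can be sketched per pixel as follows. This is a minimal illustration only: the function and parameter names, and the default preference used in the fourth case, are assumptions for exposition, not the actual implementation of the system.

```python
import numpy as np

def fuse_pixelwise(pa, cdfi, gray, t1, t2, prefer="photoacoustic"):
    """Per-pixel fusion of a photoacoustic image (pa), a color Doppler
    image (cdfi), and a gray-scale image of equal shape, following the
    four threshold cases described in the text."""
    fused = np.empty_like(gray)

    below_pa = pa < t1    # no vessel signal in the photoacoustic image
    below_cd = cdfi < t2  # no detectable flow in the color Doppler image

    # Case 1: neither vessel nor flow -> fall back to the gray-scale image.
    fused[below_pa & below_cd] = gray[below_pa & below_cd]
    # Case 2: no vessel signal but flow present -> take the Doppler value.
    fused[below_pa & ~below_cd] = cdfi[below_pa & ~below_cd]
    # Case 3: vessel present but no flow -> take the photoacoustic value.
    fused[~below_pa & below_cd] = pa[~below_pa & below_cd]
    # Case 4: both present -> a default (or operator-selected) source wins.
    both = ~below_pa & ~below_cd
    fused[both] = pa[both] if prefer == "photoacoustic" else cdfi[both]
    return fused
```

The `prefer` argument stands in for the operator's selection information: switching it to the Doppler image changes only the pixels where both signals exceed their thresholds.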
In addition, in an embodiment of the present application, the operator may select a fusion mode for fusing the photoacoustic image and the color Doppler image into the gray-scale image. The processor receives control information for the photoacoustic image, the color Doppler image, and the gray-scale image, where the control information includes fusion weights, i.e., the weight assigned to each image during fusion. Based on these weights, the proportion contributed by each pixel value of the photoacoustic image and each pixel value of the color Doppler image to the fused image can be determined, and the fused image obtained. For example, if the operator sets the weight ratio of the photoacoustic pixel values to the color Doppler pixel values to 1:1, each pixel value of the fused image is the average of the corresponding pixel values in the photoacoustic image and the color Doppler image.
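The weighted fusion mode above reduces to a normalized linear blend; a minimal sketch, with function and parameter names assumed for illustration:

```python
import numpy as np

def fuse_weighted(pa, cdfi, w_pa=0.5, w_cd=0.5):
    """Weighted blend of photoacoustic and color Doppler pixel values.
    A 1:1 weight ratio reduces to a simple per-pixel average."""
    total = w_pa + w_cd
    return (w_pa * pa + w_cd * cdfi) / total
```

Normalizing by the weight sum means the operator can enter any ratio (1:1, 1:3, ...) without rescaling the result out of the image's value range.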
In an embodiment of the present application, the color map of the photoacoustic image differs from that of the color Doppler image, so the position and form of the blood vessel and the blood flow direction and velocity are rendered in different colors in the fused image. Generally, the photoacoustic image and the color Doppler image may adopt color maps with large visual differences, so that the information contained in each can be clearly distinguished; by displaying different colors, the embodiment of the application allows the operator to observe the target tissue more accurately. In some possible implementations, the color map of the gray-scale image also differs from those of the photoacoustic image and the color Doppler image, further separating the features of each image. The color map of the gray-scale image may be obtained by pseudo-coloring; of course, it may also be obtained by other processing, or the gray-scale image may remain the original black-and-white image without pseudo-color processing, which is not specifically limited herein.
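As an illustration of rendering the modalities in visually distinct colors, the sketch below maps a single-channel image onto a linear ramp between two RGB colors; the ramp approach and all names are assumptions for exposition, not the system's actual color maps.

```python
import numpy as np

def colorize(image, rgb_lo, rgb_hi):
    """Map a single-channel image onto a linear ramp between two RGB
    colors (components in [0, 1]). Using clearly different ramps for the
    photoacoustic and Doppler images keeps their information separable
    at a glance in the fused display."""
    # Normalize to [0, 1]; guard against a constant image.
    norm = (image - image.min()) / max(np.ptp(image), 1e-12)
    lo, hi = np.asarray(rgb_lo, float), np.asarray(rgb_hi, float)
    # Broadcast the scalar intensity over the three color channels.
    return lo + norm[..., None] * (hi - lo)
```

For example, a black-to-red ramp for the photoacoustic image and a black-to-cyan ramp for the Doppler image would give the two signals non-overlapping hues.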
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical division, and other divisions are possible in practice; multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or of another form.
Units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In practical applications, the target tissue may be the face, spine, heart, uterus, or pelvic floor, or another part of the human body, such as the brain, a bone, the liver, or a kidney, which is not limited herein.
The above description covers only specific embodiments of the present application, but the scope of the present application is not limited thereto; any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed herein shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

  1. An imaging method, comprising:
    transmitting ultrasonic waves to a target tissue, and receiving ultrasonic echoes returned from the target tissue to obtain ultrasonic echo signals;
    emitting laser light to the target tissue and receiving a photoacoustic signal returned from the target tissue;
    acquiring a gray scale image and a color Doppler image of the target tissue according to the ultrasonic echo signal;
    acquiring a photoacoustic image of the target tissue according to the photoacoustic signal;
    and fusing the photoacoustic image and the color Doppler image with the gray scale image to obtain a fused image of the target tissue.
  2. The method of claim 1, wherein prior to said fusing the photoacoustic image and the color doppler image with the grayscale image to obtain a fused image of the target tissue, the method further comprises:
    determining the pixel number of the photoacoustic image, the pixel number of the color Doppler image and the pixel number of the gray-scale image; adjusting at least one of the pixel number of the photoacoustic image, the pixel number of the color Doppler image and the pixel number of the gray-scale image to a preset pixel number in an interpolation mode;
    or when the gray scale image and the color Doppler image of the target tissue are acquired according to the ultrasonic echo signal, and when the photoacoustic image of the target tissue is acquired according to the photoacoustic signal, at least one of the gray scale image and the color Doppler image with the preset number of pixels and the photoacoustic image with the preset number of pixels is generated.
  3. The method of claim 1 or 2, wherein said fusing the photoacoustic image and the color doppler image with the grayscale image to obtain a fused image of the target tissue comprises:
    and superposing the pixel value of each pixel point in the photoacoustic image and the color Doppler image to the corresponding pixel point of the gray-scale image according to a preset mode to obtain the fused image.
  4. The method according to claim 3, wherein the superimposing the pixel value of each pixel point in the photoacoustic image and the color doppler image to a corresponding pixel point in the grayscale image according to a preset manner to obtain the fused image comprises:
    when the pixel value of a first target pixel point in the photoacoustic image is smaller than a first threshold value and the pixel value of a second target pixel point corresponding to the first target pixel point in the color Doppler image is smaller than a second threshold value, taking the pixel value of a corresponding third target pixel point in the gray-scale image as the pixel value of a corresponding fourth target pixel point in the fused image, wherein the first target pixel point is any one pixel point in the photoacoustic image;
    or, alternatively,
    when the pixel value of a first target pixel point is smaller than the first threshold value and the pixel value of a second target pixel point is not smaller than the second threshold value, taking the pixel value of the second target pixel point as the pixel value of a fourth target pixel point, wherein the first target pixel point is any one pixel point in the photoacoustic image;
    or, alternatively,
    when the pixel value of a first target pixel point is not smaller than the first threshold value and the pixel value of a second target pixel point is smaller than the second threshold value, taking the pixel value of the first target pixel point as the pixel value of a fourth target pixel point, wherein the first target pixel point is any one pixel point in the photoacoustic image;
    or, alternatively,
    when the pixel value of a first target pixel is not smaller than the first threshold and the pixel value of a second target pixel is not smaller than the second threshold, taking the pixel value of the second target pixel as the pixel value of a fourth target pixel or taking the pixel value of the first target pixel as the pixel value of the fourth target pixel, or calculating the pixel value of the first target pixel and the pixel value of the second target pixel according to a system preset weight to obtain a target pixel value, and taking the target pixel value as the pixel value of the fourth target pixel, wherein the first target pixel is any pixel in the photoacoustic image.
  5. The method according to claim 3, wherein the superimposing the pixel value of each pixel point in the photoacoustic image and the color doppler image to a corresponding pixel point in the grayscale image according to a preset manner to obtain the fused image comprises:
    receiving control information of the photoacoustic image, the color Doppler image and the gray scale image, wherein the control information comprises fusion weight;
    and calculating the pixel value of each pixel point in the fused image according to the fusion weight, the pixel value of each pixel point in the photoacoustic image and the pixel value of each pixel point in the color Doppler image to obtain the fused image.
  6. The method of any of claims 1-5, wherein the color map of the photoacoustic image is different than the color map of the color Doppler image.
  7. The method according to any one of claims 1 to 6, wherein the color map of the grayscale image is different from the color map of the photoacoustic image and the color map of the color Doppler image, wherein the color map of the grayscale image is a color map obtained by pseudo-color.
  8. An imaging system, comprising: the device comprises a laser, a probe, a transmitting circuit, a receiving circuit and a processor;
    the laser is used for generating laser for irradiating target tissues, the laser is coupled to the probe through a fiber bundle and emits the laser to the target tissues through the probe;
    the receiving circuit is used for controlling the probe to receive the photoacoustic signal returned from the target tissue;
    the transmitting circuit is used for controlling the probe to transmit ultrasonic waves to the target tissue;
    the receiving circuit is further used for controlling the probe to receive the ultrasonic echo returned from the target tissue to obtain an ultrasonic echo signal;
    the processor is used for generating a control signal and sending the control signal to the laser so as to control the laser to generate the laser;
    the processor is also used for acquiring a gray scale image and a color Doppler image of the target tissue according to the ultrasonic echo signal; acquiring a photoacoustic image of the target tissue according to the photoacoustic signal; and fusing the photoacoustic image and the color Doppler image with the gray scale image to obtain a fused image of the target tissue.
  9. The imaging system of claim 8, wherein the processor is further configured to determine a number of pixels of the photoacoustic image, a number of pixels of the color doppler image, and a number of pixels of the grayscale image; adjusting at least one of the pixel number of the photoacoustic image, the pixel number of the color Doppler image and the pixel number of the gray-scale image to a preset pixel number in an interpolation mode;
    or the processor is further configured to generate at least one of a gray scale image and a color doppler image with a preset number of pixels and a photoacoustic image with a preset number of pixels when the gray scale image and the color doppler image of the target tissue are acquired according to the ultrasonic echo signal and the photoacoustic image of the target tissue is acquired according to the photoacoustic signal.
  10. The imaging system of claim 8 or 9,
    the processor is specifically configured to superimpose the pixel value of each pixel point in the photoacoustic image and the color doppler image onto a corresponding pixel point in the grayscale image according to a preset mode, so as to obtain the fused image.
  11. The imaging system of claim 10, wherein the processor is specifically configured to:
    when the pixel value of a first target pixel point in the photoacoustic image is smaller than a first threshold value and the pixel value of a second target pixel point corresponding to the first target pixel point in the color Doppler image is smaller than a second threshold value, taking the pixel value of a corresponding third target pixel point in the gray-scale image as the pixel value of a corresponding fourth target pixel point in the fusion image, wherein the first target pixel point is any one pixel point in the photoacoustic image;
    or, alternatively,
    when the pixel value of a first target pixel point is smaller than the first threshold value and the pixel value of a second target pixel point is not smaller than the second threshold value, taking the pixel value of the second target pixel point as the pixel value of a fourth target pixel point, wherein the first target pixel point is any one pixel point in the photoacoustic image;
    or, alternatively,
    when the pixel value of a first target pixel point is not smaller than the first threshold value and the pixel value of a second target pixel point is smaller than the second threshold value, taking the pixel value of the first target pixel point as the pixel value of a fourth target pixel point, wherein the first target pixel point is any one pixel point in the photoacoustic image;
    or, alternatively,
    when the pixel value of a first target pixel is not smaller than the first threshold and the pixel value of a second target pixel is not smaller than the second threshold, taking the pixel value of the second target pixel as the pixel value of a fourth target pixel or taking the pixel value of the first target pixel as the pixel value of the fourth target pixel, or calculating the pixel value of the first target pixel and the pixel value of the second target pixel according to a system preset weight to obtain a target pixel value, and taking the target pixel value as the pixel value of the fourth target pixel, wherein the first target pixel is any pixel in the photoacoustic image.
  12. The imaging system of claim 10, wherein the processor is specifically configured to:
    receiving control information of the photoacoustic image, the color Doppler image and the gray scale image, wherein the control information comprises fusion weight;
    and calculating the pixel value of each pixel point in the fused image according to the fusion weight, the pixel value of each pixel point in the photoacoustic image and the pixel value of each pixel point in the color Doppler image to obtain the fused image.
  13. The imaging system of any of claims 8-12, wherein the color map of the photoacoustic image is different from the color map of the color Doppler image.
  14. The imaging system of any of claims 8-13, wherein the color map of the grayscale image is different from the color map of the photoacoustic image and the color map of the color Doppler image, wherein the color map of the grayscale image is a color map obtained by pseudo-color.
CN201880055953.4A 2018-10-24 2018-10-24 Imaging method and imaging system Pending CN111432730A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/111692 WO2020082270A1 (en) 2018-10-24 2018-10-24 Imaging method and imaging system

Publications (1)

Publication Number Publication Date
CN111432730A true CN111432730A (en) 2020-07-17

Family

ID=70330838

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880055953.4A Pending CN111432730A (en) 2018-10-24 2018-10-24 Imaging method and imaging system

Country Status (2)

Country Link
CN (1) CN111432730A (en)
WO (1) WO2020082270A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013005957A (en) * 2011-06-27 2013-01-10 Fujifilm Corp Method and device for displaying doppler images
WO2017135167A1 (en) * 2016-02-05 2017-08-10 富士フイルム株式会社 System, device, and method for photoacoustic image generation
WO2017145988A1 (en) * 2016-02-22 2017-08-31 富士フイルム株式会社 Display device and display method for acoustic wave images
WO2018008439A1 (en) * 2016-07-08 2018-01-11 Canon Kabushiki Kaisha Apparatus, method and program for displaying ultrasound image and photoacoustic image

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9561017B2 (en) * 2006-12-19 2017-02-07 Koninklijke Philips N.V. Combined photoacoustic and ultrasound imaging system
CN102240213A (en) * 2010-05-12 2011-11-16 国立清华大学 Calcification imaging method and system
JP6103931B2 (en) * 2012-12-28 2017-03-29 キヤノン株式会社 Subject information acquisition apparatus and subject information acquisition method
KR20150084559A (en) * 2014-01-14 2015-07-22 삼성메디슨 주식회사 Photoacoustic apparatus and operating method for the same
CN107223035B (en) * 2017-01-23 2021-01-15 深圳迈瑞生物医疗电子股份有限公司 Imaging system, method and ultrasonic imaging system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112741651A (en) * 2020-12-25 2021-05-04 上海交通大学烟台信息技术研究院 Method and system for processing ultrasonic image of endoscope
CN112741651B (en) * 2020-12-25 2022-11-25 上海交通大学烟台信息技术研究院 Method and system for processing ultrasonic image of endoscope
CN113367660A (en) * 2021-06-09 2021-09-10 东北大学秦皇岛分校 Photoacoustic Doppler flow velocity measuring device and method
CN113367660B (en) * 2021-06-09 2022-11-25 东北大学秦皇岛分校 Photoacoustic Doppler flow velocity measuring device and method

Also Published As

Publication number Publication date
WO2020082270A1 (en) 2020-04-30

Similar Documents

Publication Publication Date Title
US11635514B2 (en) Imaging methods and apparatuses for performing shear wave elastography imaging
RU2480147C2 (en) Combined system of photoacoustic and ultrasonic image formation
EP1614387B1 (en) Ultrasonic diagnostic apparatus, image processing apparatus and image processing method
KR100825054B1 (en) Method and ultrasound diagnostic system for imaging color flow images
KR20070069322A (en) Ultrasound diagnostic system and method for detecting lesion
CN108882914B (en) Ultrasonic contrast imaging method and ultrasonic imaging system
KR20120044267A (en) Ultrasound diagnostic apparatus and method for tracing movement of tissue
CN113081054B (en) Ultrasonic imaging method and ultrasonic imaging system
US9448100B2 (en) Signal processing apparatus
JP2004202229A (en) Method and apparatus for contrast agent time intensity curve analysis
KR20160125934A (en) Ultrasonic image display apparatus and control program thereof
CN110604598A (en) Ultrasonic imaging method and ultrasonic imaging system
CN111432730A (en) Imaging method and imaging system
US20120203111A1 (en) Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image acquisition method
CN111727013B (en) Imaging method and imaging system
CN114727760A (en) Photoacoustic imaging method and photoacoustic imaging system
JP2023506861A (en) Systems and methods for assessing placenta
CN110613477B (en) Ultrasonic imaging method and ultrasonic apparatus
US20210330226A1 (en) Imaging method and imaging system
CN108888236A (en) A kind of multi-mode imaging system and method
CN112638270B (en) Ultrasonic imaging method and ultrasonic imaging system in multiplexing mode
JP4909132B2 (en) Optical tomography equipment
EP3854311B1 (en) Ultrasound imaging apparatus and control method thereof
CN112654294B (en) Blood vessel position display method and ultrasonic imaging system
KR20080044393A (en) Ultrasound system for forming ultrasound image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination