CN117297521A - Image acquisition method, image acquisition device, computer device, storage medium, and endoscope - Google Patents


Info

Publication number
CN117297521A
Authority
CN
China
Prior art keywords
target
image
determining
level signal
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311043089.5A
Other languages
Chinese (zh)
Inventor
陈凯凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Huanuokang Technology Co ltd
Original Assignee
Zhejiang Huanuokang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Huanuokang Technology Co ltd filed Critical Zhejiang Huanuokang Technology Co ltd
Priority to CN202311043089.5A priority Critical patent/CN117297521A/en
Publication of CN117297521A publication Critical patent/CN117297521A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002: Operational features of endoscopes
    • A61B1/00004: Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00006: Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/00163: Optical arrangements
    • A61B1/00186: Optical arrangements with imaging filters
    • A61B1/00194: Optical arrangements adapted for three-dimensional imaging
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/043: Instruments combined with photographic or television appliances for fluorescence imaging
    • A61B1/046: Instruments combined with photographic or television appliances for infrared imaging
    • A61B1/05: Instruments combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B1/06: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, with illuminating arrangements
    • A61B1/0638: Illuminating arrangements providing two or more wavelengths
    • A61B1/0655: Control therefor
    • A61B1/0661: Endoscope light sources
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/71: Circuitry for evaluating the brightness variation
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Endoscopes (AREA)

Abstract

The present application relates to an image acquisition method, an image acquisition apparatus, a computer device, a storage medium, and an endoscope. The method comprises the following steps: acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device; determining a target exposure parameter according to the illumination type; and controlling the image sensor to acquire a target image of the target object based on the target exposure parameters. The method can enable the image sensor to acquire the image information of the target object based on the target exposure parameters corresponding to the illumination type, and improves the definition and reliability of the target image.

Description

Image acquisition method, image acquisition device, computer device, storage medium, and endoscope
Technical Field
The present application relates to the technical field of medical devices, and in particular, to an image acquisition method, an image acquisition device, a computer device, a storage medium, and an endoscope.
Background
Fluorescence endoscopes are a novel class of diagnostic devices that perform intra-cavity diagnosis through various endoscopes by means of laser-excited intrinsic fluorescence spectroscopy. A fluorescence endoscope can automatically identify and diagnose tissue according to the intrinsic fluorescence spectral characteristics of human tissue, can immediately indicate whether the examined tissue is normal, can distinguish benign from malignant lesions, and improves the detection rate of early cancers and abnormal hyperplasia. When human tissue is diagnosed with a fluorescence endoscope, a fluorescence imaging system is usually used to collect a target image of the tissue, and the diagnosis result is determined from that target image. When a 3D fluorescence electronic endoscope is used to collect the target image, the internal space of the 3D endoscope shaft is limited and fluorescence imaging can only be realized with a single image sensor, so a frame-alternation scheme is needed: the 3D fluorescence electronic endoscope system alternately collects a white-light image of the tissue under white-light illumination and a fluorescence image of the tissue under fluorescence, and the white-light image and the fluorescence image are fused into the target image at the processor end.
In the conventional approach, when a 3D fluorescence electronic endoscope alternately acquires white-light images and fluorescence images, the image sensor must use different exposure parameters under different illumination types, and the light source is controlled according to the acquired image data. Because of the delay in the signal link, the time at which the image sensor starts exposing and the time at which the light source is turned on do not coincide, which degrades the resulting images. How to improve the quality of the images acquired by the fluorescence endoscope, and thereby the reliability of the tissue images and the diagnostic efficiency of the fluorescence endoscope, is therefore a problem to be solved.
Disclosure of Invention
In view of the above, it is necessary to provide an image acquisition method, an image acquisition apparatus, a computer device, a storage medium, and an endoscope, which can improve the definition and reliability of a target image and the diagnostic efficiency of a fluorescence endoscope.
In a first aspect, the present application provides an image acquisition method, the method comprising:
acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
determining a target exposure parameter according to the illumination type;
and controlling the image sensor to acquire a target image of the target object based on the target exposure parameters.
In one embodiment, acquiring a level signal sent by a photosensor, determining a target photosensor based on the level signal, and determining a type of illumination to be irradiated on a target object according to the target photosensor includes:
acquiring a level signal sent by a photosensitive device, and determining a device identifier based on the level signal;
determining a target photosensitive device based on the device identification;
a type of illumination impinging on the target object is determined based on the target light sensitive device.
In one embodiment, the image acquisition method further comprises:
determining whether to update the target exposure parameters according to the image gray value of the target image;
if yes, determining updating exposure parameters according to the image gray value of the target image;
and adjusting the exposure parameters of the image sensor based on the updated exposure parameters.
In one embodiment, performing exposure parameter adjustment on the image sensor based on the updated exposure parameter includes:
determining illumination information corresponding to the target illumination type according to the level signal;
determining a parameter update time for the target exposure parameter according to the illumination information and the parameter configuration delay time of the image sensor;
and adjusting the exposure parameters of the image sensor based on the parameter updating time and the updated exposure parameters.
In one embodiment, determining the illumination information corresponding to the target illumination type according to the level signal includes:
determining the on time and the off time of the light source according to the level signal;
and determining illumination information corresponding to the target illumination type according to the opening time and the closing time.
In one embodiment, determining whether to update the target exposure parameter according to the image gray value of the target image includes:
determining whether an image gray value of the target image meets a gray value condition;
if not, determining to update the target exposure parameters.
In a second aspect, the present application also provides an image processing apparatus, the apparatus including:
the illumination type determining module is used for acquiring a level signal sent by the photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
the target exposure parameter determining module is used for determining target exposure parameters according to the illumination type;
and the target image determining module is used for controlling the image sensor to acquire a target image of the target object based on the target exposure parameters.
In a third aspect, the present application also provides an endoscope comprising:
a candidate photosensitive device for generating a level signal when the target object is irradiated;
the data processor is used for determining a target photosensitive device from the candidate photosensitive devices according to the level signals, determining the illumination type of the target photosensitive device on the target object, determining a target exposure parameter according to the illumination type, sending the target exposure parameter to the image sensor, and determining a target image according to image data acquired by the image sensor;
and the image sensor is used for acquiring image data of the target object based on the target exposure parameters.
In a fourth aspect, the present application also provides a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
determining a target exposure parameter according to the illumination type;
and controlling the image sensor to acquire a target image of the target object based on the target exposure parameters.
In a fifth aspect, the present application also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of:
acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
determining a target exposure parameter according to the illumination type;
and controlling the image sensor to acquire a target image of the target object based on the target exposure parameters.
With the image acquisition method, the image acquisition apparatus, the computer device, the storage medium, and the endoscope described above, when a target image of a target object is acquired, the target photosensitive device is determined according to the level signal sent by the photosensitive device, and the illumination type irradiated on the target object is determined according to the target photosensitive device; a target exposure parameter is determined according to the illumination type; and the image sensor is controlled to acquire a target image of the target object based on the target exposure parameter. This solves the problem that, when a 3D fluorescence electronic endoscope collects the target image of a target object, a clear and reliable target image is difficult to obtain because of the influence of the illumination type. When light irradiates the target object, the photosensitive device allows the illumination type to be judged accurately, so the exposure parameters of the image sensor are adjusted according to the illumination type, the image sensor acquires the image information of the target object based on the target exposure parameters corresponding to that illumination type, and the definition and reliability of the target image are improved.
Drawings
FIG. 1 is a flow chart of an image acquisition method in one embodiment;
FIG. 2 is a flow chart of another embodiment of an image acquisition method;
FIG. 3 is a flow chart of another embodiment of an image acquisition method;
FIG. 4 is a schematic diagram of image sensor exposure time in one embodiment;
FIG. 5 is a schematic diagram of an image acquisition device in one embodiment;
FIG. 6 is a block diagram of an image acquisition apparatus in one embodiment;
FIG. 7 is a schematic view of an endoscope in one embodiment;
fig. 8 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, an image acquisition method is provided. This embodiment is described by taking the application of the method to a terminal as an example; it can be understood that the method may also be applied to a server, or to a system comprising a terminal and a server and implemented through interaction between the terminal and the server. In this embodiment, the method includes the following steps:
s110, acquiring a level signal sent by the photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on the target object according to the target photosensitive device.
The photosensitive device refers to an element capable of converting an optical signal into a level signal. It should be noted that the types of light-sensitive devices are numerous, and different types of light-sensitive devices can generate level signals in different types of illumination environments. For example, an infrared light photosensor can produce a level signal when infrared light is perceived, a white light photosensor can produce a level signal when white light is perceived, and a fluorescent photosensor can produce a level signal when fluorescence is perceived. The illumination type refers to a type of illumination light that impinges on the photosensitive device, and for example, the illumination type may be infrared light, white light, fluorescence, or the like. The target photosensitive device refers to a photosensitive device that generates a level signal.
Specifically, when the level signal sent by a photosensitive device is acquired, it is determined that the target object is being irradiated by light emitted by the light source and that the image acquisition device needs to acquire image information of the target object. At the same time, which photosensitive device emitted the level signal is determined based on the level signal; that photosensitive device is taken as the target photosensitive device, and the illumination type of the light irradiating the target object is determined according to the target photosensitive device.
The target object is the object whose image information is to be acquired; the photosensitive device is irradiated at the same time as the light irradiates the target object. For example, in the field of medical endoscopes, the target object may be the human tissue whose image information is acquired by the medical endoscope.
S120, determining a target exposure parameter according to the illumination type.
The exposure parameter is parameter information capable of affecting the exposure amount of the image frame of the target object.
Specifically, in order to ensure the definition of the acquired image frame of the target object, it is necessary to determine the target exposure parameter corresponding to the illumination type according to the illumination type of the light irradiated on the target object.
S130, controlling the image sensor to acquire a target image of the target object based on the target exposure parameters.
The image sensor is the image acquisition device used to acquire image information of the target object. The target image of the target object is an image frame depicting the target object.
Specifically, after the target exposure parameters are determined, the image sensor is controlled to acquire image information of the target object based on the target exposure parameters, and the image information is processed to determine a target image of the target object. The image information may be an image frame of the target object acquired by the image sensor.
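To make the three steps concrete, the following is a minimal sketch (not the application's own implementation) of how an image driver might run S110 to S130 in C. The helper functions read_photosensor_level, sensor_write_exposure, and sensor_trigger_capture, as well as the preset exposure values, are assumptions introduced for illustration only.

```c
#include <stdbool.h>
#include <stdint.h>

/* Illumination types distinguished by the candidate photosensitive devices. */
typedef enum { ILLUM_NONE, ILLUM_WHITE, ILLUM_FLUOR } illum_type_t;

/* Exposure parameters written into the image sensor. */
typedef struct { uint32_t exposure_us; uint16_t analog_gain; } exposure_t;

/* Hypothetical hardware hooks (platform specific, not part of the application). */
extern bool read_photosensor_level(int device_id);       /* level signal of one photosensor */
extern void sensor_write_exposure(const exposure_t *p);  /* configure the image sensor      */
extern void sensor_trigger_capture(void);                /* acquire one image frame         */

/* Illustrative target exposure presets per illumination type. */
static const exposure_t presets[] = {
    [ILLUM_WHITE] = { .exposure_us =  4000, .analog_gain = 1 },
    [ILLUM_FLUOR] = { .exposure_us = 12000, .analog_gain = 4 },
};

/* S110: a high level from a given photosensitive device identifies the illumination type. */
static illum_type_t detect_illumination(void)
{
    if (read_photosensor_level(0)) return ILLUM_WHITE;   /* white-light photosensor  */
    if (read_photosensor_level(1)) return ILLUM_FLUOR;   /* fluorescence photosensor */
    return ILLUM_NONE;
}

/* S120 + S130: select the target exposure parameters and acquire the target image. */
void acquire_target_image(void)
{
    illum_type_t type = detect_illumination();
    if (type == ILLUM_NONE)
        return;                                          /* light source not on yet  */
    sensor_write_exposure(&presets[type]);
    sensor_trigger_capture();
}
```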
In the above image acquisition method, when a target image of a target object is acquired, the target photosensitive device is determined according to the level signal sent by the photosensitive device, and the illumination type irradiated on the target object is determined according to the target photosensitive device; a target exposure parameter is determined according to the illumination type; and the image sensor is controlled to acquire a target image of the target object based on the target exposure parameter. This solves the problem that, when a 3D fluorescence electronic endoscope collects the target image of a target object, a clear and reliable target image is difficult to obtain because of the influence of the illumination type. When light irradiates the target object, the photosensitive device allows the illumination type to be judged accurately, so the exposure parameters of the image sensor are adjusted according to the illumination type, the image sensor acquires the image information of the target object based on the target exposure parameters corresponding to that illumination type, and the definition and reliability of the target image are improved.
In one embodiment, as shown in fig. 2, acquiring a level signal transmitted by a photosensor, determining a target photosensor based on the level signal, and determining a type of illumination to be irradiated on a target object according to the target photosensor, including:
s210, acquiring a level signal sent by the photosensitive device, and determining the device identification based on the level signal.
Wherein device identification refers to information that can characterize the type of photosensitive device.
Specifically, after the level signal sent by the photosensitive device is obtained, a signal source of the level signal is determined based on the level signal, so that the device identification of the photosensitive device sending the level signal is determined according to the signal source.
S220, determining a target photosensitive device based on the device identification.
For example, a correspondence relationship between the candidate identifier and the candidate photosensitive device may be preset, and after determining the device identifier, the target photosensitive device may be determined from the candidate photosensitive devices according to the device identifier and the correspondence relationship between the candidate identifier and the candidate photosensitive device.
S230, determining the illumination type irradiated on the target object based on the target photosensitive device.
Specifically, the illumination type that the target photosensitive device is able to sense, i.e. the type of light for which it generates a level signal, is determined, and this illumination type is taken as the illumination type irradiated on the target object.
For example, a correspondence between candidate photosensitive devices and candidate illumination types may be preset; according to this correspondence, the illumination type corresponding to the target photosensitive device is determined from the candidate types, and that illumination type is the type of light currently irradiating the target object.
In this embodiment, the device identifier of the photosensitive device is determined by the level signal, and the target photosensitive device that emits the level signal is determined by the device identifier, so that the illumination type that irradiates on the target object is determined according to the target photosensitive device, and the determination efficiency of the illumination type can be improved.
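A sketch of the two look-ups described in S210 to S230, i.e. device identifier to target photosensitive device, and target photosensitive device to illumination type. The table contents and identifier values are illustrative assumptions, not values given in the application.

```c
#include <stddef.h>

/* Same illumination-type enumeration as in the earlier sketch. */
typedef enum { ILLUM_NONE, ILLUM_WHITE, ILLUM_FLUOR } illum_type_t;

/* Preset correspondence between candidate device identifiers and candidate illumination types. */
typedef struct { int device_id; illum_type_t type; } candidate_t;

static const candidate_t candidates[] = {
    { 0, ILLUM_WHITE },  /* white-light photosensitive device  */
    { 1, ILLUM_FLUOR },  /* fluorescence photosensitive device */
};

/* S220 + S230: resolve the target photosensitive device from the device identifier
 * carried by the level signal, then return the illumination type it senses. */
illum_type_t illumination_from_device_id(int device_id)
{
    for (size_t i = 0; i < sizeof candidates / sizeof candidates[0]; ++i)
        if (candidates[i].device_id == device_id)
            return candidates[i].type;
    return ILLUM_NONE;  /* unknown identifier: no target photosensitive device found */
}
```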
On the basis of the above embodiment, as shown in fig. 3, the exposure parameter adjustment may be further performed on the image sensor according to the acquired target image, including the following steps:
s310, determining whether to update the target exposure parameters according to the image gray value of the target image.
Wherein the image gray value determines the image sharpness and the image quality of the target image, so that the image sharpness of the target image can be measured by the image gray value. In this scheme, only the influence of the exposure parameters on the image sharpness is considered, so if the image sharpness is low, it is indicated that the target exposure parameters need to be updated.
Specifically, whether the image definition of the target image meets the requirement can be determined according to the image gray value of the target image, and if so, the target exposure parameters do not need to be updated; if not, the target exposure parameters adopted in the current image acquisition environment are unsuitable, and the target exposure parameters need to be updated.
For example, it may be determined whether the image gray value of the target image satisfies a gray value condition; if not, determining to update the target exposure parameters.
The gray value meeting condition may be that the image gray value of the target image is within a gray value range corresponding to the expected gray value, that is, the image gray value is greater than a minimum gray value corresponding to the expected gray value, and the gray value is less than a maximum gray value corresponding to the expected gray value.
Whether the target exposure parameters are updated or not is determined according to the image gray values of the target image, so that the judging efficiency and accuracy of whether the target exposure parameters are updated or not can be improved.
And S320, if yes, determining updating exposure parameters according to the image gray value of the target image.
The image gray value refers to the brightness value of each pixel in the target image. The updated exposure parameter refers to the exposure parameter adopted by the image sensor when acquiring images under this illumination type after the target exposure parameter has been updated.
Specifically, if the target exposure parameter needs to be updated, determining an image gray value of the target image, and determining the updated exposure parameter according to the image gray value. For example, the expected gray value may be determined according to the gray value of the historical image of the target object, if the image gray value is smaller than the minimum gray value in the expected gray values, the exposure parameter is determined to be required to be increased, the parameter increment for the target exposure parameter is determined according to the image gray value, and the updated exposure parameter is determined according to the target exposure parameter and the parameter increment. If the image gray value is larger than the maximum gray value in the expected gray values, determining that the target exposure parameter needs to be lowered, determining the parameter reduction amount of the target exposure parameter according to the image gray value, and determining the updated exposure parameter according to the target exposure parameter and the parameter reduction amount.
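Neither the expected gray range nor the adjustment rule is specified numerically in the text, so the sketch below assumes an 8-bit mean gray value, an illustrative expected range, and a simple proportional adjustment toward the middle of that range; all constants and function names are introduced here for illustration.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Illustrative expected gray range derived from historical images of the target object. */
#define GRAY_MIN  90u
#define GRAY_MAX 160u

/* Mean gray value of one frame of 8-bit image data. */
static uint8_t mean_gray(const uint8_t *pixels, size_t count)
{
    uint64_t sum = 0;
    for (size_t i = 0; i < count; ++i)
        sum += pixels[i];
    return count ? (uint8_t)(sum / count) : 0;
}

/* S310: update only if the mean gray value leaves the expected range (condition not met).
 * S320: scale the current exposure toward the middle of the expected range.
 * Returns true and writes *updated_us when an update is needed. */
bool maybe_update_exposure(const uint8_t *pixels, size_t count,
                           uint32_t current_us, uint32_t *updated_us)
{
    uint8_t gray = mean_gray(pixels, count);
    if (gray >= GRAY_MIN && gray <= GRAY_MAX)
        return false;                      /* gray value condition met: keep the target parameters */
    const uint8_t target = (GRAY_MIN + GRAY_MAX) / 2;
    *updated_us = gray ? (uint32_t)((uint64_t)current_us * target / gray)
                       : current_us * 2u;  /* completely dark frame: simply raise the exposure     */
    return true;
}
```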
S330, adjusting the exposure parameters of the image sensor based on the updated exposure parameters.
Specifically, when the light source irradiates the light corresponding to the illumination type on the target object again, the image sensor is subjected to exposure parameter adjustment based on the updated exposure parameter, so that the image sensor can acquire the image information of the target object based on the updated exposure parameter.
Illustratively, the adjustment of the exposure parameters of the image sensor based on the updated exposure parameters may be achieved by the sub-steps of:
s3301, determining illumination information corresponding to the target illumination type according to the level signal.
The illumination information includes an illumination duration and a switching frequency of light corresponding to the illumination type.
Specifically, the illumination information corresponding to the target illumination type is determined according to the signal emission frequency of the level signal.
Illustratively, the on-time and off-time of the light source may also be determined from the level signal; and determining illumination information corresponding to the target illumination type according to the opening time and the closing time.
Specifically, the turn-on time and turn-off time of the light source are determined according to the generation time and the disappearance time of the level signal, and the turn-on frequency and the turn-off frequency of the light source corresponding to the target illumination type are determined according to the turn-on time and the turn-off time. And determining illumination information corresponding to the target illumination type according to the light source on frequency and the light source off frequency.
The above scheme provides a preferred embodiment of determining illumination information corresponding to a target illumination type according to the level signal, and when determining the illumination information, the accuracy of the obtained illumination information can be improved by considering the influence of the on time and the off time of the light source on the illumination information.
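A sketch of S3301 under the assumption that the image driver timestamps the rising and falling edges of the level signal with a microsecond timer; the edge-capture mechanism and the structure fields are assumptions, not details given in the application.

```c
#include <stdint.h>

/* Illumination information recovered from the level signal of the target photosensitive device. */
typedef struct {
    uint32_t on_duration_us;  /* illumination duration of one light pulse                 */
    uint32_t period_us;       /* interval between consecutive turn-ons (switching period) */
} illum_info_t;

/* t_on:      timestamp of the rising edge  (light source turned on)
 * t_off:     timestamp of the falling edge (light source turned off)
 * t_next_on: timestamp of the next rising edge of the same photosensitive device */
illum_info_t illumination_info(uint32_t t_on, uint32_t t_off, uint32_t t_next_on)
{
    illum_info_t info;
    info.on_duration_us = t_off - t_on;      /* how long this illumination type stays on           */
    info.period_us      = t_next_on - t_on;  /* how often it recurs in the frame-alternation cycle */
    return info;
}
```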
S3302, determining the parameter updating time of the target exposure parameter according to the illumination information and the parameter configuration delay time of the image sensor.
The delay of the system adopted by the image acquisition method provided by the present application comprises: the delay of the photosensitive device sensing and outputting the signal, the delay of configuring the image sensor, the delay of the image data travelling over the cable to the image processor, the delay of the image processor parsing the data and sending the result to the system controller, and the delay of the system controller sending the instruction over the cable to the image driver. The delay of the photosensitive device sensing and outputting the signal is typically on the order of nanoseconds and is negligible. For the delay of the image data travelling over the cable to the image processor, the delay of the image processor parsing the data and sending the result to the system controller, and the delay of the system controller sending the instruction over the cable to the image driver, it only needs to be ensured that their sum is less than 16.6 ms, which is easy to achieve. Therefore, only the delay of configuring the image sensor, that is, the parameter configuration delay time, is considered in the present application. The parameter configuration delay time refers to the delay generated when information is transmitted inside the handle, and it can be measured in advance. The parameter update time refers to the time at which the exposure parameter adjustment is performed on the image sensor.
It should be noted that, as shown in fig. 4, when the white-light image and the fluorescence image of the target object are acquired through the medical endoscope, if the first image frame acquired by the image sensor corresponds to white light and the second image frame corresponds to fluorescence, then, in order to ensure that the white-light frame and the fluorescence frame do not interfere with each other, the white light needs to be turned on at the beginning of period t1 and turned off at the end of period t1, and the fluorescence needs to be turned on at the beginning of period t2 and turned off at the end of period t2. In the field of medical endoscopes, the timing with which the image sensor acquires the image information of the target object is subject to strict requirements, so determining the parameter update time for the target exposure parameter according to the illumination information and the parameter configuration delay time of the image sensor allows accurate control of the image sensor and avoids mutual interference between the image frames it acquires.
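For orientation, the 16.6 ms budget quoted above corresponds to one frame period of a 60 fps frame-alternation scheme; the 60 fps figure and the shorthand symbols below (one per cable or processing delay listed above) are inferred for illustration rather than stated in the text.

```latex
T_{\mathrm{frame}} = \frac{1}{f_{\mathrm{frame}}} = \frac{1}{60\ \mathrm{Hz}} \approx 16.67\ \mathrm{ms},
\qquad
t_{\mathrm{cable}} + t_{\mathrm{parse}} + t_{\mathrm{ctrl}} < T_{\mathrm{frame}}
```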
S3303, adjusting the exposure parameters of the image sensor based on the parameter updating time and the updated exposure parameters.
Specifically, according to the parameter updating time, the exposure parameter of the image sensor is adjusted to be the updated exposure parameter, so that the image sensor acquires the image information of the target object under the illumination condition corresponding to the illumination type based on the updated exposure parameter.
In the above scheme, when the exposure parameters of the image sensor are adjusted, the parameter update time for adjusting the target exposure parameter is determined from the illumination information and the parameter configuration delay time of the image sensor, which ensures the timeliness of the exposure parameter adjustment.
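One way S3302 and S3303 could be scheduled is to issue the configuration write one parameter-configuration delay ahead of the next turn-on of the corresponding illumination type, so the updated exposure is already in effect when that light source comes back on. The timer and scheduler functions below are assumptions, and CONFIG_DELAY_US stands for the pre-measured parameter configuration delay time.

```c
#include <stdint.h>

/* Pre-measured parameter configuration delay time of the image sensor (illustrative value). */
#define CONFIG_DELAY_US 2500u

extern uint32_t now_us(void);                              /* hypothetical monotonic microsecond timer */
extern void schedule_at(uint32_t t_us, void (*fn)(void));  /* hypothetical one-shot scheduler          */
extern void apply_updated_exposure(void);                  /* writes the updated exposure parameters   */

/* S3302: the parameter update time must precede the next pulse of the target illumination type
 * by at least the configuration delay; next turn-on = last rising edge + switching period.
 * S3303: schedule the register write for that time. */
void schedule_exposure_update(uint32_t last_on_us, uint32_t period_us)
{
    uint32_t next_on_us = last_on_us + period_us;        /* predicted next turn-on          */
    uint32_t update_us  = next_on_us - CONFIG_DELAY_US;  /* parameter update time           */
    if ((int32_t)(update_us - now_us()) < 0)             /* already too late for this cycle */
        update_us = now_us();                            /* apply immediately instead       */
    schedule_at(update_us, apply_updated_exposure);
}
```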
It should be understood that, although the steps in the flowcharts related to the embodiments described above are sequentially shown as indicated by arrows, these steps are not necessarily sequentially performed in the order indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of steps or a plurality of stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of the steps or stages is not necessarily performed sequentially, but may be performed alternately or alternately with at least some of the other steps or stages.
The image acquisition method provided by the embodiments of the present application can be applied to the image acquisition device shown in fig. 5. The image acquisition device comprises a handle and a host, and the handle is connected to the host through a cable. The handle comprises a scope shaft, a lens, a dual-passband filter, an image sensor, and an image driver. The host comprises an image processor and a system controller.
The handle is connected to the light source through a light guide bundle. The light source can provide white light and excitation light; the light emitted by the light source irradiates the target object through the light guide bundle in the scope shaft, and when the target object is irradiated by the excitation light, fluorescence is released. The lens receives light returning from the irradiated object, which may be white light, excitation laser light, or fluorescence. The dual-passband filter passes white light and fluorescence and filters out the excitation light. The image sensor performs photoelectric conversion on the filtered optical signal to determine image information, generating white-light image frames from the white-light band and fluorescence image frames from the fluorescence. The image driver is used to configure the exposure parameters of the image sensor and may be an MCU (Microcontroller Unit) or another processing chip. The image processor can determine a target image of the target object from the image information acquired by the image sensor and fuse the target images corresponding to different illumination types into a fused image. The system controller is used to send exposure parameter adjustment instructions to the image driver and to control the output and switching of the light source.
When the target object is imaged, the system controller controls the light source to emit white light and excitation light alternately. The white light or excitation light emitted by the light source irradiates the target object and the photosensitive devices through the light guide bundle in the scope shaft, and a photosensitive device generates a level signal after sensing the illumination. The image driver determines the target photosensitive device according to the level signal, determines the illumination type with which the light source irradiates the target object according to the target photosensitive device, and determines the target exposure parameter according to the illumination type; the image driver then configures the image sensor so that it acquires image information of the target object based on the target exposure parameter and sends the image information to the image processor. The image processor processes the image information to obtain a target image and determines, according to the image gray value of the target image, whether to update the target exposure parameter; if the image gray value does not meet the gray value condition, it determines that the target exposure parameter is to be updated, determines the updated exposure parameter according to the image gray value of the target image, and sends the updated exposure parameter to the image driver. After the image driver obtains the updated exposure parameter, it determines the on time and off time of the light source according to the level signal of the target photosensitive device, determines the illumination information corresponding to the target illumination type according to the on time and off time, determines the parameter update time for the target exposure parameter according to the illumination information and the parameter configuration delay time of the image sensor, and adjusts the exposure parameters of the image sensor based on the parameter update time and the updated exposure parameter.
Based on the same inventive concept, the embodiments of the present application also provide an image acquisition apparatus for implementing the image acquisition method described above. The implementation of the solution provided by the apparatus is similar to that described for the method, so for the specific limitations of one or more embodiments of the image acquisition apparatus provided below, reference may be made to the limitations of the image acquisition method above; they are not repeated here.
In one embodiment, as shown in fig. 6, there is provided an image acquisition apparatus including: an illumination type determination module 601, a target exposure parameter determination module 602, and a target image determination module 603, wherein:
the illumination type determining module 601 is configured to obtain a level signal sent by the photosensor, determine a target photosensor based on the level signal, and determine an illumination type of the target photosensor on the target object;
a target exposure parameter determining module 602, configured to determine a target exposure parameter according to the illumination type;
a target image determining module 603 is configured to control the image sensor to acquire a target image of the target object based on the target exposure parameter.
With the above image acquisition apparatus, when a target image of a target object is acquired, the target photosensitive device is determined according to the level signal sent by the photosensitive device, and the illumination type irradiated on the target object is determined according to the target photosensitive device; a target exposure parameter is determined according to the illumination type; and the image sensor is controlled to acquire a target image of the target object based on the target exposure parameter. This solves the problem that, when a 3D fluorescence electronic endoscope collects the target image of a target object, a clear and reliable target image is difficult to obtain because of the influence of the illumination type. When light irradiates the target object, the photosensitive device allows the illumination type to be judged accurately, so the exposure parameters of the image sensor are adjusted according to the illumination type, the image sensor acquires the image information of the target object based on the target exposure parameters corresponding to that illumination type, and the definition and reliability of the target image are improved.
Illustratively, the illumination type determination module 601 is specifically configured to:
acquiring a level signal sent by a photosensitive device, and determining a device identifier based on the level signal;
determining a target photosensitive device based on the device identification;
the type of illumination impinging on the target object is determined based on the target light sensitive device.
Illustratively, the image acquisition apparatus further includes:
the parameter updating judging module is used for determining whether to update the target exposure parameters according to the image gray value of the target image;
the updating exposure parameter determining module is used for determining updating exposure parameters according to the image gray value of the target image if yes;
and the exposure parameter adjustment module is used for adjusting the exposure parameters of the image sensor based on the updated exposure parameters.
Illustratively, the exposure parameter adjustment module is further configured to:
determining illumination information corresponding to the target illumination type according to the level signal;
determining the parameter updating time of the target exposure parameter according to the illumination information and the parameter configuration delay time of the image sensor;
the image sensor is subjected to exposure parameter adjustment based on the parameter update time and the updated exposure parameter.
Illustratively, the exposure parameter adjustment module is further configured to:
determining the on time and the off time of the light source according to the level signal;
and determining illumination information corresponding to the target illumination type according to the opening time and the closing time.
The parameter updating judging module is specifically configured to:
determining whether an image gray value of the target image meets a gray value condition;
if not, determining to update the target exposure parameters.
The respective modules in the image acquisition apparatus described above may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, as shown in fig. 7, there is provided an endoscope comprising:
a candidate photosensitive device 100 for generating a level signal when the target object is irradiated;
a data processor 200 for determining a target light sensor from the candidate light sensors according to the level signal, determining a type of illumination irradiated on the target object according to the target light sensor, determining a target exposure parameter according to the type of illumination, transmitting the target exposure parameter to the image sensor, and determining a target image according to the image data acquired by the image sensor;
the image sensor 300 is used for acquiring image information of a target object based on target exposure parameters.
The endoscope is provided with candidate photosensitive devices for sensing different types of illumination. Through the data processor, the target photosensitive device can be determined according to the level signals sent by the candidate photosensitive devices, and the illumination type can then be determined according to the target photosensitive device, so as to determine the target exposure parameters with which the image sensor collects the image information of the target object under that illumination type. In this way the image sensor collects the image information of the target object with appropriate exposure parameters, the reliability of the image information is improved, and the definition and reliability of the target image that the data processor determines from the image information are improved. This avoids inaccurate diagnosis results caused by an unreliable target image when a patient is diagnosed through the fluorescence endoscope and improves the reliability of the diagnosis results.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure thereof may be as shown in fig. 8. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement an image acquisition method. The display unit of the computer device is used for forming a visual picture, and can be a display screen, a projection device or a virtual reality imaging device. The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be a key, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 8 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
step one, acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
step two, determining a target exposure parameter according to the illumination type;
and step three, controlling an image sensor to acquire a target image of the target object based on the target exposure parameters.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
step one, acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
step two, determining a target exposure parameter according to the illumination type;
and step three, controlling an image sensor to acquire a target image of the target object based on the target exposure parameters.
In one embodiment, a computer program product is provided comprising a computer program which, when executed by a processor, performs the steps of:
step one, acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the illumination type irradiated on a target object according to the target photosensitive device;
step two, determining a target exposure parameter according to the illumination type;
and step three, controlling an image sensor to acquire a target image of the target object based on the target exposure parameters.
It should be noted that, the user information (including, but not limited to, user equipment information, user personal information, etc.) and the data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data are required to comply with the related laws and regulations and standards of the related countries and regions.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum computing-based data processing logic units, etc., without being limited thereto.
The technical features of the above embodiments may be combined in any manner. For brevity of description, not all possible combinations of these technical features are described; however, any such combination should be considered within the scope of this description as long as it involves no contradiction.
The above examples represent only a few embodiments of the present application; although they are described in considerable detail, they are not to be construed as limiting the scope of the present application. It should be noted that those skilled in the art could make various modifications and improvements without departing from the spirit of the present application, and such modifications and improvements fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (10)

1. A method of image acquisition, the method comprising:
acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the type of illumination irradiating a target object according to the target photosensitive device;
determining target exposure parameters according to the illumination type;
and controlling an image sensor to acquire a target image of the target object based on the target exposure parameters.
2. The image acquisition method according to claim 1, wherein acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the type of illumination irradiating the target object according to the target photosensitive device comprises:
acquiring a level signal sent by a photosensitive device, and determining a device identifier based on the level signal;
determining a target photosensitive device based on the device identifier;
and determining the type of illumination irradiating the target object based on the target photosensitive device.
3. The image acquisition method according to claim 1, characterized in that the method further comprises:
determining whether to update the target exposure parameters according to the image gray value of the target image;
if yes, determining updated exposure parameters according to the image gray value of the target image;
and adjusting the exposure parameters of the image sensor based on the updated exposure parameters.
4. The image acquisition method according to claim 3, wherein adjusting the exposure parameters of the image sensor based on the updated exposure parameters comprises:
determining illumination information corresponding to the target illumination type according to the level signal;
determining a parameter update time for the target exposure parameter according to the illumination information and the parameter configuration delay time of the image sensor;
and adjusting the exposure parameters of the image sensor based on the parameter updating time and the updated exposure parameters.
5. The image acquisition method according to claim 4, wherein determining illumination information corresponding to the target illumination type according to the level signal comprises:
determining the on time and the off time of the light source according to the level signal;
and determining illumination information corresponding to the target illumination type according to the on time and the off time, wherein the illumination information comprises the illumination duration and the switching frequency of the light corresponding to the illumination type.
6. The image acquisition method according to claim 3, wherein determining whether to update the target exposure parameters according to the image gray value of the target image comprises:
determining whether an image gray value of the target image meets a gray value condition;
if not, determining to update the target exposure parameters.
7. An image acquisition apparatus, the apparatus comprising:
the illumination type determining module is used for acquiring a level signal sent by a photosensitive device, determining a target photosensitive device based on the level signal, and determining the type of illumination irradiating a target object according to the target photosensitive device;
the target exposure parameter determining module is used for determining target exposure parameters according to the illumination type;
and the target image determining module is used for controlling the image sensor to acquire a target image of the target object based on the target exposure parameters.
8. An endoscope, the endoscope comprising:
a candidate photosensitive device for generating a level signal when a target object is irradiated;
the data processor is used for determining a target photosensitive device from the candidate photosensitive devices according to the level signals, determining the type of illumination irradiating the target object according to the target photosensitive device, determining target exposure parameters according to the illumination type, sending the target exposure parameters to the image sensor, and determining a target image according to image data acquired by the image sensor;
and the image sensor is used for acquiring image information of the target object based on the target exposure parameters.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 6 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
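By way of illustration only, the following sketch outlines one possible interpretation of the parameter update timing in claims 4 and 5 above: the exposure update is scheduled during the off phase of the light source so that reconfiguration, which takes the sensor's parameter configuration delay, completes before that light turns on again. The timing values, the assumed 50% duty cycle, and the helper names are assumptions made for this sketch only.

```python
def illumination_info(on_time, off_time):
    """Derive illumination information from the light source on/off times
    recovered from the level signal (cf. claim 5)."""
    illumination_duration = off_time - on_time               # seconds the light is on
    switching_frequency = 1.0 / (2 * illumination_duration)  # assumed 50% duty cycle
    return illumination_duration, switching_frequency


def parameter_update_time(off_time, illumination_duration, config_delay):
    """Choose an update time so the new exposure parameters take effect before
    the light corresponding to the target illumination type turns on again
    (cf. claim 4). Assumes the off phase lasts as long as the on phase."""
    next_on_time = off_time + illumination_duration
    # Latest moment at which reconfiguration can start and still finish in time.
    return next_on_time - config_delay


# Example with assumed timings (seconds).
duration, frequency = illumination_info(on_time=0.00, off_time=0.02)
update_at = parameter_update_time(off_time=0.02,
                                  illumination_duration=duration,
                                  config_delay=0.005)
```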
CN202311043089.5A 2023-08-17 2023-08-17 Image acquisition method, image acquisition device, computer device, storage medium, and endoscope Pending CN117297521A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311043089.5A CN117297521A (en) 2023-08-17 2023-08-17 Image acquisition method, image acquisition device, computer device, storage medium, and endoscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311043089.5A CN117297521A (en) 2023-08-17 2023-08-17 Image acquisition method, image acquisition device, computer device, storage medium, and endoscope

Publications (1)

Publication Number Publication Date
CN117297521A true CN117297521A (en) 2023-12-29

Family

ID=89296134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311043089.5A Pending CN117297521A (en) 2023-08-17 2023-08-17 Image acquisition method, image acquisition device, computer device, storage medium, and endoscope

Country Status (1)

Country Link
CN (1) CN117297521A (en)

Similar Documents

Publication Publication Date Title
US11237270B2 (en) Hyperspectral, fluorescence, and laser mapping imaging with fixed pattern noise cancellation
US11900594B2 (en) Methods and systems for displaying a region of interest of a medical image
EP1349378A1 (en) Gain correction of image signal and calibration for gain correction
WO2019157078A1 (en) Systems and methods for analysis and remote interpretation of optical histologic images
JP2005040613A (en) Control system and test for medical technology and/or method of operation of treatment apparatus
CN103857335A (en) Anisotropic processing of laser speckle images
US11221414B2 (en) Laser mapping imaging with fixed pattern noise cancellation
JPWO2020022038A1 (en) Information processing equipment, information processing methods, information processing systems, and programs
JP2005109790A (en) Medical image processing apparatus
CN106175804A (en) Image processing equipment and image processing method
JP2014018251A (en) Ophthalmological photographing apparatus and ophthalmological photographing program
JP2001276032A (en) Photographing instrument, its system, its control method and storage medium
JP6105903B2 (en) Image processing apparatus, image processing method, radiation imaging system, and program
CN117297521A (en) Image acquisition method, image acquisition device, computer device, storage medium, and endoscope
JP2001238868A (en) Method of image processing and its apparatus
US6661874B2 (en) X-ray image diagnostic apparatus
US20180374211A1 (en) Information processing apparatus, and program, method and system thereof
JP6898150B2 (en) Pore detection method and pore detection device
JP2006255093A (en) Medical image system
JP2015012984A (en) Blood vessel visualization device, blood vessel visualization method, and program
US20220395247A1 (en) Radiation imaging system and storage medium
WO2019146228A1 (en) Medical imaging device and medical imaging method
JP2020057112A (en) Learning data generation method, learning data set, evaluation system, program, and learning method
EP4202839A1 (en) Ct data reconstruction method and apparatus, electronic device and computer-readable storage medium
JP2015093013A (en) Radiation image processing device, radiographic apparatus, and control methods, and programs thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination