CN113768452A - Intelligent timing method and device for electronic endoscope - Google Patents
- Publication number
- CN113768452A (application number CN202111088686.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- electronic endoscope
- target part
- pixel
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
Abstract
The invention discloses an intelligent timing method and device for an electronic endoscope. An image captured by the electronic endoscope is acquired; if the image is an image of a target part, the acquisition time of the image is recorded, and the acquisition time at which the electronic endoscope captured images of the target part is then output. Because the invention judges, from each captured image, whether the image was acquired at the target part and records the acquisition time accordingly, the examination time of the electronic endoscope at the target part can be recorded more accurately.
Description
Technical Field
The invention relates to the technical field of image processing application, in particular to an intelligent timing method and device for an electronic endoscope.
Background
Electronic endoscopy is the primary means of examining digestive tract diseases. Research shows that examination time is positively correlated with the lesion detection rate, and the relevant examination guidelines specify required durations for gastrointestinal examination with an electronic endoscope. In addition, the examination time of each gastrointestinal examination case is required to be recorded and stored in the medical record.
In the prior art, timing starts when the endoscope is powered on. In actual operation, however, the target part is not examined immediately after start-up, so this approach cannot accurately record the actual examination time of the target part: the recorded examination time is longer than the actual examination time.
Disclosure of Invention
The invention aims to provide an intelligent timing method and an intelligent timing device for an electronic endoscope, which can accurately record the examination time of the electronic endoscope on a target part.
In order to achieve the purpose, the invention provides the following technical scheme:
an electronic endoscope intelligent timing method, comprising:
acquiring an image acquired by an electronic endoscope;
if the image is the image of the target part, recording the acquisition time of the image;
and outputting the acquisition time of the electronic endoscope for acquiring the image of the target part.
Preferably, the determining whether the image is the image of the target region includes:
selecting pixels belonging to the target part from the image;
determining whether the image is an image of the target site based on an area of the target site in the image.
Preferably, the determining whether the pixel of the image belongs to the target region includes: and for the pixel of the image, acquiring the component values of all channels of the pixel, and determining whether the pixel belongs to the target part according to the component values of all channels of the pixel.
Preferably, the determining whether the pixel belongs to the target portion according to the component values of the channels of the pixel includes: and if the component values of all the channels of the pixel are respectively in the corresponding preset value ranges, the pixel is considered to belong to the target part.
Preferably, the determining whether the pixel of the image belongs to the target region includes:
determining whether pixels of the image belong to the target part in a plurality of color spaces of the image respectively;
and voting by combining the judgment results obtained in each color space of the image to obtain whether the pixel of the image belongs to the target part.
Preferably, the determining whether the image is an image of the target region includes:
and processing the image by using a pre-trained neural network to obtain a judgment result of whether the image is the image of the target part, wherein the neural network is used for processing an input image to obtain a classification result of the image of the target part or a classification result of the image not being the image of the target part, or the neural network is used for processing the input image to obtain a classification result of the image being any one part of a plurality of preset parts.
Preferably, the method further comprises the following steps: and if the current frame image acquired by the electronic endoscope is the image of the target part and the continuous preset number of frame images before the current frame image are all the images of the target part, starting timing, and if the current frame image acquired by the electronic endoscope is not the image of the target part and the continuous preset number of frame images before the current frame image are not the images of the target part, stopping timing.
Preferably, the method further comprises the following steps: and judging the part of the image, and recording the name of the part of the image and the acquisition time.
Preferably, the method further comprises the following steps: and correspondingly storing the recorded time data, the image data acquired by the current object and the personal information of the current object.
An electronic endoscope intelligent timing device is used for executing the electronic endoscope intelligent timing method.
According to the technical scheme, the intelligent timing method and device for the electronic endoscope acquire the image acquired by the electronic endoscope, record the acquisition time of the image if the image is the image of the target part, and further output the acquisition time of the image acquired by the electronic endoscope to the target part. The invention judges whether the image is the image collected by the electronic endoscope to the target part or not according to the image collected by the electronic endoscope, further correspondingly records the image collecting time, and can more accurately record the examination time of the electronic endoscope to the target part.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of an intelligent timing method for an electronic endoscope according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for determining whether an image is of a target region according to the present embodiment;
FIG. 3 is a flowchart of a method for determining whether a pixel of an image belongs to a target region according to an embodiment of the present invention;
fig. 4 is a schematic diagram of an electronic endoscope intelligent timing device according to an embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiment of the present invention will be clearly and completely described below with reference to the drawings in the embodiment of the present invention, and it is obvious that the described embodiment is only a part of the embodiment of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of an intelligent timing method for an electronic endoscope according to the present embodiment, and as shown in the figure, the method includes the following steps:
s10: and acquiring an image collected by the electronic endoscope. The method comprises the steps of acquiring images collected by the electronic endoscope in real time in the working process of the electronic endoscope after the electronic endoscope is started to operate.
S11: and if the image is the image of the target part, recording the acquisition time of the image.
The image acquisition time is the time when the electronic endoscope acquires the image. And judging whether the acquired image collected by the electronic endoscope is the image of the target part or not, namely whether the acquired image is the image collected on the target part or not. And if so, recording the acquisition time of the image.
S12: and outputting the acquisition time of the electronic endoscope for acquiring the image of the target part.
And outputting the corresponding acquisition time of the electronic endoscope for acquiring the image of the target part, so that an operator can know the time condition of acquiring the image of the target part.
The electronic endoscope intelligent timing method of this embodiment judges, from each image captured by the electronic endoscope, whether the image was acquired at the target part and records the acquisition time accordingly, so the examination time of the electronic endoscope at the target part can be recorded more accurately.
The intelligent timing method of the electronic endoscope is described in detail with reference to the specific embodiments.
In this embodiment, a specific method for determining whether an image is an image of a target portion is not limited, and it is sufficient to determine whether an image is an image of a target portion. Optionally, whether the acquired image is the image of the target portion may be determined by the following method, please refer to fig. 2, where fig. 2 is a flowchart of a method for determining whether the image is the image of the target portion in this embodiment, and as shown in the figure, the method includes the following steps:
s20: pixels belonging to the target portion are selected from the image.
For each pixel of the image, whether the pixel belongs to the target part is judged, thereby obtaining the pixels of the image that belong to the target part.
S21: determining whether the image is an image of the target site based on an area of the target site in the image.
The area of the target part in the image is obtained from the pixels that belong to the target part. Whether the image is an image of the target part is then determined based on that area.
Alternatively, it may be determined whether the image is an image of the target region based on the area proportion of the target region in the image. The area proportion of the target part in the image can be obtained according to the proportion of the number of pixels belonging to the target part in the image to the total number of pixels in the image. Alternatively, a proportion range may be set, and whether the image is the target portion image may be determined based on whether the area proportion of the target portion in the image is within the proportion range.
Alternatively, whether or not to determine the image as the target portion image may be determined based on the size of the area of the target portion in the image. The area of the target portion in the image may be obtained from the number of pixels in the image that belong to the target portion. Optionally, an area value range may be set, and whether the image is the target region image is determined according to whether the area of the target region in the image is within the area value range.
However, it is not limited thereto, and it is also within the scope of the present invention to implement the determination of whether the image is the image of the target portion according to the pixels belonging to the target portion in the image by other methods, for example, according to the distribution of the pixels belonging to the target portion in the image.
Whether an image pixel belongs to the target region can be judged according to the characteristics of the image pixel. Optionally, the determining whether the pixel of the image belongs to the target region may include: and for the pixel of the image, acquiring the component values of all channels of the pixel, and determining whether the pixel belongs to the target part according to the component values of all channels of the pixel.
Optionally, if the image is described in HSV color space, correspondingly, each channel of the pixel includes a hue channel, a saturation channel, and a brightness channel, and each component value includes a hue value, a saturation value, and a brightness value. If the image is described in the HSI color space, each channel of the pixel includes a hue channel, a saturation channel and a brightness channel, and each component value includes a hue value, a saturation value and a brightness value. Or if the image is described in RGB color space, then each channel of a pixel comprises a red channel, a green channel, and a blue channel, and each component value comprises a red value, a green value, and a blue value. Without being limited thereto, the image may also be described in other color spaces, and the image pixels are determined accordingly according to the corresponding channel component values.
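For a single pixel, the per-channel component values in different color spaces can be obtained with the standard `colorsys` module (an illustrative sketch, not the patent's implementation; `colorsys` works on values in [0,1], so the scaling back to the 0-180 hue range and 0-255 ranges used later in the embodiment is an assumption):

```python
import colorsys

def pixel_components(r, g, b):
    """Return the per-channel component values of one RGB pixel (0-255)
    in the RGB and HSV color spaces."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return {
        "RGB": (r, g, b),
        # Scale H to a 0-180 range and S, V to 0-255, as in the embodiment below.
        "HSV": (round(h * 180), round(s * 255), round(v * 255)),
    }
```

The same pixel can then be tested against per-channel value ranges in whichever color space describes the image.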
Optionally, the following method may be adopted to determine whether the pixel belongs to the target portion according to the component values of each channel of the pixel: and if the component values of all the channels of the pixel are respectively in the corresponding preset value ranges, the pixel is considered to belong to the target part.
For setting the preset value range corresponding to each channel component value, an electronic endoscope can be used for collecting images of the target part, and the distribution of each channel component value of the target part image in the corresponding color space is counted according to the collected images of the target part, so that the preset value range corresponding to each channel component value is set according to the counted condition. Preferably, in practical application, various electronic endoscopes of different models can be used for collecting images of the target part, and statistics can be carried out according to the collected images.
In addition, the electronic endoscope may have several staining modes, and a preset value range for each channel component value can be set for each staining mode. For a given staining mode, images of the target part are captured in that mode, the distribution of each channel component value in the corresponding color space is tallied from the captured images, and the preset value ranges for that mode are set according to the statistics. During an actual examination in a given staining mode, the preset value ranges corresponding to that mode are used to judge whether an image is a target part image. The staining modes of the electronic endoscope include, but are not limited to, the Olympus NBI mode and the Fujifilm BLI mode.
If the electronic endoscope is used for collecting the white light image when the detected object is detected, the preset value range of each channel component value is set for the white light image, and the white light image of the target part is correspondingly collected when the preset value range of each channel component value is set.
Illustratively, in one embodiment, the effective image area of the image captured by the electronic endoscope has size Iw × Ih. Specifically, the region in which two adjacent frames differ can be obtained by a difference operation between the two frames, and the effective image area of the captured image can then be derived from the changed region.
The effective image region is converted from the RGB color space to the HSV color space and described in HSV. The target part is the gastrointestinal tract; for gastrointestinal mucosa images captured under white light, the value ranges of H, S and V corresponding to the mucosa color are [0,20] ∪ [156,180], [40,255] and [80,255], respectively. For a white-light image acquired in real time, each pixel in the effective area whose H, S and V values all fall within their respective ranges is judged to be gastrointestinal mucosa. Judging whether an image pixel belongs to the target part from its H, S and V values is more accurate than judging directly from an RGB image.
The number of pixels belonging to the gastrointestinal mucosa in the image is counted as N, and the mucosa color area ratio is computed as t = N / (Iw × Ih). When t > 0.8, the current image is judged to be a gastrointestinal mucosa image; when 0 ≤ t ≤ 0.2, it is judged to be a non-gastrointestinal-mucosa image; and when 0.2 < t ≤ 0.8, the current image is judged to be unknown.
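The pixel test and area-ratio decision of this embodiment can be sketched as follows (a minimal illustration; the function names are assumptions, the image is taken to be already converted to HSV, and the OpenCV-style hue range 0-180 is assumed):

```python
import numpy as np

def mucosa_pixel_mask(hsv):
    """hsv: (H, W, 3) uint8 array in HSV order.
    A pixel counts as mucosa when H is in [0,20] or [156,180],
    S is in [40,255], and V is in [80,255]."""
    h = hsv[..., 0].astype(int)
    s = hsv[..., 1].astype(int)
    v = hsv[..., 2].astype(int)
    h_ok = ((h >= 0) & (h <= 20)) | ((h >= 156) & (h <= 180))
    return h_ok & (s >= 40) & (v >= 80)

def classify_frame(hsv):
    """Return 'mucosa', 'non-mucosa', or 'unknown' from the ratio t = N / (Iw * Ih)."""
    mask = mucosa_pixel_mask(hsv)
    t = mask.sum() / mask.size
    if t > 0.8:
        return "mucosa"
    if t <= 0.2:
        return "non-mucosa"
    return "unknown"
```

A frame classified as "unknown" simply contributes no decision, which matches the 0.2 < t ≤ 0.8 band above.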
Preferably, whether the pixel of the image belongs to the target portion can be further determined by the following method, please refer to fig. 3, where fig. 3 is a flowchart of a method for determining whether the pixel of the image belongs to the target portion, and mainly includes the following steps:
s30: determining whether pixels of the image belong to the target region in a plurality of color spaces of the image, respectively.
The color space of an image refers to the way in which the colors of the image are described. The color space of the image includes, but is not limited to, HSV color space, HSI color space, or RGB color space.
The image is described by a plurality of color spaces respectively, and whether the pixel of the image belongs to the target part or not is judged under each color space of the image. Alternatively, determining whether a pixel of the image belongs to the target region in any color space of the image may be performed by the method described above.
S31: and voting by combining the judgment results obtained in each color space of the image to obtain whether the pixel of the image belongs to the target part.
And for any pixel of the image, voting is carried out according to the judgment result of whether the pixel belongs to the target part or not obtained in each color space of the image and the judgment result in each color space, so that whether the pixel belongs to the target part or not is obtained.
In the method for judging whether the pixel of the image belongs to the target part or not according to the embodiment, the judgment is respectively carried out in the plurality of color spaces of the image, and the voting is carried out by combining the judgment results of the color spaces, so that the accuracy can be further improved.
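A pixel-wise vote over the per-color-space judgments can be sketched as follows (an illustrative sketch; the per-space judgments are taken as already-computed boolean arrays, and the strict-majority rule is an assumption — the patent does not fix the voting rule):

```python
import numpy as np

def vote_pixel_membership(masks):
    """masks: list of (H, W) boolean arrays, one judgment per color space.
    A pixel is taken to belong to the target part when a strict majority
    of the color-space judgments say so."""
    votes = np.stack(masks, axis=0).astype(int).sum(axis=0)
    return votes * 2 > len(masks)
```

With three color spaces (e.g. HSV, HSI and RGB), a pixel is accepted when at least two of the three judgments agree.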
Optionally, the method may further determine whether the acquired image is an image of the target region by the following method, specifically including: and processing the image by using a pre-trained neural network to obtain a judgment result of whether the image is the image of the target part, wherein the neural network is used for processing an input image to obtain a classification result of whether the image is the image of the target part or not.
A training sample image set may be created: images captured with the electronic endoscope are classified and labeled as target part images and non-target part images and used as sample images. The convolutional neural network is then trained with the sample images to obtain the trained network. The convolutional neural network may adopt, but is not limited to, classification networks such as GoogLeNet and ResNet.
Optionally, the neural network may be configured to process an input image to obtain a classification result of an image of any one of a plurality of preset portions.
When the neural network is trained, an electronic endoscope is used for collecting images of different parts, such as oral cavity, human face, esophagus, stomach and intestine, anus and the like, and the collected images are marked to be the images of the parts to be used as sample images to establish a training sample image set. And then training the convolutional neural network by using the sample image to obtain the trained convolutional neural network. The trained convolutional neural network can output the classification result of the part of the input image, and further can obtain the result of whether the image is the target part. The convolutional neural network used can adopt, but is not limited to, a target detection network such as YOLO, Faster R-CNN and the like.
Further, the electronic endoscope intelligent timing method of the embodiment further includes the following processes: and if the current frame image acquired by the electronic endoscope is the image of the target part and the continuous preset number of frame images before the current frame image are all the images of the target part, starting timing, and if the current frame image acquired by the electronic endoscope is not the image of the target part and the continuous preset number of frame images before the current frame image are not the images of the target part, stopping timing.
During operation of the electronic endoscope, the images it captures are acquired in real time. When an image is first determined to be a target part image, the system continues to wait for the results of subsequent images, and timing starts once n consecutive images have been determined to be target part images. Likewise, during examination of the target part, when an image is first determined not to be a target part image, the system waits for subsequent results, and timing ends once n consecutive images are determined not to be target part images. In this way, the period during which the electronic endoscope captures images of the target part is timed.
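The n-consecutive-frame start/stop logic described above can be expressed as a small state machine (a sketch; the class and its interface are illustrative, not taken from the patent):

```python
class EndoscopeTimer:
    """Start timing after n consecutive target-part frames;
    stop timing after n consecutive non-target-part frames."""

    def __init__(self, n):
        self.n = n
        self.timing = False
        self.streak = 0          # consecutive frames that disagree with the current state
        self.elapsed_frames = 0  # frames counted while timing was active

    def update(self, is_target):
        """Feed one per-frame judgment; return whether timing is active."""
        if is_target != self.timing:
            self.streak += 1
            if self.streak >= self.n:
                self.timing = is_target  # state flips only after n agreeing frames
                self.streak = 0
        else:
            self.streak = 0
        if self.timing:
            self.elapsed_frames += 1
        return self.timing
```

Frames classified as "unknown" could simply be skipped (no call to `update`), so isolated ambiguous frames neither start nor stop the timer.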
The method not only can accurately record the acquisition time of the electronic endoscope for acquiring the images of the target part, but also can time the time length of the electronic endoscope for acquiring the images of the target part, thereby meeting the application requirements of the electronic endoscope.
Further preferably, the electronic endoscope intelligent timing method of the present embodiment further includes: and judging the part of the image, and recording the name of the part of the image and the acquisition time.
In the working process of the electronic endoscope, the image collected by the electronic endoscope is acquired, the part of the image, namely the image of which part the image belongs to is judged, and if the image is identified to be the image of some specific parts, the name of the part of the image and the collection time can be recorded. Thereby realizing the intelligent identification of the examined part of the electronic endoscope. Preferably, when the part of the image is identified, the time for the electronic endoscope to acquire the image of the part can be timed.
Further preferably, the method of this embodiment further includes: and correspondingly storing the recorded time data, the image data acquired by the current object and the personal information of the current object. Thereby facilitating the doctor to diagnose the condition of the object by combining the image obtained by the examination of the object and the data related to the examination time.
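Storing the recorded time data together with the examination images and the subject's personal information can be as simple as one keyed record persisted alongside the images (a sketch; the field names are illustrative and not specified by the patent):

```python
import json

def build_examination_record(subject_info, image_refs, timing):
    """Bundle subject info, references to captured images, and timing data
    into one record that can be serialized and stored."""
    return {
        "subject": subject_info,  # e.g. name and ID of the examined subject
        "images": image_refs,     # paths or IDs of the stored frames
        "timing": timing,         # per-part acquisition times and durations
    }

record = build_examination_record(
    {"name": "example", "id": "0001"},
    ["frame_0001.png"],
    {"stomach": {"start": "10:00:00", "duration_s": 312}},
)
serialized = json.dumps(record)
```

Keeping the three kinds of data in one record is what lets the doctor review the examination images together with the examination-time data, as described above.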
Correspondingly, the embodiment also provides an electronic endoscope intelligent timing device, which is used for executing the electronic endoscope intelligent timing method.
The electronic endoscope intelligent timing device of this embodiment acquires the image captured by the electronic endoscope, records the acquisition time if the image is an image of the target part, and then outputs the acquisition time at which the electronic endoscope captured images of the target part. By judging from each captured image whether it was acquired at the target part and recording the acquisition time accordingly, the device can accurately record the examination time of the electronic endoscope at the target part.
Optionally, referring to fig. 4, fig. 4 is a schematic view of an electronic endoscope intelligent timing device provided in this embodiment, where the device as shown in the figure includes: a recognition module 30 for judging whether the image is an image of the target portion; and the display module 31 is used for displaying the image acquired by the electronic endoscope on the target part and the acquisition time.
Optionally, the images acquired by the electronic endoscope and the corresponding acquisition time may be displayed in real time on a system interface of the electronic endoscope, so that an operator can know the examination time at any time.
The device further includes a timing module 32, configured to start timing if the current frame image captured by the electronic endoscope is an image of the target part and a preset number of consecutive preceding frames are all images of the target part, and to stop timing if the current frame is not an image of the target part and a preset number of consecutive preceding frames are not images of the target part. The device can thus time the duration for which the electronic endoscope captures images of the target part.
The above description describes the method and apparatus for intelligent timing of electronic endoscope provided by the present invention in detail. The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.
Claims (10)
1. An electronic endoscope intelligent timing method is characterized by comprising the following steps:
acquiring an image acquired by an electronic endoscope;
if the image is the image of the target part, recording the acquisition time of the image;
and outputting the acquisition time of the electronic endoscope for acquiring the image of the target part.
2. The electronic endoscope intelligent timing method of claim 1, wherein determining whether the image is of the target site comprises:
selecting pixels belonging to the target part from the image;
determining whether the image is an image of the target site based on an area of the target site in the image.
3. The electronic endoscope intelligent timing method of claim 2, wherein determining whether a pixel of the image belongs to the target site comprises: and for the pixel of the image, acquiring the component values of all channels of the pixel, and determining whether the pixel belongs to the target part according to the component values of all channels of the pixel.
4. The electronic endoscope intelligent timing method of claim 3, wherein determining whether the pixel belongs to the target part according to the component values of the channels comprises: if the component value of each channel of the pixel falls within its corresponding preset value range, the pixel is deemed to belong to the target part.
5. The electronic endoscope intelligent timing method of claim 2, wherein determining whether a pixel of the image belongs to the target part comprises:
determining, in each of a plurality of color spaces of the image, whether the pixel belongs to the target part;
combining the determination results obtained in the individual color spaces by voting to decide whether the pixel belongs to the target part.
6. The electronic endoscope intelligent timing method of claim 1, wherein determining whether the image is an image of the target part comprises:
processing the image with a pre-trained neural network to obtain a determination of whether the image is an image of the target part, wherein the neural network processes an input image either to classify it as being, or not being, an image of the target part, or to classify it as showing any one of a plurality of preset parts.
7. The electronic endoscope intelligent timing method of any one of claims 1-6, further comprising: starting timing if the current frame image acquired by the electronic endoscope is an image of the target part and a preset number of consecutive frames before it are all images of the target part, and stopping timing if the current frame image is not an image of the target part and the preset number of consecutive frames before it are not images of the target part.
8. The electronic endoscope intelligent timing method of claim 1, further comprising: identifying the part shown in the image, and recording the name of that part together with the acquisition time.
9. The electronic endoscope intelligent timing method of claim 1, further comprising: storing the recorded time data, the image data acquired for the current subject, and the personal information of the current subject in association with one another.
10. An electronic endoscope intelligent timing device, configured to execute the electronic endoscope intelligent timing method of any one of claims 1-9.
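A minimal sketch of the pixel-level classification described in claims 2-5 follows. The channel ranges, the choice of RGB and HSV as the color spaces, and the area threshold are illustrative assumptions; the patent does not specify concrete values:

```python
import colorsys

# Illustrative per-channel ranges for a "target part" pixel in two color
# spaces; real ranges would be tuned for the mucosa being detected.
RGB_RANGES = ((90, 255), (30, 180), (20, 160))      # R, G, B in 0-255
HSV_RANGES = ((0.0, 0.15), (0.2, 1.0), (0.3, 1.0))  # H, S, V in 0-1
                                                    # (hue wrap-around ignored)

def in_ranges(channels, ranges):
    """Claim 4: every channel value must fall in its preset range."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(channels, ranges))

def pixel_is_target(r, g, b):
    """Claim 5: judge the pixel in several color spaces, then vote."""
    votes = [in_ranges((r, g, b), RGB_RANGES)]
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    votes.append(in_ranges((h, s, v), HSV_RANGES))
    return sum(votes) > len(votes) / 2   # majority vote over color spaces

def image_is_target(pixels, area_fraction=0.4):
    """Claim 2: the image shows the target part if target pixels occupy
    at least `area_fraction` of the image area."""
    hits = sum(pixel_is_target(*p) for p in pixels)
    return hits >= area_fraction * len(pixels)
```

With two color spaces the majority vote effectively requires both to agree; adding a third space (e.g. YCbCr) would allow a genuine 2-of-3 vote, which is presumably why the claim specifies voting over a plurality of spaces.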
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111088686.0A CN113768452A (en) | 2021-09-16 | 2021-09-16 | Intelligent timing method and device for electronic endoscope |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113768452A true CN113768452A (en) | 2021-12-10 |
Family
ID=78851448
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111088686.0A Pending CN113768452A (en) | 2021-09-16 | 2021-09-16 | Intelligent timing method and device for electronic endoscope |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113768452A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002345726A (en) * | 2001-05-30 | 2002-12-03 | Asahi Optical Co Ltd | Electronic endoscopic system and its utilization recording method for electronic endoscopic system |
JP2017056123A (en) * | 2015-09-18 | 2017-03-23 | Hoya株式会社 | Image recording system for electronic endoscope |
CN109347719A (en) * | 2018-09-11 | 2019-02-15 | 内蒙古工业大学 | A kind of image junk mail filtering method based on machine learning |
CN111012285A (en) * | 2019-12-06 | 2020-04-17 | 腾讯科技(深圳)有限公司 | Endoscope moving time determining method and device and computer equipment |
CN111523551A (en) * | 2020-04-03 | 2020-08-11 | 青岛进化者小胖机器人科技有限公司 | Binarization method, device and equipment for blue object |
CN111728613A (en) * | 2020-08-18 | 2020-10-02 | 安翰科技(武汉)股份有限公司 | Image-based position detection method, electronic device, and readable storage medium |
CN111931754A (en) * | 2020-10-14 | 2020-11-13 | 深圳市瑞图生物技术有限公司 | Method and system for identifying target object in sample and readable storage medium |
CN112200250A (en) * | 2020-10-14 | 2021-01-08 | 重庆金山医疗器械有限公司 | Digestive tract segmentation identification method, device and equipment of capsule endoscope image |
CN112446876A (en) * | 2020-12-11 | 2021-03-05 | 北京大恒普信医疗技术有限公司 | anti-VEGF indication distinguishing method and device based on image and electronic equipment |
CN112712057A (en) * | 2021-01-13 | 2021-04-27 | 腾讯科技(深圳)有限公司 | Traffic signal identification method and device, electronic equipment and storage medium |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115082739A (en) * | 2022-07-01 | 2022-09-20 | 苏州慧维智能医疗科技有限公司 | Endoscope evaluation method and system based on convolutional neural network |
CN115082739B (en) * | 2022-07-01 | 2023-09-01 | 苏州慧维智能医疗科技有限公司 | Endoscope evaluation method and system based on convolutional neural network |
CN116208533A (en) * | 2023-02-23 | 2023-06-02 | 极限人工智能有限公司 | Endoscope image transmission delay monitoring method and system |
CN116208533B (en) * | 2023-02-23 | 2023-10-20 | 极限人工智能有限公司 | Endoscope image transmission delay monitoring method and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104470416B (en) | Image processing apparatus and endoscope apparatus | |
US8918164B2 (en) | Method and system for detecting colorimetric abnormalities in vivo | |
US9959618B2 (en) | Image processing apparatus, image processing method, and computer-readable recording medium | |
US8144993B2 (en) | Medical image processing method | |
US7907775B2 (en) | Image processing apparatus, image processing method and image processing program | |
CN113768452A (en) | Intelligent timing method and device for electronic endoscope | |
CN104540438A (en) | Image processing device and endoscopic instrument | |
US20090208071A1 (en) | Medical Image Processing Apparatus, Luminal Image Processing Apparatus, Luminal Image Processing Method, and Programs for the Same | |
WO2021147429A1 (en) | Endoscopic image display method, apparatus, computer device, and storage medium | |
CN106056588A (en) | Capsule endoscope image data redundancy removing method | |
CN106102554A (en) | Image processing apparatus, image processing method and image processing program | |
JPWO2019198637A1 (en) | Image processing equipment, endoscopic system, and image processing method | |
CN104700424B (en) | Medical color fujinon electronic video endoscope dead pixel points of images detection means | |
CN115553685B (en) | Method for judging entrance and exit of endoscope | |
CN103327883A (en) | Medical image processing device and medical image processing method | |
CN109242792B (en) | White balance correction method based on white object | |
CN111767958A (en) | Real-time enteroscopy withdrawal time monitoring method based on random forest algorithm | |
CN110136808B (en) | Auxiliary display system of shooting device | |
Ghosh et al. | Block based histogram feature extraction method for bleeding detection in wireless capsule endoscopy | |
Bourbakis et al. | A neural network-based detection of bleeding in sequences of WCE images | |
KR101551814B1 (en) | Method, system and recording medium for analysing tongue diagnosis using a quantifying evalution indicator of tongue diagnosis image | |
WO2021112436A1 (en) | Device and method for automatic calculation of cleanness of bowel | |
KR100904084B1 (en) | Grouping method of tongue coating region using color-based approach | |
KR102225426B1 (en) | Method and device for estimating amount of tongue coating and system thereof | |
EP3066976A1 (en) | Organ image capturing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||