CN112089392A - Capsule endoscope control method, device, equipment, system and storage medium - Google Patents

Capsule endoscope control method, device, equipment, system and storage medium

Info

Publication number
CN112089392A
Authority
CN
China
Prior art keywords
target node
capsule endoscope
node
image
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011098021.3A
Other languages
Chinese (zh)
Inventor
王建平
阚述贤
吴良信
孔令松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jifu Medical Technology Co ltd
Original Assignee
Shenzhen Jifu Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jifu Medical Technology Co ltd filed Critical Shenzhen Jifu Medical Technology Co ltd
Priority to CN202011098021.3A priority Critical patent/CN112089392A/en
Publication of CN112089392A publication Critical patent/CN112089392A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00006 Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B 1/00011 Operational features of endoscopes characterised by signal transmission
    • A61B 1/00016 Operational features of endoscopes characterised by signal transmission using wireless means
    • A61B 1/00147 Holding or positioning arrangements
    • A61B 1/00158 Holding or positioning arrangements using magnetic field
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B 1/041 Capsule endoscopes for imaging
    • A61B 1/045 Control thereof

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Endoscopes (AREA)

Abstract

The invention discloses a capsule endoscope control method, device, equipment, system and storage medium. The method comprises the following steps: planning a current cruise path for the capsule endoscope in a target area according to a three-dimensional positional relationship model, taking the node to be scanned in the image captured by the capsule endoscope as the current node; determining, from the current cruise path, a target node to be scanned by the capsule endoscope; receiving a first image captured in real time by the capsule endoscope in the target area; determining the position of the target node in the first image to obtain target node position information; adjusting the position and posture of the capsule endoscope according to the target node position information; and controlling the capsule endoscope to scan the target node. The capsule endoscope can thus be automatically controlled to scan the target area, improving both the effectiveness and the efficiency of capsule endoscope control.

Description

Capsule endoscope control method, device, equipment, system and storage medium
Technical Field
The invention relates to the technical field of medical instruments, in particular to a capsule endoscope control method, device, equipment, system and storage medium.
Background
The passive examination mode of the capsule endoscope carries a high risk of missed inspection, and magnetically controlled capsule endoscope systems capable of active control have therefore appeared on the market. In such a system, an operator manually controls, based on experience, the movement and rotation of a second magnet in a magnetic control device, which in turn drives the movement and rotation of a capsule endoscope fitted with a first magnet. This control method, however, suffers from poor control effectiveness, low control efficiency and other drawbacks.
Disclosure of Invention
In order to solve the above technical problems in the prior art, the invention provides a capsule endoscope control method, device, equipment, system and storage medium, aiming to automatically control the capsule endoscope to scan a target area and to improve the effectiveness and efficiency of capsule endoscope control.
The embodiment of the invention provides a capsule endoscope control method, which comprises the following steps:
S11: planning a current cruise path for the capsule endoscope in a target area, taking the node to be scanned in an image captured by the capsule endoscope as the current node, according to a three-dimensional positional relationship model among the nodes to be scanned in the target area;
S12: determining, from the current cruise path, a target node to be scanned by the capsule endoscope;
S13: receiving a first image captured in real time by the capsule endoscope in the target area, wherein the first image is captured when the target node appears within the field of view of the capsule endoscope;
S14: determining the position of the target node in the first image to obtain target node position information;
S15: adjusting the position and posture of the capsule endoscope according to the target node position information, so that the target node appears at the center of a second image captured by the capsule endoscope;
S16: controlling the capsule endoscope to scan the target node.
In some embodiments, the method further comprises the steps of:
S17: after the target node has been scanned, marking the target node as a scanned node;
and repeating steps S12 to S17 until all the nodes to be scanned have been scanned.
In some embodiments, the method further comprises the steps of:
S18: judging whether the capsule endoscope has deviated from the current cruise path, or whether the search for the target node has exceeded a preset time;
when the capsule endoscope has deviated from the current cruise path, or the search for the target node has exceeded the preset time, repeating steps S11 to S18 until all the nodes to be scanned have been scanned;
and when the capsule endoscope has not deviated from the current cruise path and the search for the target node has not exceeded the preset time, repeating steps S12 to S18 until all the nodes to be scanned have been scanned.
In some embodiments, planning the current cruise path of the capsule endoscope in the target area, taking the node to be scanned in the image captured by the capsule endoscope as the current node, according to the three-dimensional positional relationship model among the nodes to be scanned in the target area, comprises:
establishing the three-dimensional positional relationship model among the nodes to be scanned in the target area to obtain a three-dimensional network topology graph;
and taking the node to be scanned in the image captured by the capsule endoscope as the current node, planning an optimal cruise path traversing all the nodes to be scanned according to the three-dimensional network topology graph, the optimal cruise path comprising the scanning order of each node to be scanned, and taking the optimal cruise path as the current cruise path.
In some embodiments, step S12 is preceded by: controlling the capsule endoscope to scan the current node.
In some embodiments, step S13 is preceded by: adjusting the position and posture of the capsule endoscope according to the positional relationship between the current node and the target node in the three-dimensional positional relationship model, so that the target node appears in the first image captured by the capsule endoscope.
In some embodiments, determining the position of the target node in the first image to obtain target node position information comprises:
obtaining the name of the target node, a target node mask and a target node detection frame from an AI model;
and determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, the target node position information comprising a target node position and a target node size, the target node size being the number of image pixels within the target node detection frame.
In some embodiments, determining the position of the target node in the first image to obtain target node position information comprises:
inputting the first image into a node detection AI model for node feature recognition and node name judgment, identifying the target node and its name in the first image;
segmenting the identified target node with a node segmentation AI model to generate a target node mask and a target node detection frame;
and determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, the target node position information comprising a target node position and a target node size, the target node size being the number of image pixels within the target node detection frame.
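As a concrete illustration of the last step, the pure-Python sketch below derives the node position as the centroid of the mask pixels and the node size as the pixel count of the detection frame. The mask and frame here are toy values, not outputs of the patent's AI models; the function names are hypothetical.

```python
def target_node_position_info(mask, box):
    """Derive target node position information from a segmentation mask and a
    detection frame: position = centroid of the mask pixels in the first
    image; size = number of image pixels inside the detection frame."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    x0, y0, x1, y1 = box  # detection frame corners in pixel coordinates
    return {"position": (cx, cy), "size": (x1 - x0) * (y1 - y0)}

# Toy 10x10 first image containing a 4x4 target node blob.
mask = [[0] * 10 for _ in range(10)]
for y in range(3, 7):
    for x in range(2, 6):
        mask[y][x] = 1
info = target_node_position_info(mask, (2, 3, 6, 7))
print(info)  # position (3.5, 4.5), size 16
```

In a real system the mask would be a binary image output by the segmentation model and the frame would come from the detector, but the position/size computation would take the same form.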
In some embodiments, controlling the capsule endoscope to scan the target node comprises: controlling the capsule endoscope to perform a cross scan and/or an annular scan of the target node.
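The cross and annular scans can be pictured as sequences of small orientation offsets around the aimed target node. The sketch below generates such offset sequences; the step size, arm length and angular radius are illustrative assumptions, as the patent gives no numeric values.

```python
import math

def cross_scan_offsets(step_deg=15.0, arm_deg=30.0):
    """Yaw/pitch offsets (degrees) for a cross scan: sweep a horizontal arm
    through the target node, then a vertical arm."""
    arm = [i * step_deg
           for i in range(int(-arm_deg // step_deg), int(arm_deg // step_deg) + 1)]
    horizontal = [(yaw, 0.0) for yaw in arm]
    vertical = [(0.0, pitch) for pitch in arm]
    return horizontal + vertical

def annular_scan_offsets(radius_deg=20.0, n=8):
    """Yaw/pitch offsets tracing a ring of the given angular radius."""
    return [(radius_deg * math.cos(2 * math.pi * k / n),
             radius_deg * math.sin(2 * math.pi * k / n)) for k in range(n)]

print(cross_scan_offsets())    # 5 horizontal + 5 vertical offsets
print(annular_scan_offsets())  # 8 offsets around a ring
```

Each offset would be applied relative to the aimed pose of the capsule, so that the target node and its neighboring areas (as in FIGS. 11 to 14) are covered.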
A capsule endoscope control device comprising:
a cruise path planning module for planning a current cruise path of the capsule endoscope in the target area, taking the node to be scanned in the image captured by the capsule endoscope as the current node, according to a three-dimensional positional relationship model among the nodes to be scanned in the target area;
a first determining module for determining, from the current cruise path, a target node to be scanned by the capsule endoscope;
a first receiving module for receiving a first image captured in real time by the capsule endoscope in the target area, wherein the first image is captured when the target node appears within the field of view of the capsule endoscope;
a second determining module for determining the position of the target node in the first image to obtain the target node position information;
a first control module for adjusting the position and posture of the capsule endoscope according to the target node position information, so that the target node appears at the center of a second image captured by the capsule endoscope;
and a second control module for controlling the capsule endoscope to scan the target node.
In some embodiments, the apparatus further comprises:
a marking module for marking the target node as a scanned node after the target node has been scanned.
In some embodiments, the apparatus further comprises:
a judging module for judging whether the capsule endoscope has deviated from the current cruise path, or whether the search for the target node has exceeded a preset time.
In some embodiments, the cruise path planning module comprises:
a first establishing unit for establishing the three-dimensional positional relationship model among the nodes to be scanned in the target area to obtain a three-dimensional network topology graph;
and a planning unit for planning, taking the node to be scanned in the image captured by the capsule endoscope as the current node, an optimal cruise path traversing all the nodes to be scanned according to the three-dimensional network topology graph, the optimal cruise path comprising the scanning order of each node to be scanned, and taking the optimal cruise path as the current cruise path.
In some embodiments, the second control module is further configured to control the capsule endoscope to scan the current node.
In some embodiments, the first control module is further configured to adjust the position and posture of the capsule endoscope according to the positional relationship between the current node and the target node in the three-dimensional positional relationship model, so that the target node appears in the first image captured by the capsule endoscope.
In some embodiments, the second determining module comprises:
an AI unit for obtaining the name of the target node, a target node mask and a target node detection frame from an AI model;
and a first determining unit for determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, the target node position information comprising a target node position and a target node size, the target node size being the number of image pixels within the target node detection frame.
In some embodiments, the second determining module comprises:
a recognition unit for inputting the first image into a node detection AI model for node feature recognition and node name judgment, identifying the target node and its name in the first image;
a segmentation unit for segmenting the identified target node with a node segmentation AI model to generate a target node mask and a target node detection frame;
and a second determining unit for determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, the target node position information comprising a target node position and a target node size, the target node size being the number of image pixels within the target node detection frame.
A capsule endoscope control device comprising:
a memory for storing executable instructions;
and a processor for invoking the executable instructions stored in the memory and executing the operations corresponding to each of the modules or units described above.
A capsule endoscope control system comprising: a capsule endoscope, a magnetic control device and a capsule endoscope control device. The capsule endoscope comprises a camera module, a control module, a radio frequency module and a first magnet; the capsule endoscope acquires image data through the camera module and the control module and sends the image data outside the target area through the radio frequency module, while the first magnet allows the capsule endoscope to be controlled by the magnetic control device through magnetic force.
The capsule endoscope control device controls the position and posture of the second magnet within the three-dimensional working area through the transmission mechanism, so as to adjust the position and posture of the capsule endoscope.
The capsule endoscope control device is used for: planning a current cruise path for the capsule endoscope in the target area, taking the node to be scanned in the image captured by the capsule endoscope as the current node, according to a three-dimensional positional relationship model among the nodes to be scanned in the target area; determining, from the current cruise path, the target node to be scanned by the capsule endoscope; receiving a first image captured in real time by the capsule endoscope in the target area, the first image being captured when the target node appears within the field of view of the capsule endoscope; determining the position of the target node in the first image to obtain the target node position information; controlling the magnetic control device to change the position and posture of the second magnet through the transmission mechanism according to the target node position information, so as to adjust the position and posture of the capsule endoscope until the target node appears at the center of a second image captured by the capsule endoscope; and controlling the capsule endoscope to scan the target node.
In some embodiments, the system further comprises: a wireless transceiver device for receiving the image data sent by the capsule endoscope, assembling the image data packets into a whole image, and sending the image to the capsule endoscope control device.
A computer-readable storage medium having stored therein at least one instruction, at least one program, a code set or an instruction set, which is loaded and executed by a processor to carry out the operations performed in the method described above.
According to the capsule endoscope control method provided by the embodiment of the invention, a current cruise path for the capsule endoscope in a target area is planned according to a three-dimensional positional relationship model among the nodes to be scanned in the target area, taking the node to be scanned in the image captured by the capsule endoscope as the current node; a target node to be scanned by the capsule endoscope is determined from the current cruise path; a first image captured in real time by the capsule endoscope in the target area is received, the first image being captured when the target node appears within the field of view of the capsule endoscope; the position of the target node in the first image is determined to obtain target node position information; and the position and posture of the capsule endoscope are adjusted according to the target node position information, so that the target node appears at the center of a second image captured by the capsule endoscope, and the capsule endoscope is controlled to scan the target node. The capsule endoscope can thus be automatically controlled to scan the target area, improving both the effectiveness and the efficiency of capsule endoscope control.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain them without limiting the invention.
FIG. 1 is a diagram of an application environment of a capsule endoscope control method in an embodiment of the present invention;
FIG. 2 is a schematic flow chart illustrating a method for controlling a capsule endoscope in accordance with an embodiment of the present invention;
FIG. 3 is a schematic flow chart of another method of controlling a capsule endoscope in accordance with an embodiment of the present invention;
FIG. 4 is a three-dimensional position relationship model between nodes to be scanned in a target region according to an embodiment of the present invention;
FIG. 5 illustrates possible cruise paths between nodes to be scanned in a target area according to an embodiment of the present invention;
FIG. 6 is an optimal cruise path between nodes to be scanned in a target area according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a mask and a detection box of a current node B and a mask and a detection box of a target node C, which are generated in an embodiment of the present invention;
FIG. 8 is a schematic representation of a target node C appearing within a first image taken by the capsule endoscope in an embodiment of the present invention;
FIG. 9 is a schematic view of a capsule endoscope having been aimed at target node C in an embodiment of the present invention;
FIG. 10 is a schematic view of a capsule endoscope in an optimal viewing position in an embodiment of the present invention;
FIG. 11 is a schematic diagram of controlling a capsule endoscope to perform a circular scan of a target node C in an embodiment of the present invention;
FIG. 12 is a schematic diagram illustrating that the target node C and its neighboring area are completely scanned according to the embodiment of the present invention;
FIG. 13 is a schematic diagram of controlling a capsule endoscope to perform a cross scan on a target node C according to an embodiment of the present invention;
FIG. 14 is a schematic diagram illustrating that a target node C and neighboring areas in four directions thereof are completely scanned according to an embodiment of the present invention;
FIG. 15 is a schematic diagram of a control device of a capsule endoscope in an embodiment of the present invention;
FIG. 16 is a schematic structural diagram of another capsule endoscope control device in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The capsule endoscope control method provided by the embodiment of the invention can be applied to the application environment shown in fig. 1, which comprises a capsule endoscope b1, a wireless transceiver device b2, a graphics processing device b3, a magnetic control device b4 and a capsule endoscope control device b5. The magnetic control device b4, the wireless transceiver device b2 and the capsule endoscope control device b5 can be connected directly or indirectly through wired or wireless communication, which is not limited here. The capsule endoscope b1 comprises a camera module, a control module, a radio frequency module and a first magnet. The magnetic control device b4 comprises a transmission mechanism and a second magnet. The first magnet and the second magnet may be electromagnets, permanent magnets or other kinds of magnets. The capsule endoscope control device b5 may be a local server, a cloud server or a terminal device, where the terminal device may be, but is not limited to, a smartphone, tablet computer, notebook computer, desktop computer, smart speaker, smart watch and the like. The graphics processing device b3 may likewise be a local server, a cloud server or such a terminal device. It should be noted that fig. 1 shows only one application environment of the capsule endoscope control method provided by the embodiment of the present invention; in other application environments, the graphics processing device and the wireless transceiver device may be absent, the function of the graphics processing device may be taken over by the capsule endoscope control device, or the function of the wireless transceiver device may be taken over by another device with a data transmission function, none of which is limited here.
As shown in fig. 2, the capsule endoscope control method according to the embodiment of the present invention is described here as applied to the capsule endoscope control device in fig. 1, and includes the following steps:
step S11: and planning a current cruising path of the capsule endoscope in the target area by taking the nodes to be scanned in the images shot by the capsule endoscope as current nodes according to the three-dimensional position relation model among the nodes to be scanned in the target area.
Specifically, before the capsule endoscope control method of the embodiment of the present invention is executed, the capsule endoscope enters the target area and starts photographing it, sending the images captured in real time to the wireless transceiver device, which forwards them to the capsule endoscope control device. In an embodiment of the present invention, the target area is a closed space, such as a bionic stomach, a stomach model, an isolated animal stomach, or a human stomach. For a bionic stomach, a stomach model, an isolated animal stomach or a human stomach, the nodes to be scanned include the cardia, the fundus, the body, the angle of the stomach, the antrum and the pylorus. In some embodiments, the nodes to be scanned may be finer-grained, for example: the cardia, the anterior inferior cardia wall, the posterior inferior cardia wall, the fundus, the anterior superior corpus wall, the posterior superior corpus wall, the greater curvature of the superior corpus, the lesser curvature of the superior corpus, the anterior mid-corpus wall, the posterior mid-corpus wall, the greater curvature of the mid-corpus, the lesser curvature of the mid-corpus, the anterior inferior corpus wall, the posterior inferior corpus wall, the greater curvature of the inferior corpus, the lesser curvature of the inferior corpus, the angle of the stomach, the anterior wall of the angle of the stomach, the posterior wall of the angle of the stomach, the anterior wall of the antrum, the posterior wall of the antrum, the greater curvature of the antrum, the lesser curvature of the antrum, and the pylorus.
According to the three-dimensional positional relationship model among the nodes to be scanned in the target area, a path planning algorithm over the three-dimensional network topology graph can be adopted: taking the node to be scanned in the image captured by the capsule endoscope as the current node, an optimal cruise path traversing all the nodes to be scanned is planned automatically and used as the current cruise path, which therefore covers all the nodes to be scanned in the target area.
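As an illustration, a greedy nearest-neighbour traversal is one simple way such a path could be planned over a three-dimensional node graph. The node coordinates below are invented for the example; the patent specifies neither coordinates nor a particular path-planning algorithm.

```python
import math

# Hypothetical 3D coordinates (cm) for the coarse nodes to be scanned in a
# stomach model; values are illustrative only, not taken from the patent.
NODES = {
    "cardia": (0.0, 0.0, 8.0),
    "fundus": (-3.0, 1.0, 9.0),
    "body": (-1.0, 0.0, 5.0),
    "angle": (2.0, -1.0, 3.0),
    "antrum": (4.0, -1.0, 2.0),
    "pylorus": (6.0, 0.0, 2.0),
}

def distance(a, b):
    # Euclidean distance between two named nodes.
    return math.dist(NODES[a], NODES[b])

def plan_cruise_path(current_node, to_scan):
    """Greedy nearest-neighbour traversal of all unscanned nodes,
    starting from the node currently visible to the capsule."""
    path = [current_node]
    remaining = set(to_scan) - {current_node}
    while remaining:
        nxt = min(remaining, key=lambda n: distance(path[-1], n))
        path.append(nxt)
        remaining.remove(nxt)
    return path

path = plan_cruise_path("cardia", NODES)
print(path)  # ['cardia', 'body', 'angle', 'antrum', 'pylorus', 'fundus']
```

A production system would likely use a more sophisticated traversal over the three-dimensional network topology graph, but the output has the same shape: an ordered list of nodes that defines the scanning order of the current cruise path.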
Step S12: determining, from the current cruise path, the target node to be scanned by the capsule endoscope. Specifically, the node to be scanned that is closest to the current node on the current cruise path may be taken as the target node.
Step S13: receiving a first image captured in real time by the capsule endoscope in the target area, wherein the first image is captured when the target node appears within the field of view of the capsule endoscope.
Step S14: determining the position of the target node in the first image to obtain the target node position information.
Step S15: adjusting the position and posture of the capsule endoscope according to the target node position information, so that the target node appears at the center of a second image captured by the capsule endoscope.
Specifically, a force model between the first magnet in the capsule endoscope and the second magnet in the magnetic control device is preset; that is, a mapping relationship exists between the position and posture of the second magnet of the magnetic control device and the position and posture of the capsule. According to this mapping relationship, the magnetic control device is controlled to change the position and posture of the second magnet within a three-dimensional working area through the transmission mechanism, thereby adjusting the position and posture of the capsule endoscope.
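Upstream of the magnet mapping, the centering adjustment of step S15 reduces to computing how far the target node sits from the image centre. The sketch below converts that pixel offset into yaw and pitch corrections under a linear small-angle approximation; the 140-degree field of view is an assumed value, not one given in the patent.

```python
def centering_adjustment(node_xy, image_size, fov_deg=140.0):
    """Yaw/pitch corrections (degrees) that would bring the target node to
    the image centre, assuming a linear pixels-to-degrees mapping."""
    (x, y), (w, h) = node_xy, image_size
    yaw = (x - w / 2) * fov_deg / w     # positive: rotate toward the right
    pitch = (y - h / 2) * fov_deg / h   # positive: tilt downward
    return yaw, pitch

# Node detected 80 px right of centre in a 320x240 first image.
print(centering_adjustment((240, 120), (320, 240)))  # (35.0, 0.0)
```

The resulting orientation correction would then be translated, through the preset mapping relationship, into a new position and posture for the second magnet.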
Step S16: and controlling the capsule endoscope to scan the target node.
According to the capsule endoscope control method provided by the embodiment of the invention, a current cruise path for the capsule endoscope in the target area is planned according to the three-dimensional positional relationship model among the nodes to be scanned, taking the node to be scanned in the image captured by the capsule endoscope as the current node; the target node to be scanned is determined from the current cruise path; a first image captured in real time by the capsule endoscope in the target area is received, the first image being captured when the target node appears within the field of view of the capsule endoscope; the position of the target node in the first image is determined to obtain the target node position information; the position and posture of the capsule endoscope are adjusted according to the target node position information, so that the target node appears at the center of a second image captured by the capsule endoscope; and the capsule endoscope is controlled to scan the target node. The capsule endoscope can thus be automatically controlled to scan the target area, improving both the effectiveness and the efficiency of capsule endoscope control.
As shown in fig. 3, in some embodiments, the capsule endoscope control method further includes step S17: after the target node has been scanned, marking it as a scanned node. Repeated scanning is thereby avoided, improving the control efficiency of the capsule endoscope.
After step S17 is completed, steps S12 to S17 are repeated until all the nodes to be scanned have been scanned. The capsule endoscope can thus cruise automatically in the target area and completely scan all the nodes to be scanned, improving the effectiveness and efficiency of capsule endoscope control.
In some embodiments, the capsule endoscope control method further comprises the steps of:
Step S18: judge whether the capsule endoscope deviates from the current cruise path, or whether the search for the target node has exceeded a preset time. This step runs throughout the entire automatic cruise scanning process of the capsule endoscope; that is, the judgment of whether the capsule endoscope deviates from the current cruise path, or whether the search for the target node has exceeded the preset time, is performed in real time during the whole automatic cruise scan.
When the capsule endoscope deviates from the current cruise path, or the search for a target node exceeds the preset time, steps S11 to S18 are repeated until all nodes to be scanned have been scanned; when the capsule endoscope does not deviate from the current cruise path and the search for the target node does not exceed the preset time, steps S12 to S18 are repeated until all nodes to be scanned have been scanned.
Specifically, deviation of the capsule endoscope from the current cruise path is typically caused by an unintended posture adjustment of the subject, which moves the capsule endoscope away from the region being scanned (e.g., from the upper stomach to the lower stomach). The deviation is judged according to the three-dimensional positional relationship model of the nodes to be scanned: if the node currently captured by the capsule endoscope and the target node are not in the same region, the capsule endoscope cannot continue along the current cruise path, so a new cruise path must be re-planned; steps S11 to S18 are then repeated until all nodes to be scanned have been scanned.
For example, when the current node B has been scanned and the target node in the current cruise path is C, the position and posture of the capsule endoscope are adjusted according to the three-dimensional positional relationship model so that target node C appears in the first image captured in real time by the capsule endoscope. Under special conditions (an incorrect subject posture or an abnormal stomach structure), however, target node C may fail to appear in the first image, and the search for target node C cannot continue indefinitely. Therefore, when the search for the target node exceeds the preset time, a new cruise path must be re-planned, and steps S11 to S18 are repeated until all nodes to be scanned have been scanned. The preset time can be set as required, for example, 10 s.
According to the capsule endoscope control method provided by the embodiment of the invention, whether the capsule endoscope deviates from the current cruise path, or whether the search for the target node exceeds the preset time, is monitored in real time. When either occurs, steps S11 to S18 are repeated until all nodes to be scanned have been scanned; otherwise, steps S12 to S18 are repeated until all nodes to be scanned have been scanned. The capsule endoscope can thus automatically cruise in the target area, completely scan all nodes to be scanned, shorten the scanning time in the target area, and improve the effectiveness and efficiency of capsule endoscope control.
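A minimal sketch of the monitoring logic of step S18, under the assumption that path planning, target search, and deviation detection are supplied as callables (all names hypothetical):

```python
import time

def cruise_with_replanning(nodes, plan_path, find_and_scan, deviated,
                           timeout_s=10.0):
    """Steps S11-S18 with monitoring: re-plan the cruise path (back to S11)
    when the endoscope leaves the path or the target search times out;
    otherwise continue from S12 with the next unscanned target."""
    scanned = set()
    path = plan_path(nodes, scanned)                 # S11: plan cruise path
    while len(scanned) < len(nodes):
        target = next(n for n in path if n not in scanned)   # S12
        start = time.monotonic()
        while not find_and_scan(target):             # S13-S16: search + scan
            # S18: deviation, or a search longer than the preset time
            if deviated() or time.monotonic() - start > timeout_s:
                path = plan_path(nodes, scanned)     # re-plan (back to S11)
                break
        else:
            scanned.add(target)                      # S17: mark as scanned
    return scanned

# toy usage: the target is always found and the path never deviates
done = cruise_with_replanning(
    ["antrum", "body", "fundus"],
    plan_path=lambda nodes, scanned: [n for n in nodes if n not in scanned],
    find_and_scan=lambda target: True,
    deviated=lambda: False,
)
```

Note that re-planning only restarts path selection; nodes already marked as scanned stay excluded, matching the text's "until all nodes to be scanned have been scanned" condition.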
In some embodiments, step S11 of planning, according to the three-dimensional positional relationship model among the nodes to be scanned in the target region, a current cruise path of the capsule endoscope in the target region, taking the node to be scanned in the image captured by the capsule endoscope as the current node, where the current cruise path includes all the nodes to be scanned in the target region, includes the following steps:
S111: establish the three-dimensional positional relationship model among the nodes to be scanned in the target area to obtain a three-dimensional network topological graph.
S112: take the node to be scanned in the image captured by the capsule endoscope as the current node, and plan, according to the three-dimensional network topological graph, an optimal cruise path that traverses all nodes to be scanned; the optimal cruise path includes the scanning sequence of each node to be scanned and is taken as the current cruise path.
Specifically, a three-dimensional positional relationship model among the nodes to be scanned in the target region is established (as shown in fig. 4). In the embodiment of the present invention, taking a bionic stomach as an example, a three-dimensional positional relationship model of each part of the bionic stomach is established; each part is represented by a coordinate point in three-dimensional space, so that the three-dimensional positional relationship between parts can be represented by position vectors in a specific coordinate system. This model can be regarded as a three-dimensional network topological graph: each node represents a corresponding part, the connecting lines between nodes represent feasible cruise paths between the corresponding parts (as shown in fig. 5), and the length of a connecting line represents the distance of the cruise path. One node is selected as the starting point; in the embodiment of the invention, when the capsule endoscope enters the target area, the node to be scanned captured first is taken as the starting point (the current node). A path planning algorithm for the three-dimensional network topological graph, such as a dynamic programming algorithm, a divide-and-conquer algorithm, or a constraint optimization algorithm, is then used to traverse the whole graph by depth-first traversal, find all cruise paths that pass through every node of the graph, and select the path with the minimum total weight as the optimal cruise path (as shown in fig. 6). The optimal cruise path, which includes the scanning sequence of each node to be scanned, is taken as the current cruise path.
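The minimum-weight traversal described above can be illustrated with a brute-force search over a small weighted topology graph; the edge list and node names below are toy values, and a real planner would use one of the graph algorithms named in the text:

```python
from itertools import permutations

def best_cruise_path(edges, start):
    """Enumerate all routes from `start` that visit every node exactly once
    along topology edges, and return the one with the smallest total
    distance (a brute-force stand-in for the patent's path planning)."""
    weight = {}
    nodes = set()
    for u, v, w in edges:
        weight[(u, v)] = weight[(v, u)] = w   # cruise paths are undirected
        nodes.update((u, v))
    best, best_cost = None, float("inf")
    for order in permutations(nodes - {start}):
        route = (start,) + order
        cost = 0.0
        for a, b in zip(route, route[1:]):
            if (a, b) not in weight:          # not a feasible cruise segment
                break
            cost += weight[(a, b)]
        else:
            if cost < best_cost:              # keep the minimum-weight route
                best, best_cost = route, cost
    return best, best_cost

# toy stomach topology: four parts with weighted feasible connections
edges = [("A", "B", 1.0), ("B", "C", 2.0), ("A", "C", 2.5), ("C", "D", 1.0)]
path, cost = best_cruise_path(edges, "A")
```

Enumeration is exponential in the node count, which is why the text proposes dynamic programming or constraint optimization for larger graphs; for the handful of anatomical nodes in a stomach model, even brute force is instant.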
Taking the optimal cruise path as the current cruise path means that the cruise scanning path of the capsule endoscope is the shortest, which improves both the effectiveness of controlling the capsule endoscope and its scanning efficiency. In some embodiments, when the capsule endoscope enters the target area, the node to be scanned captured first is taken as the starting point (the current node); a path planning algorithm for the three-dimensional network topological graph, such as a dynamic programming algorithm, a divide-and-conquer algorithm, or a constraint optimization algorithm, is used to traverse the whole graph by depth-first traversal and find all cruise paths that pass through every node; any one of these cruise paths is then selected as the current cruise path, which includes the scanning sequence of each node to be scanned. Further, in some embodiments, the target node to be scanned by the capsule endoscope is determined from the current cruise path: specifically, each target node is marked as a scanned node after scanning, and the next target node is selected according to the scanning sequence of the nodes to be scanned in the current cruise path. This improves the control efficiency and scanning efficiency of the capsule endoscope, avoids missed detection, and improves the effectiveness of controlling the capsule endoscope.
In some embodiments, step S12 is preceded by: controlling the capsule endoscope to scan the current node. In some embodiments, when the current node scan is complete, the current node is marked as a scanned node.
In some embodiments, step S13 is preceded by: adjusting the position and posture of the capsule endoscope according to the positional relationship between the current node and the target node in the three-dimensional positional relationship model, so that the target node appears in the first image captured by the capsule endoscope.
Specifically, the capsule endoscope cruises from its current position to the target node; its orientation toward the target node is adjusted according to the positional relationship between the current node and the target node in the three-dimensional positional relationship model, so that the target node appears in the first image captured by the capsule endoscope.
In some embodiments, the step of determining the position of the target node in the first image to obtain the target node position information may also be performed by a graphics processing device; before this step, the graphics processing device receives the first image captured in real time by the capsule endoscope in the target area and forwarded by the wireless transceiver device.
Further, step S14 determines the position of the target node in the first image to obtain the position information of the target node, which specifically includes the following steps:
S141: input the first image into the node detection AI model for node feature identification and node name judgment, and identify the target node and the name of the target node in the first image.
S142: segment the identified target node through the node segmentation AI model to generate a target node mask and a target node detection frame.
S143: determine the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, where the target node position information includes the target node position and the target node size, and the target node size is the number of pixels of the image within the target node detection frame.
Specifically, for the node detection AI model, any one of a recurrent network model, a convolutional network model, a deep neural network model, a deep generative model, or an autoencoder model may be selected. The selected model is trained with an image set previously captured by the capsule endoscope at different positions of the target area. After a node detection AI model satisfying one of, or a combination of, recognition accuracy, sensitivity, and specificity requirements is obtained, the first image is input into it for node feature identification and node name judgment, and the target node and its name in the first image are identified.
For the node segmentation AI model, any one of a recurrent network model, a convolutional network model, a deep neural network model, a deep generative model, or an autoencoder model may likewise be selected and trained with an image set previously captured by the capsule endoscope at different positions of the target area. After a node segmentation AI model satisfying one of, or a combination of, recognition accuracy, sensitivity, and specificity requirements is obtained, the first image is input into it for node segmentation, generating a target node mask and a target node detection frame. In the embodiment of the present invention, a node detection deep convolutional neural network model and a node segmentation deep convolutional neural network model are taken as examples, as follows:
An image set previously captured by the capsule endoscope at different positions of the target area (here the bionic stomach) is selected, such that the part corresponding to each image in the set can be identified and at least one part is completely contained in each image. All parts in the selected image set are completely annotated, and an annotation-frame file is generated from the annotated regions. The annotated images are divided into a training set and a test set, with no overlap between the two. The initial node detection deep convolutional neural network model and the initial node segmentation deep convolutional neural network model are each trained on the training set. The initial node detection model is based on a natural scene detection network framework; its weights are initialized to the pre-trained weights of the natural scene detection network and fixed during training. The initial node segmentation model is trained directly with the annotation masks. During training, the two models exchange the feature maps generated by each network convolutional layer in a cascade: the detection frames generated by the detection model act on the segmentation model, which finally outputs a node mask, and the node mask output by the segmentation model in turn acts on the detection frames output by the detection model. The parameters of both models are updated by back-propagating the loss-function gradients, yielding the current node detection deep convolutional neural network model and the current node segmentation deep convolutional neural network model.
The current node detection and node segmentation deep convolutional neural network models are further trained on the training set, and the models produced by each training iteration are tested on the test set to obtain one of, or a combination of, their recognition accuracy, sensitivity, and specificity. Whether these indexes meet the preset requirements is then judged for each model. If so, training is terminated, and the current node detection and node segmentation models at termination are taken as the final node detection deep convolutional neural network model and node segmentation deep convolutional neural network model; if not, training continues until the preset requirements are met, and the models that finally meet them are taken as the final node detection and node segmentation deep convolutional neural network models.
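The iterate-train-test-until-qualified procedure can be sketched as follows; the single-accuracy stopping criterion and all names are illustrative simplifications of the accuracy/sensitivity/specificity requirement described above:

```python
def train_until_qualified(train_step, evaluate, required_accuracy,
                          max_iters=100):
    """Alternate training and test-set evaluation, stopping as soon as the
    model's metric (e.g. recognition accuracy) meets the preset requirement."""
    for i in range(1, max_iters + 1):
        train_step()                  # one iteration of training
        metrics = evaluate()          # e.g. accuracy on the held-out test set
        if metrics["accuracy"] >= required_accuracy:
            return i, metrics         # preset requirement met: stop training
    raise RuntimeError("preset requirement not met within max_iters")

# toy usage: a fake "model" whose accuracy rises by 0.2 per iteration
state = {"acc": 0.0}
def step(): state["acc"] += 0.2
iters, m = train_until_qualified(step, lambda: {"accuracy": state["acc"]}, 0.9)
```

In the patent's setting, `evaluate` would return accuracy, sensitivity, and specificity for both the detection and segmentation models, and the stopping test would check each required index.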
When the capsule endoscope is used for an examination, the first image acquired by the capsule endoscope is input into the node detection deep convolutional neural network model for node feature identification and node name judgment, and the target node and its name in the first image are identified.
The first image is also input into the node segmentation deep convolutional neural network model, which segments the identified target node and generates a target node mask and a target node detection frame. The target node detection frame may be a rectangle or a polygon. For example, referring to fig. 7, the mask and detection frame of the current node B and the mask and detection frame of the target node C are generated by the node segmentation deep convolutional neural network model, respectively.
Taking the image center point of the first image as the origin, the position of the target node in the first image is obtained from the positional relationship between the coordinates of the center point of the target node detection frame and the origin; the number of pixels of the image within the target node detection frame gives the size of the target node in the first image.
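A sketch of this position and size computation, assuming a square first image and an axis-aligned rectangular detection frame given by its corner coordinates (all names hypothetical):

```python
def target_node_info(box, image_size):
    """Position of the target node relative to the image centre (taken as
    origin) and its size as the pixel count inside the detection frame."""
    x1, y1, x2, y2 = box                          # frame corners, in pixels
    cx, cy = image_size / 2.0, image_size / 2.0   # image centre = origin
    bx, by = (x1 + x2) / 2.0, (y1 + y2) / 2.0     # detection-frame centre
    position = (bx - cx, by - cy)                 # offset vector from origin
    size = (x2 - x1) * (y2 - y1)                  # pixels inside the frame
    return position, size

# a 100x100-pixel frame in a 480x480 first image, up and left of centre
pos, size = target_node_info((100, 120, 200, 220), 480)
```

For a polygonal detection frame, the size would instead be the pixel count of the target node mask, but the centre-offset idea is the same.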
In some embodiments, determining the position of the target node in the first image to obtain the target node position information includes:
obtaining the name of the target node, a target node mask, and a target node detection frame according to the AI model; and
determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, where the target node position information includes the target node position and the target node size, and the target node size is the number of pixels of the image within the target node detection frame.
It can be understood that node detection and node segmentation can be realized by a single AI model. Any one of a recurrent network model, a convolutional network model, a deep neural network model, a deep generative model, or an autoencoder model can be selected and trained with an image set previously captured by the capsule endoscope at different positions in the target area. After an AI model satisfying one of, or a combination of, recognition accuracy, sensitivity, and specificity requirements is obtained, the first image is input into it for node feature identification, node name judgment, node mask generation, and node detection frame generation; the position of the target node in the first image is then determined according to the target node mask and the target node detection frame to obtain the target node position information, which includes the target node position and the target node size, the latter being the number of pixels of the image within the target node detection frame. For the specific implementation process, please refer to the detailed description in the above embodiments, which is not repeated here.
In some embodiments, step S15 controls the magnetic control device to change the position and posture of the second magnet through the transmission mechanism, based on the position and size of the target node, so as to adjust the position and posture of the capsule endoscope and make the target node appear in the center of the second image captured by the capsule endoscope. Specifically, referring to fig. 8, taking B as the current node (scanned node) and C as the target node, the capsule endoscope navigates from scanned node B to target node C. First, the direction vector (direction and distance) from point B to point C is calculated from the three-dimensional positional relationship model of the target area to determine the positional relationship between target node C and scanned node B; the position and posture of the capsule endoscope are then adjusted to shift it toward target node C so that C appears in the first image captured in real time. This adjustment may not succeed in a single step; instead, the position and posture are adjusted iteratively according to the images captured along the way. When the first image, input into the node detection deep convolutional neural network model, yields an identified target node and its name, and, input into the node segmentation deep convolutional neural network model, yields a target node mask and detection frame, the target node has appeared in the first image. When the detection model cannot identify the target node and output its name, or the segmentation model cannot generate a target node mask and detection frame, the position and posture of the capsule endoscope must continue to be adjusted until the target node appears in the first image. As shown in fig. 8, the dashed frame represents the field of view of the capsule endoscope. The posture of the capsule endoscope is then finely adjusted according to the position of the target node in the first image, that is, according to the direction vector between the center point of the target node detection frame and the center point of the first image, shifting the capsule endoscope toward the center point of the detection frame until the distance between the two center points is less than L/8 pixels (the resolution of the first image being L × L). At this point the capsule endoscope is aligned with target node C; as shown in fig. 9, the dashed frame is the field of view of the capsule endoscope. The pixel area of the first image occupied by the target node detection frame is then determined from the size of the target node in the first image (compared with a verified value stored in the system), and whether the current position of the capsule endoscope is the optimal observation position is judged from this pixel area. If it is (as shown in fig. 10), the capsule endoscope performs coverage scanning of target node C at the optimal observation position; if not, the position of the capsule endoscope is adjusted (closer to or farther from target node C) until the optimal observation position is found.
After target node C is found and centered, it appears in the center of the field of view of the capsule endoscope, that is, in the center of the second image captured by the capsule endoscope. In the capsule endoscope control method provided by the embodiment of the invention, the node detection and node segmentation deep convolutional neural network models learn in advance all the features required to detect and segment the nodes to be scanned in the target area. Node feature identification and node name judgment are performed on the input image by the node detection model to identify the target node and its name; the identified target node is segmented by the node segmentation model to generate a target node mask and detection frame; and the position and size of the target node in the image are obtained in combination with the three-dimensional positional relationship model.
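The fine-adjustment loop, which shifts the view toward the detection-frame centre until the offset falls below L/8 pixels, might look like the following toy simulation; the proportional `gain` and the motion model are illustrative assumptions, not the disclosed control law:

```python
import math

def center_on_target(get_box_center, move_toward, image_size,
                     gain=0.5, max_steps=50):
    """Fine-tune the endoscope pose: nudge the view toward the detection-frame
    centre until it lies within L/8 pixels of the image centre (L x L image)."""
    L = image_size
    img_cx, img_cy = L / 2.0, L / 2.0
    for _ in range(max_steps):
        bx, by = get_box_center()
        dx, dy = bx - img_cx, by - img_cy         # offset vector from centre
        if math.hypot(dx, dy) < L / 8.0:          # aligned with target node
            return True
        move_toward(gain * dx, gain * dy)         # shift view toward target
    return False

# toy plant: shifting the view moves the frame centre toward the image centre
state = {"c": (400.0, 400.0)}   # detection-frame centre in a 480x480 image
def move(dx, dy):
    state["c"] = (state["c"][0] - dx, state["c"][1] - dy)
ok = center_on_target(lambda: state["c"], move, 480)
```

In the real system the "move" would be a magnet pose command through the transmission mechanism, and `get_box_center` would come from the segmentation model's detection frame on each new image.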
In some embodiments, step S16, controlling the capsule endoscope to scan the target node, includes: controlling the capsule endoscope to perform cross scanning and/or annular scanning of the target node.
Specifically, when the target node appears in the center of the second image captured by the capsule endoscope, the pixel distances from the target node detection frame to the upper, lower, left, and right edges of the second image are calculated. With the resolution of the second image set to R × R, if the pixel distances from the detection frame to all four edges are at least R/4 pixels, the target node falls completely within the field of view of the capsule endoscope, and the capsule endoscope scans the target node directly without additional scanning (annular and/or cross scanning). If the distance from the detection frame to any edge of the second image is less than R/4 pixels, the boundary of the target node may extend beyond the field of view, and annular and/or cross scanning of the target node is performed.
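The R/4 decision rule can be written down directly; the function and variable names are illustrative:

```python
def scan_mode(box, resolution):
    """Decide the scan type from the detection frame's pixel distances to the
    four image edges: direct scan if all are at least R/4, otherwise the node
    may extend past the field of view, so perform annular/cross scanning."""
    x1, y1, x2, y2 = box
    R = resolution
    margins = (x1, y1, R - x2, R - y2)   # distances to left/top/right/bottom
    return "direct" if min(margins) >= R / 4.0 else "ring_or_cross"

mode_a = scan_mode((130, 130, 350, 350), 480)  # frame well inside the view
mode_b = scan_mode((10, 130, 350, 350), 480)   # frame close to the left edge
```

With R = 480 the threshold is 120 pixels, so the first frame (all margins 130) is scanned directly while the second (left margin 10) triggers annular and/or cross scanning.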
As shown in fig. 11, in annular scanning the capsule endoscope control device controls the magnetic control device to change the posture of the second magnet through the transmission mechanism, adjusting the posture of the capsule endoscope so that it deviates 15-30 degrees from the center of the target node and performs a 360-degree annular scan around it, completely scanning the target node and its adjacent area; as shown in fig. 12, the dashed frame is the area scanned by the capsule endoscope.
As shown in fig. 13, in cross scanning the capsule endoscope control device controls the magnetic control device to change the posture of the second magnet through the transmission mechanism, adjusting the posture of the capsule endoscope so that it scans in the up, down, left, and right directions in sequence; in each direction, scanning continues until the distance from the target node detection frame to the image boundary in that direction exceeds R/2 pixels, so that the target node and the adjacent areas in all four directions are completely scanned. As shown in fig. 14, the dashed frame is the area scanned by the capsule endoscope.
According to the capsule endoscope control method provided by the embodiment of the invention, when the target node does not fall completely within the field of view of the capsule endoscope, the capsule endoscope control device controls the magnetic control device to change the posture of the second magnet through the transmission mechanism, adjusting the posture of the capsule endoscope to perform annular and/or cross scanning of the target node. This ensures the comprehensiveness and completeness of the capsule endoscope's scanning and improves the effectiveness of controlling the capsule endoscope.
In some embodiments, step S12 takes the node as the current node and controls the capsule endoscope to scan it; specifically, full-coverage scanning of the current node may also be performed in the same manner as the scanning of the target node described above.
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps need not be performed in the exact order shown and may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include sub-steps that are not necessarily performed at the same time but may be performed at different times; these need not be performed in sequence, and may be performed in turn or alternately with other steps or with sub-steps of other steps.
Based on the same idea as the capsule endoscope control method in the above embodiments, the present invention also provides a capsule endoscope control device that can be used to execute the above capsule endoscope control method. For convenience of illustration, the schematic structural diagram of the capsule endoscope control device embodiments shows only the parts related to the embodiments of the present invention; those skilled in the art will understand that the illustrated structure does not limit the device, which may include more or fewer components than illustrated, combine certain components, or arrange components differently.
As shown in fig. 15, an embodiment of the present invention provides a capsule endoscope control device, which may be implemented as a software module, a hardware module, or a combination of the two as part of a computer device, and specifically includes a cruise path planning module, a first determining module, a first receiving module, a second determining module, a first control module, and a second control module, where:
The cruise path planning module is used for planning a current cruise path of the capsule endoscope in the target area, taking the node to be scanned in the image captured by the capsule endoscope as the current node, according to the three-dimensional positional relationship model among the nodes to be scanned in the target area.
The first determination module is used for determining a target node to be scanned by the capsule endoscope from the current cruise path.
The first receiving module is used for receiving a first image shot by the capsule endoscope in real time in the target area, wherein the first image is shot when a target node appears in the visual field range of the capsule endoscope.
The second determining module is used for determining the position of the target node in the first image to obtain the target node position information.
The first control module is used for adjusting the position and posture of the capsule endoscope according to the target node position information so that the target node appears in the center of a second image captured by the capsule endoscope.
The second control module is used for controlling the capsule endoscope to scan the target node.
According to the capsule endoscope control device provided by the embodiments of the invention, a current cruise path of the capsule endoscope in the target area is planned according to a three-dimensional positional relationship model among the nodes to be scanned in the target area, with the node to be scanned in the image shot by the capsule endoscope taken as the current node; a target node to be scanned by the capsule endoscope is determined from the current cruise path; a first image shot by the capsule endoscope in real time in the target area is received, the first image being shot when the target node appears in the field of view of the capsule endoscope; the position of the target node in the first image is determined to obtain the target node position information; the position and the posture of the capsule endoscope are adjusted according to the target node position information, so that the target node appears in the center of a second image shot by the capsule endoscope;
and the capsule endoscope is controlled to scan the target node. The capsule endoscope can thus be controlled automatically to scan the target area, improving the effectiveness and efficiency of capsule endoscope control.
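The cooperation of the modules above can be sketched as a single control loop. The following Python sketch is illustrative only: the node names and the callback signatures (`plan_path`, `find_in_image`, `center_on`, `scan`) are assumptions made for the sake of a runnable example, not structures defined by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A node (anatomical site) to be scanned; the names used below are illustrative."""
    name: str
    scanned: bool = False

def cruise(nodes, plan_path, find_in_image, center_on, scan):
    """One pass of the device's control flow: plan a cruise path, then for each
    target node locate it in the image, center it, scan it, and mark it scanned."""
    for target in plan_path(nodes):        # cruise path planning / first determination module
        position = find_in_image(target)   # second determining module (first image)
        center_on(target, position)        # first control module (center in second image)
        scan(target)                       # second control module
        target.scanned = True              # marking module
    return [n.name for n in nodes if n.scanned]
```

Re-planning when the endoscope deviates from the path or a detection times out (the judging module described below) would wrap this loop in an outer retry.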
In some embodiments, the capsule endoscope control device further comprises a marking module for marking the target node as a scanned node after the target node is scanned.
As shown in fig. 16, in some embodiments, the capsule endoscope control device further includes a judging module, configured to judge whether the capsule endoscope has deviated from the current cruise path or whether the search for the target node has exceeded a preset time.
In some embodiments, the cruise path planning module comprises:
the first establishing unit is used for establishing the three-dimensional position relation model among the nodes to be scanned in the target area to obtain a three-dimensional network topological graph.
The planning unit is used for planning, with the node to be scanned in the image shot by the capsule endoscope taken as the current node, an optimal cruise path traversing all the nodes to be scanned according to the three-dimensional network topological graph, wherein the optimal cruise path includes the scanning order of each node to be scanned, and the optimal cruise path is taken as the current cruise path.
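The disclosure does not fix a particular traversal algorithm for the three-dimensional network topological graph; a greedy nearest-neighbour walk over the node coordinates is one minimal sketch of how a scanning order could be produced (the node names and coordinates below are hypothetical):

```python
import math

def plan_cruise_path(current, nodes):
    """Greedy nearest-neighbour traversal of the nodes' 3-D positions.
    `current` is the (x, y, z) position of the current node; `nodes` maps
    node name -> (x, y, z). Returns one possible scan order visiting every node."""
    remaining = dict(nodes)
    path, pos = [], current
    while remaining:
        # Always move to the closest not-yet-visited node.
        name = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        pos = remaining.pop(name)
        path.append(name)
    return path
```

A production planner could instead solve the traversal as a shortest-Hamiltonian-path problem over the topological graph; the greedy walk merely illustrates the "scanning order" output the planning unit must produce.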
In some embodiments, the second control module is further configured to control the capsule endoscope to scan the current node.
In some embodiments, the first control module is further configured to adjust the position and the posture of the capsule endoscope according to the positional relationship between the current node and the target node in the three-dimensional positional relationship model, so that the target node appears in the first image shot by the capsule endoscope.
In some embodiments, the second determining module comprises:
The AI unit is used for obtaining the name of the target node, a target node mask, and a target node detection frame according to an AI model.
The first determining unit is used for determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, wherein the target node position information includes a target node position and a target node size, and the target node size is the number of pixels of the image within the target node detection frame.
In some embodiments, the second determining module comprises:
and the identification unit is used for performing node feature identification and node name judgment on the first image input node detection AI model, and identifying the target node and the name of the target node in the first image. And the segmentation unit is used for segmenting the identified target node through the node segmentation AI model to generate a target node mask and a target node detection frame.
And the determining unit is used for determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, wherein the target node position information comprises a target node position and a target node size, and the target node size is the pixel number of the image in the target node detection frame.
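Following the definition above (target node size = number of pixels of the image within the detection frame), the position information could be derived from the mask and the detection frame roughly as follows. The mask-centroid choice for the position is an assumption, since the disclosure does not specify how the position value is computed:

```python
def node_position_info(mask, box):
    """Derive target node position information from a binary segmentation
    mask (list of rows) and a detection frame (x0, y0, x1, y1).
    The position is taken as the centroid of mask pixels inside the frame
    (falling back to the frame centre), and the size is the pixel count of
    the image within the detection frame, as defined in the text."""
    x0, y0, x1, y1 = box
    pts = [(x, y) for y in range(y0, y1) for x in range(x0, x1) if mask[y][x]]
    if pts:
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
    else:
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2  # no mask pixels: use frame centre
    return {"position": (cx, cy), "size": (x1 - x0) * (y1 - y0)}
```

The first control module can then compare `position` against the image centre to decide how to steer the capsule.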
For a detailed description of the capsule endoscope control device, reference may be made to the above description of the capsule endoscope control method, which is not repeated here. The modules or units in the capsule endoscope control device described above may be implemented entirely or partially in software, in hardware, or in a combination thereof. The modules or units may be embedded, in hardware form, in or independently of a processor in a computer device, or stored, in software form, in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to the modules or units. An embodiment of the invention provides a capsule endoscope control device, which includes a memory and a processor, wherein: the memory is used for storing executable instructions;
and the processor is used for invoking and executing, when executing the executable instructions stored in the memory, the operations corresponding to the modules or units described above.
An embodiment of the invention provides a capsule endoscope control system, which includes a capsule endoscope, a magnetic control device, and a capsule endoscope control device, wherein:
The capsule endoscope includes a camera module, a control module, a radio frequency module, and a first magnet. The capsule endoscope is used for collecting image data through the camera module and the control module and sending the image data out of the target area through the radio frequency module, and the first magnet enables the capsule endoscope to be controlled by the magnetic control device through magnetic force.
The magnetic control device controls the position and the posture of a second magnet in a three-dimensional working area through a transmission mechanism, so as to adjust the position and the posture of the capsule endoscope.
The capsule endoscope control device is used for invoking and executing the operations corresponding to the modules or units described above, so as to control the capsule endoscope.
In some embodiments, the capsule endoscope control system further includes a wireless transceiver device for receiving the image data sent by the capsule endoscope, assembling the image data packets in the image data into a whole image, and sending the image to the capsule endoscope control device.
In some embodiments, the capsule endoscope control system further includes an image processing device for receiving the images shot in real time by the capsule endoscope in the target area and forwarded by the wireless transceiver device, determining the position of the target node in the images to obtain the target node position information, and sending the target node position information to the capsule endoscope control device.
The magnetic control device, the wireless transceiver device, the image processing device and the capsule endoscope control device can be directly or indirectly connected through wired or wireless communication.
The capsule endoscope control device and the image processing device may each be a local server, a cloud server, or a terminal device, and the terminal device may be, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, and the like.
The wireless transceiver device may transmit and receive image data, and may also store image data; its specific structure and function are not limited here.
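As a sketch of the whole-image assembly performed by the wireless transceiver device, assuming each packet carries a sequence number and a byte payload (the packet format is an assumption; the disclosure does not specify it):

```python
def assemble_image(packets):
    """Reassemble image bytes from (sequence_number, payload) packets that may
    arrive out of order; returns None if any packet in the sequence is missing."""
    by_seq = dict(packets)
    if set(by_seq) != set(range(len(by_seq))):
        return None  # a gap in the sequence means the whole image cannot be formed yet
    return b"".join(by_seq[i] for i in range(len(by_seq)))
```

Only once every packet of a frame has arrived would the assembled image be forwarded to the capsule endoscope control device.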
For a detailed description of the capsule endoscope control system, reference may be made to the description of the corresponding parts in the capsule endoscope control method, and the detailed description is omitted here.
Embodiments of the present invention also provide a computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the operations of the capsule endoscope control method of the above embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk. Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, they are not limited to the details given above; various simple modifications may be made to their technical solutions within their technical idea, and such simple modifications all fall within their protection scope.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention do not describe every possible combination.
In addition, the various implementations of the embodiments of the present invention may be combined in any manner; as long as such a combination does not depart from the spirit of the embodiments of the present invention, it should likewise be considered as disclosed herein.

Claims (21)

1. A capsule endoscope control method characterized by comprising the steps of:
s11: planning a current cruising path of the capsule endoscope in a target area by taking the node to be scanned in an image shot by the capsule endoscope as a current node according to a three-dimensional position relation model among the nodes to be scanned in the target area;
s12: determining a target node to be scanned by the capsule endoscope from the current cruising path;
s13: receiving a first image of the capsule endoscope taken in real time in the target area, wherein the first image is taken when a target node appears in the visual field of the capsule endoscope;
s14: determining the position of the target node in the first image to obtain target node position information;
s15: according to the target node position information, adjusting the position and the posture of the capsule endoscope to enable the target node to appear in the center of a second image shot by the capsule endoscope;
s16: and controlling the capsule endoscope to scan the target node.
2. The capsule endoscope control method according to claim 1, further comprising the steps of:
s17: after the target node is scanned, marking the target node as a scanned node;
repeating the steps S12 to S17 until the scanning of all the nodes to be scanned is completed.
3. The capsule endoscope control method of claim 2, further comprising the steps of:
s18: judging whether the capsule endoscope deviates from the current cruising path or whether the search for the target node has exceeded a preset time;
when the capsule endoscope deviates from the current cruising path or the search for the target node has exceeded the preset time, repeating the steps S11 to S18 until the scanning of all the nodes to be scanned is completed;
when the capsule endoscope does not deviate from the current cruising path and the search for the target node has not exceeded the preset time, repeating the steps S12 to S18 until the scanning of all the nodes to be scanned is completed.
4. The capsule endoscope control method according to claim 1, wherein the planning of the current cruising path of the capsule endoscope in the target area with the node to be scanned in the image taken by the capsule endoscope as the current node according to the three-dimensional positional relationship model between the nodes to be scanned in the target area comprises:
establishing the three-dimensional position relation model among the nodes to be scanned in the target area to obtain a three-dimensional network topological graph;
and taking the nodes to be scanned in the image shot by the capsule endoscope as current nodes, planning and traversing the optimal cruising path of all the nodes to be scanned according to the three-dimensional network topological graph, wherein the optimal cruising path comprises the scanning sequence of each node to be scanned, and taking the optimal cruising path as the current cruising path.
5. The capsule endoscope control method according to claim 1, further comprising, before step S12: and controlling the capsule endoscope to scan the current node.
6. The capsule endoscope control method according to claim 1, further comprising, before step S13: and adjusting the position and the posture of the capsule endoscope according to the position relation between the current node and the target node in the three-dimensional position relation model, so that the target node appears in the first image shot by the capsule endoscope.
7. The capsule endoscope control method according to claim 1, wherein the determining the position of the target node in the first image and obtaining target node position information comprises:
obtaining the name of the target node, a target node mask and a target node detection frame according to the AI model;
and determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, wherein the target node position information comprises a target node position and a target node size, and the target node size is the pixel number of the image in the target node detection frame.
8. The capsule endoscope control method according to claim 1, wherein the determining the position of the target node in the first image and obtaining target node position information comprises:
performing node feature identification and node name judgment on the first image input node detection AI model, and identifying the target node and the name of the target node in the first image;
segmenting the identified target node through a node segmentation AI model to generate a target node mask and a target node detection frame;
and determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, wherein the target node position information comprises a target node position and a target node size, and the target node size is the pixel number of the image in the target node detection frame.
9. The capsule endoscope control method according to claim 1, wherein the controlling the capsule endoscope to scan the target node includes: and controlling the capsule endoscope to perform cross scanning and/or annular scanning on the target node.
10. A capsule endoscope control device, comprising:
the cruise path planning module is used for planning a current cruise path of the capsule endoscope in the target area by taking the nodes to be scanned in the image shot by the capsule endoscope as current nodes according to a three-dimensional position relation model among the nodes to be scanned in the target area;
the first determination module is used for determining a target node to be scanned by the capsule endoscope from the current cruise path;
the first receiving module is used for receiving a first image shot by the capsule endoscope in real time in the target area, wherein the first image is shot when a target node appears in the visual field range of the capsule endoscope;
the second determining module is used for determining the position of the target node in the first image to obtain the position information of the target node;
the first control module is used for adjusting the position and the posture of the capsule endoscope according to the position information of the target node so that the target node appears in the center of a second image shot by the capsule endoscope;
and the second control module is used for controlling the capsule endoscope to scan the target node.
11. The capsule endoscopic control apparatus of claim 10, further comprising:
and the marking module is used for marking the target node as a scanned node after the scanning of the target node is finished.
12. The capsule endoscopic control apparatus of claim 10, further comprising:
and the judging module is used for judging whether the capsule endoscope deviates from the current cruising path or whether the target node is searched for more than preset time.
13. The capsule endoscopic control apparatus of claim 10, wherein the cruise path planning module comprises:
the first establishing unit is used for establishing the three-dimensional position relation model among the nodes to be scanned in the target area to obtain a three-dimensional network topological graph;
and the planning unit is used for planning and traversing the optimal cruising path of all the nodes to be scanned according to the three-dimensional network topological graph by taking the nodes to be scanned in the image shot by the capsule endoscope as the current nodes, wherein the optimal cruising path comprises the scanning sequence of each node to be scanned, and the optimal cruising path is taken as the current cruising path.
14. The capsule endoscopic control apparatus of claim 10, wherein the second control module is further configured to: and controlling the capsule endoscope to scan the current node.
15. The capsule endoscopic control apparatus of claim 10, wherein the first control module is further configured to: and adjusting the position and the posture of the capsule endoscope according to the position relation between the current node and the target node in the three-dimensional position relation model, so that the target node appears in the first image shot by the capsule endoscope.
16. The capsule endoscopic control apparatus of claim 10, wherein the second determination module comprises:
the AI unit is used for obtaining the name of the target node, a target node mask and a target node detection frame according to an AI model;
and the first determining unit is used for determining the position of the target node in the first image according to the target node mask and the target node detection frame to obtain the target node position information, wherein the target node position information comprises a target node position and a target node size, and the target node size is the pixel number of the image in the target node detection frame.
17. The capsule endoscopic control apparatus of claim 10, wherein the second determination module comprises:
the identification unit is used for performing node feature identification and node name judgment on the first image input node detection AI model, and identifying the target node and the name of the target node in the first image;
the segmentation unit is used for segmenting the identified target nodes through the node segmentation AI model to generate a target node mask and a target node detection frame;
and a second determining unit, configured to determine, according to the target node mask and the target node detection frame, a position of the target node in the first image to obtain target node position information, where the target node position information includes a target node position and a target node size, and the target node size is a pixel number of an image in the target node detection frame.
18. A capsule endoscope control apparatus, characterized by comprising:
a memory for storing executable instructions;
and the processor is used for invoking and executing the operation corresponding to each module or unit in claims 10 to 17 when executing the executable instructions stored in the memory.
19. A capsule endoscope control system, comprising: a capsule endoscope, a magnetic control device and a capsule endoscope control device;
the capsule endoscope includes a camera module, a control module, a radio frequency module, and a first magnet; the capsule endoscope is used for acquiring image data through the camera module and the control module and sending the image data out of a target area through the radio frequency module, and the first magnet enables the capsule endoscope to be controlled by the magnetic control device through magnetic force;
the magnetic control device controls the position and the posture of a second magnet in a three-dimensional working area through a transmission mechanism, so as to adjust the position and the posture of the capsule endoscope;
the capsule endoscope control device is used for planning a current cruising path of a capsule endoscope in a target area by taking the node to be scanned in an image shot by the capsule endoscope as a current node according to a three-dimensional position relation model among the nodes to be scanned in the target area, determining the target node to be scanned by the capsule endoscope from the current cruising path, receiving a first image shot by the capsule endoscope in real time in the target area, wherein the first image is an image shot when the target node appears in the visual field range of the capsule endoscope, determining the position of the target node in the first image to obtain the position information of the target node, and controlling the magnetic control device to change the position and the posture of a second magnet through a transmission mechanism according to the position information of the target node so as to realize the adjustment of the position and the posture of the capsule endoscope, and enabling the target node to appear in the center of a second image shot by the capsule endoscope, and controlling the capsule endoscope to scan the target node.
20. The capsule endoscope control system of claim 19, further comprising: a wireless transceiver device for receiving the image data sent by the capsule endoscope, assembling the image data packets in the image data into a whole image, and sending the image to the capsule endoscope control device.
21. A computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to carry out the operations performed in the method of any one of claims 1 to 9.
CN202011098021.3A 2020-10-14 2020-10-14 Capsule endoscope control method, device, equipment, system and storage medium Pending CN112089392A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011098021.3A CN112089392A (en) 2020-10-14 2020-10-14 Capsule endoscope control method, device, equipment, system and storage medium

Publications (1)

Publication Number Publication Date
CN112089392A (en)

Family

ID=73783410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011098021.3A Pending CN112089392A (en) 2020-10-14 2020-10-14 Capsule endoscope control method, device, equipment, system and storage medium

Country Status (1)

Country Link
CN (1) CN112089392A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008237640A (en) * 2007-03-28 2008-10-09 Fujifilm Corp Capsule endoscope, capsule endoscope system, and operation control method of capsule endoscope
CN105769109A (en) * 2016-04-28 2016-07-20 深圳市鹏瑞智能技术应用研究院 Endoscope scanning control method and system
CN105902253A (en) * 2016-04-28 2016-08-31 深圳市鹏瑞智能图像有限公司 Endoscope insertion control method and system
CN107007242A (en) * 2017-03-30 2017-08-04 深圳市资福技术有限公司 A kind of capsule endoscopic control method and device
CN107248191A (en) * 2017-07-06 2017-10-13 南开大学 A kind of virtual endoscope suitable for complicated cavity is automatic and interactive route is planned and air navigation aid
CN109044250A (en) * 2018-08-28 2018-12-21 深圳市资福医疗技术有限公司 A kind of capsule endoscope motion control method, device and terminal device
CN109480746A (en) * 2019-01-14 2019-03-19 深圳市资福医疗技术有限公司 Intelligent control capsule endoscopic is in alimentary canal different parts working method and device
CN109620390A (en) * 2018-12-05 2019-04-16 聊城市光明医院 A kind of laparoscope smog removes system automatically
CN209059133U (en) * 2018-09-04 2019-07-05 重庆金山医疗器械有限公司 Controlled capsule type endoscope diagnostic and examination system based on image recognition
CN111067468A (en) * 2019-12-30 2020-04-28 北京双翼麒电子有限公司 Method, apparatus, and storage medium for controlling endoscope system
CN111091536A (en) * 2019-11-25 2020-05-01 腾讯科技(深圳)有限公司 Medical image processing method, apparatus, device, medium, and endoscope

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112075914A (en) * 2020-10-14 2020-12-15 深圳市资福医疗技术有限公司 Capsule endoscopy system
WO2022194126A1 (en) * 2021-03-19 2022-09-22 安翰科技(武汉)股份有限公司 Method for building image reading model based on capsule endoscope, device, and medium
GB2620529A (en) * 2021-05-06 2024-01-10 Prec Robotics Hong Kong Limited Method, equipment and storage medium for navigating a tubular component in a multifurcated channel
WO2022233201A1 (en) * 2021-05-06 2022-11-10 Precision Robotics (Hong Kong) Limited Method, equipment and storage medium for navigating a tubular component in a multifurcated channel
CN113240726A (en) * 2021-05-20 2021-08-10 南开大学 Real-time measurement method for optical target size under endoscope
CN114184354B (en) * 2021-10-29 2023-08-29 深圳市资福医疗技术有限公司 Method, device and storage medium for detecting optical resolution of capsule endoscope
CN114184354A (en) * 2021-10-29 2022-03-15 深圳市资福医疗技术有限公司 Method and device for detecting optical resolution of capsule endoscope and storage medium
CN114463348A (en) * 2022-01-11 2022-05-10 广州思德医疗科技有限公司 Method for completing capsule endoscope stomach shooting through posture change, capsule endoscope and terminal
CN114259197A (en) * 2022-03-03 2022-04-01 深圳市资福医疗技术有限公司 Capsule endoscope quality control method and system
CN114259197B (en) * 2022-03-03 2022-05-10 深圳市资福医疗技术有限公司 Capsule endoscope quality control method and system
WO2024008042A1 (en) * 2022-07-04 2024-01-11 安翰科技(武汉)股份有限公司 Method for quantitatively controlling first viewing angle of capsule endoscope, system and storage medium
CN115624308A (en) * 2022-12-21 2023-01-20 深圳市资福医疗技术有限公司 Capsule endoscope cruise control method, device and storage medium
CN116076995A (en) * 2023-02-03 2023-05-09 浙江势通机器人科技有限公司 Scanning control method and scanning control system for capsule endoscope
CN116076995B (en) * 2023-02-03 2023-09-01 浙江势通机器人科技有限公司 Scanning control method and scanning control system for capsule endoscope

Similar Documents

Publication Publication Date Title
CN112089392A (en) Capsule endoscope control method, device, equipment, system and storage medium
US11682195B2 (en) Digital histopathology and microdissection
CN112075914B (en) Capsule endoscopy system
CN111582207B (en) Image processing method, device, electronic equipment and storage medium
US10769788B2 (en) Few-shot learning based image recognition of whole slide image at tissue level
WO2019161813A1 (en) Dynamic scene three-dimensional reconstruction method, apparatus and system, server, and medium
US20220180521A1 (en) Image processing method and apparatus, and electronic device, storage medium and computer program
JP7061671B2 (en) How to generate at least one shape of the area of interest of a digital image and how to generate training data to train equipment and machine learning systems
JP2019512279A (en) System and method for processing multimodal images
KR102450931B1 (en) Image registration method and associated model training method, apparatus, apparatus
WO2022037259A1 (en) Image processing method and apparatus, electronic device, and computer readable storage medium
CN113256529A (en) Image processing method, image processing device, computer equipment and storage medium
CN111401193B (en) Method and device for acquiring expression recognition model, and expression recognition method and device
CN109523495A (en) Image processing method and device, equipment and storage medium
JP2021022236A (en) Classification reliability calculation device, area division device, learning device, classification reliability calculation method, learning method, classification reliability calculation program, and learning program
CN115984203A (en) Eyeball protrusion measuring method, system, terminal and medium
US12002262B2 (en) Digital histopathology and microdissection
CN109272485A (en) Method for repairing and mending, device and the electronic equipment of blood vessel three-dimensional model
WO2023053317A1 (en) Image matching apparatus, control method, and non-transitory computer-readable storage medium
CN106295563A (en) A kind of system and method airbound target flying quality assessed based on multi-vision visual
JP2023048873A (en) Information processing device, information processing method, and program
JP2020065114A (en) Video processing device, video processing method and video processing program
CN114098985A (en) Method, device, equipment and medium for spatial matching of patient and medical image of patient
JP2023091486A (en) Information processing device and information processing method
JP2022022525A (en) Information processing device, method for controlling information processing device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination