CN117440584B - Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium - Google Patents


Info

Publication number
CN117440584B
CN117440584B (Application CN202311760315.1A)
Authority
CN
China
Prior art keywords
light source, surgical, control, target, image
Prior art date
Legal status
Active
Application number
CN202311760315.1A
Other languages
Chinese (zh)
Other versions
CN117440584A (en)
Inventor
Name withheld upon request
陆汇海
Current Assignee
Shenzhen Bosheng Medical Technology Co ltd
Original Assignee
Shenzhen Bosheng Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Bosheng Medical Technology Co ltd
Priority to CN202311760315.1A
Publication of CN117440584A
Application granted
Publication of CN117440584B
Status: Active

Links

Classifications

    • H05B47/155 — Coordinated control of two or more light sources
    • H05B47/105 — Controlling the light source in response to determined parameters
    • H05B47/11 — Controlling the light source by determining the brightness or colour temperature of ambient light
    • H05B47/115 — Controlling the light source by determining the presence or movement of objects or living beings
    • H05B47/165 — Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • G05B13/027 — Adaptive control systems, the learning criterion using neural networks only
    • G06N3/0464 — Convolutional networks [CNN, ConvNet]
    • G06V10/25 — Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/454 — Integrating biologically inspired filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G06V10/62 — Extraction of image or video features relating to a temporal dimension, e.g. pattern tracking
    • G06V10/82 — Image or video recognition or understanding using neural networks
    • G06V20/36 — Indoor scenes
    • G06V20/70 — Labelling scene content, e.g. deriving syntactic or semantic representations
    • G06V2201/034 — Recognition of patterns in medical or anatomical images of medical instruments
    • Y02B20/40 — Control techniques providing energy savings, e.g. smart controller or presence detection


Abstract

The application relates to the technical field of image processing and discloses a surgical instrument segmentation auxiliary image exposure method, system, device, and storage medium. The method comprises the following steps: acquiring target light source distribution information from an intelligent surgical control system and building light source association control rules; generating surgical illumination control requirement information and creating a surgical scene light source control scheme; extracting light source illumination control change features to obtain a plurality of such features; performing dynamic tracking and scale-adaptive processing to obtain a target surgical area image, then performing image segmentation and instrument labeling to generate a surgical instrument feature map; performing image exposure compensation calculation to generate surgical instrument exposure compensation parameters; and performing error calculation and control strategy optimization to generate a target scene light source control scheme for the intelligent surgical control system, thereby optimizing its light source control.

Description

Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing, and in particular, to a method, a system, an apparatus, and a storage medium for exposing a surgical instrument segmentation auxiliary image.
Background
In a traditional surgical procedure, the doctor relies on the illumination devices and cameras in the operating room to obtain visual information about the surgical scene. However, because surgical instruments are present and viewing angles and light conditions change, surgical images can suffer from uneven illumination, instrument interference, and similar problems. These impair the doctor's accurate perception of, and decisions about, the surgical scene, increasing surgical risk and operating difficulty.
First, traditional operating-room illumination and exposure adjustments are typically performed manually by medical personnel, which can lead to inconsistency and subjective error. Second, the distribution of light sources in the operating room and the dynamic nature of the operating field make illumination and exposure adjustment more complex. Furthermore, environmental factors and external disturbances can adversely affect operating-room illumination, so adaptive handling of these disturbances must be considered. Finally, accurate segmentation and labeling of surgical instruments is itself a challenging task that requires highly accurate image processing methods. How to automate illumination and exposure control during surgery, and how to precisely segment and label surgical instruments, therefore remain important issues to be addressed in current research.
Disclosure of Invention
The application provides a surgical instrument segmentation auxiliary image exposure method, a system, equipment and a storage medium.
In a first aspect, the present application provides a surgical instrument segmentation auxiliary image exposure method, including:
acquiring target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system, and building corresponding light source association control rules according to the target light source distribution information;
generating surgical illumination control requirement information through the intelligent surgical control system, and creating a surgical scene light source control scheme according to the surgical illumination control requirement information and the light source association control rule;
performing operation process illumination control and collecting light source illumination control change parameters through the operation scene light source control scheme, and extracting light source illumination control change characteristics of the light source illumination control change parameters to obtain a plurality of light source illumination control change characteristics corresponding to each target operation light source;
performing dynamic tracking and scale-adaptive processing on a target operation area to obtain a target operation area image, and performing image segmentation and instrument labeling on the target operation area image to generate a surgical instrument feature map;
performing image exposure compensation calculation on the surgical instrument feature map according to the plurality of light source illumination control change features to generate surgical instrument exposure compensation parameters;
and performing error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters to generate a target scene light source control scheme of the intelligent surgical control system.
In a second aspect, the present application provides a surgical instrument segmentation auxiliary image exposure system comprising:
the acquisition module is used for acquiring target light source distribution information of a plurality of target surgical light sources in the intelligent surgical control system and building corresponding light source association control rules according to the target light source distribution information;
the creation module is used for generating operation illumination control requirement information through the intelligent operation control system and creating an operation scene light source control scheme according to the operation illumination control requirement information and the light source association control rule;
the extraction module is used for performing operation process illumination control through the operation scene light source control scheme, collecting light source illumination control change parameters, and performing light source illumination control change feature extraction on the light source illumination control change parameters to obtain a plurality of light source illumination control change features corresponding to each target operation light source;
the processing module is used for carrying out dynamic tracking and scale self-adaptive processing on a target operation area to obtain a target operation area image, and carrying out image segmentation and instrument marking on the target operation area image to generate a surgical instrument feature map;
the calculation module is used for carrying out image exposure compensation calculation on the surgical instrument feature map according to the plurality of light source illumination control change features to generate surgical instrument exposure compensation parameters;
and the optimization module is used for carrying out error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters, and generating a target scene light source control scheme of the intelligent surgical control system.
A third aspect of the present application provides a computer device comprising: a memory and at least one processor, the memory having instructions stored therein; the at least one processor invokes the instructions in the memory to cause the computer device to perform the surgical instrument segmentation auxiliary image exposure method described above.
A fourth aspect of the present application provides a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the surgical instrument segmentation aid image exposure method described above.
According to the above technical scheme, light source association control rules are built from the acquired target light source distribution information and illumination requirements, enabling precise control of the surgical scene. This helps ensure that the surgical field is adequately illuminated, improving the visibility and accuracy of the procedure. By monitoring ambient light influencing factors and user action influencing factors, surgical lighting control requirements can be predicted, so the illumination can be adjusted intelligently as the actual scene changes, providing a more suitable lighting effect. Dynamic tracking and scale-adaptive processing produce a high-quality target surgical area image, on which image segmentation and instrument labeling are performed. Exposure compensation is then calculated for the surgical instrument feature map from the light source illumination control change features, ensuring the instruments remain clearly visible; this reduces underexposure and overexposure problems in the image and improves image quality. A dynamic cooperative strategy is created from the actual surgical scene and the light source control requirements, allowing the light sources to cooperate intelligently to meet surgical needs and improving the efficiency and safety of the procedure.
By performing error calculation and control strategy optimization based on the surgical instrument exposure compensation parameters, the light source control can be continuously adjusted during live surgery to maintain optimal illumination and image quality. This improves both the accuracy of the surgical instrument segmentation auxiliary image exposure optimization and the intelligence of the light source control in the intelligent surgical control system.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained based on these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of one embodiment of a surgical instrument segmentation aid image exposure method in accordance with an embodiment of the present application;
FIG. 2 is a schematic representation of one embodiment of a surgical instrument segmentation aid image exposure system in accordance with an embodiment of the present application.
Detailed Description
The embodiment of the application provides a surgical instrument segmentation auxiliary image exposure method, a system, equipment and a storage medium. The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
For ease of understanding, a specific flow of an embodiment of the present application will be described below with reference to fig. 1, where an embodiment of a method for exposing a surgical instrument segmentation auxiliary image includes:
step 101, acquiring target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system, and building corresponding light source association control rules according to the target light source distribution information;
it will be appreciated that the subject of the present application may be a surgical instrument segmentation auxiliary image exposure system, or may be a terminal or server, and is not limited in this regard. The embodiment of the present application will be described by taking a server as an execution body.
Specifically, distribution information for the plurality of target surgical light sources is first acquired from the intelligent surgical control system through high-precision sensors and a data acquisition module. This information includes each light source's position, light intensity, color temperature, and so on. Next, a first light source position center and a plurality of second light source position centers of the intelligent surgical control system are determined from the acquired target light source distribution information. These position centers are treated as network nodes to build a target light source distribution network, which makes the interactions and linkage effects between the light sources easier to understand and control. Once the network is built, network node cluster analysis is performed on it. This analysis can use machine learning or data mining techniques, such as the K-means clustering algorithm, to determine similarities and associations between light sources. The clustering result reveals how different light source nodes resemble one another in spatial position and function, providing a basis for subsequent node evaluation and control strategies. A node evaluation coefficient is then calculated for each network node, reflecting the importance and effect of each light source in surgical illumination; this directly affects the priority and method of light source control. Finally, node association control analysis is performed on the target light source distribution network according to the clustering result and the node evaluation coefficients. This analysis generates node association control relationships, i.e., it determines which light sources should work in tandem during surgical illumination, as well as the control logic and coordination among them.
Based on these association control relationships, the light source association control rules for the first light source position center and the plurality of second light source position centers are analyzed, yielding the corresponding light source association control rules. These rules not only guide the regulation of the light sources during the operation but also ensure uniform, sufficient, and appropriate illumination of the surgical area, helping doctors operate and providing clearer, more accurate visual information for the image segmentation and instrument labeling processes.
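As a rough illustration of the node clustering step above, the spatial grouping of light source position centers can be sketched with a plain K-means loop. The coordinates, cluster count, and function name below are hypothetical assumptions for illustration, not values from the patent.

```python
import numpy as np

def cluster_light_sources(positions, k=2, iters=50):
    """Group light source position centers into spatial clusters (K-means sketch)."""
    pts = np.asarray(positions, dtype=float)
    # deterministic farthest-point initialisation of the cluster centers
    centers = [pts[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(pts - c, axis=1) for c in centers], axis=0)
        centers.append(pts[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        # assign each light source to its nearest cluster center
        labels = np.argmin(np.linalg.norm(pts[:, None] - centers[None], axis=2), axis=1)
        # move each center to the mean of its assigned light sources
        centers = np.array([pts[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# hypothetical layout: a main light group over the table and a secondary group aside
positions = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
labels, centers = cluster_light_sources(positions, k=2)
```

The resulting labels would feed the node evaluation and association analysis described above; a production system could equally use a library clustering routine.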
Step 102, generating operation illumination control requirement information through an intelligent operation control system, and creating an operation scene light source control scheme according to the operation illumination control requirement information and a light source association control rule;
Specifically, surgical scene information is first acquired through the intelligent surgical control system. This includes the spatial layout of the operating room, the positions of the operating table and instruments, and the ambient light influencing factors and user action influencing factors that arise during the procedure. Ambient light influencing factors include changes in natural and artificial light inside the operating room, while user action influencing factors involve the shielding and reflection of light caused by the movements of the doctor and the medical team. By monitoring these factors in real time, the system can predict surgical illumination control needs, such as increasing illumination intensity, adjusting light source direction, or changing color temperature, and thereby obtain the surgical illumination control requirement information. The surgical scene is then classified according to this information, for example as emergency, conventional, or minimally invasive surgery, each class having its own illumination requirements. This classification helps the system adjust the light source settings more precisely for each surgery type. At the same time, the system determines the dynamic cooperative strategy of the surgical light sources from the previously built light source association control rules: it intelligently adjusts the brightness, direction, and color temperature of each light source according to the specific situation of the surgical scene, so that the surgical area receives the optimal illumination effect. The dynamic cooperative strategy must also account for interactions between light sources, to avoid mutual interference or disruption of the doctor's work.
And finally, creating a comprehensive surgical scene light source control scheme by combining the surgical scene classification level and the surgical light source dynamic cooperative strategy. This approach not only ensures adequate illumination of the surgical field, but also accounts for changes that occur during the surgical procedure, such as surgical progress, surgeon position changes, etc., to adjust the light source in real time, ensuring visual clarity and accuracy during the surgical procedure.
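The mapping from a scene class and the monitored influencing factors to a light source control scheme might be sketched as a simple rule table. The class names, intensity values, and thresholds below are illustrative assumptions only, not figures given in the patent.

```python
# hypothetical base settings per surgical scene class
SCHEMES = {
    "emergency":          {"intensity": 1.0, "color_temp_k": 5000},
    "routine":            {"intensity": 0.8, "color_temp_k": 4300},
    "minimally_invasive": {"intensity": 0.6, "color_temp_k": 4000},
}

def create_scheme(scene_class, ambient_lux, shadowing):
    """Derive a light source control scheme from scene class and live factors."""
    base = dict(SCHEMES[scene_class])
    # compensate for bright ambient light washing out the surgical field
    if ambient_lux > 500:
        base["intensity"] = min(1.0, base["intensity"] + 0.1)
    # compensate for occlusion by the surgical team's movements
    if shadowing:
        base["intensity"] = min(1.0, base["intensity"] + 0.15)
    return base

scheme = create_scheme("routine", ambient_lux=600, shadowing=True)
```

A real scheme would of course be driven by the learned association rules rather than fixed constants; the sketch only shows the shape of the decision.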
Step 103, performing operation process illumination control and collecting light source illumination control change parameters through an operation scene light source control scheme, and performing light source illumination control change feature extraction on the light source illumination control change parameters to obtain a plurality of light source illumination control change features corresponding to each target operation light source;
specifically, first, surgical procedure lighting control is performed according to a surgical scene light source control scheme. The brightness, the direction and the color temperature of the light source are dynamically adjusted according to the requirements of the operation scene, so that the light rays of the operation area are uniform and proper. The sensor array in the intelligent operation control system monitors and collects the change parameters of the illumination control of the light source, such as the brightness, the energy consumption, the color and the like of the light source in real time. Then, the system classifies the collected light source illumination control variation parameters into a light source brightness variation parameter set and a light source energy consumption variation parameter set. This classification is to more effectively analyze and understand the performance and effect of the light source during the surgical procedure. And respectively performing curve fitting on the two parameter sets to obtain a light source brightness change curve and a light source energy consumption change curve. Curve fitting is a commonly used data analysis technique that helps the system understand the behavior pattern of the light source during the procedure, such as how the light source brightness and energy consumption change with the procedure and needs. Further, the system performs curve change characteristic operation on the curves to extract change characteristics of the brightness and the energy consumption of the light source. This will extract deeper, more meaningful information from the raw data, such as fluctuation patterns of the light source brightness, trend of energy consumption, etc. These features not only can help the system better understand the behavior of the light source, but can also provide references and guidance for lighting control. 
Finally, the system performs feature fusion on the brightness change features of the light source and the energy consumption change features of the light source. Feature fusion is the combining of features of different aspects together to obtain a more comprehensive, accurate description. In this way, the system is able to generate a plurality of light source illumination control variation features corresponding to each target surgical light source. These integrated features not only reflect the actual performance of the light source during the procedure, but also provide important feedback information for the intelligent surgical control system to achieve more optimal, more accurate illumination control in future procedures.
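The curve fitting, feature extraction, and feature fusion steps above can be sketched with polynomial fits over hypothetical brightness and energy consumption samples. The specific feature choices (mean, standard deviation, leading curve coefficient) and the sample data are assumptions for illustration, not the patent's.

```python
import numpy as np

def extract_control_features(t, brightness, energy, deg=2):
    """Fit change curves and fuse simple change features from both parameter sets."""
    b_coef = np.polyfit(t, brightness, deg)   # light source brightness change curve
    e_coef = np.polyfit(t, energy, deg)       # light source energy consumption change curve
    # curve change features: level, fluctuation, and leading curvature term
    b_feat = [np.mean(brightness), np.std(brightness), b_coef[0]]
    e_feat = [np.mean(energy), np.std(energy), e_coef[0]]
    # feature fusion: concatenate both aspects into one descriptor per light source
    return np.concatenate([b_feat, e_feat])

t = np.linspace(0, 10, 20)
brightness = 0.5 * t + 3.0        # hypothetical steadily rising brightness
energy = 0.02 * t**2 + 1.0        # hypothetical accelerating energy draw
features = extract_control_features(t, brightness, energy)
```

The fused vector plays the role of the "plurality of light source illumination control change features" for one target surgical light source.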
104, carrying out dynamic tracking and scale self-adaptive processing on the target operation area to obtain a target operation area image, and carrying out image segmentation and instrument labeling on the target operation area image to generate an operation instrument feature map;
specifically, firstly, an image acquisition terminal in an intelligent operation control system is used for acquiring an operation area calibration image of a target operation area to obtain the operation area calibration image. And then, performing feature recognition on the operation area calibration image through a preset CNN model. CNN models are trained to identify key features within the surgical field, such as surgical instruments, tissue types, etc., to accurately locate a target surgical tracking area. Next, a dynamic tracking of the target surgical tracking area is performed. Because the operation area can be changed along with the operation of doctors and the body position change of patients in the operation process, the dynamic tracking carries out quality evaluation through a preset tracking evaluation function, and the accuracy and the real-time performance of the tracking are ensured. The tracking quality assessment index will reflect the stability and accuracy of the tracking process. And then, performing scale self-adaptive processing on the target operation tracking area according to the tracking quality evaluation index so as to generate a target operation area image. The scale adaptive processing is to ensure that images maintain consistent recognition rate and accuracy at different scales, and in medical image processing, scale variation has a great influence on image recognition. And then, carrying out convolution operation on the target operation area image through a preset U-Net model so as to extract deeper features. The U-Net model has remarkable effect in medical image segmentation, and can extract complex characteristic information from images through convolution operation. These feature information are used to generate a first surgical area convolution feature map. And then, cross entropy loss optimization is carried out on the first operation region convolution feature map so as to further improve the accuracy of segmentation. 
Cross entropy loss is a commonly used optimization method that reduces the difference between model predictions and actual conditions, resulting in a second operation region convolution feature map. Finally, pixel-level classification and instrument labeling are performed on the second operation region convolution feature map to generate a surgical instrument feature map. Pixel-level classification ensures that each pixel is accurately labeled, and instrument labeling clearly identifies the various surgical instruments in the surgical instrument feature map.
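The tracking-quality evaluation and scale self-adaptive processing of this step can be sketched as follows. The IoU-based quality score, the 0.6 threshold, and the scale bounds are illustrative assumptions: the embodiment only specifies "a preset tracking evaluation function" and "scale self-adaptive processing" without fixing a formula.

```python
# Sketch: tracking-quality evaluation driving scale-adaptive cropping.
# The IoU quality metric and the thresholds are illustrative assumptions.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def adapt_scale(tracked_box, predicted_box, crop_scale):
    """Grow the crop when tracking quality drops, shrink it when stable."""
    quality = iou(tracked_box, predicted_box)   # tracking quality index
    if quality < 0.6:                           # unstable: widen the view
        return min(crop_scale * 1.25, 2.0), quality
    return max(crop_scale * 0.95, 1.0), quality
```

A larger crop gives the tracker more context to re-acquire an unstable target; a tight crop keeps the segmentation input at a consistent scale once tracking is stable.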
Step 105, performing image exposure compensation calculation on the surgical instrument feature map according to the illumination control change features of the plurality of light sources to generate surgical instrument exposure compensation parameters;
specifically, first, the operation region illumination brightness calculation is performed based on the plurality of light source illumination control variation characteristics. The calculation takes into account not only the brightness variation of each light source, but also the interaction between the light sources and the overall illumination distribution. Through the calculation, a comprehensive and accurate illumination brightness value of the operation area can be obtained, and the actual illumination condition of the operation scene is reflected. Next, an image exposure adjustment calculation is performed on the surgical instrument feature map. The system adjusts the exposure of the image according to the actual illumination condition of the operation area so as to ensure that the image of the operation instrument is clearly visible and has rich details. The exposure adjustment calculation takes into account not only the illumination intensity but also the color temperature and direction of the light source, in order to ensure color fidelity and contrast optimization of the image. Then, according to the adjusted exposure value, image exposure compensation calculation is carried out on the characteristic diagram of the surgical instrument, so that initial instrument exposure compensation parameters are generated, and the parameters are used as the basis of the subsequent optimization processing. The calculation of the initial instrument exposure compensation parameters requires a comprehensive consideration of the overall brightness, contrast and color saturation of the image to ensure that the visibility and identification of the surgical instrument in the image is maximized. Finally, the initial instrument exposure compensation parameters are adaptively gain controlled to generate final surgical instrument exposure compensation parameters. 
The adaptive gain control is a dynamic adjustment process that fine-adjusts exposure compensation parameters based on real-time feedback of the image to accommodate illumination changes and scene changes that occur during the procedure. In this way, the system can ensure that the image of the surgical instrument remains at the optimal exposure level throughout the surgical procedure, regardless of changes in illumination conditions.
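The surgical-area illumination brightness calculation described above can be sketched as a photometric sum over the light sources. The inverse-square falloff with a per-source incidence factor is an assumed model; the embodiment only states that the calculation combines per-source brightness, source interaction, and the overall illumination distribution.

```python
# Sketch of the operation-area illumination brightness calculation.
# Assumption: each source contributes intensity * cos_incidence / d^2
# (a standard inverse-square photometric model, not taken verbatim
# from the embodiment).

def area_brightness(sources, point=(0.0, 0.0, 0.0)):
    """Total brightness at a point from all surgical light sources.

    Each source is a dict with 'pos' (x, y, z), 'intensity', and
    'cos_incidence' (cosine of the incidence angle, already in [0, 1]).
    """
    total = 0.0
    for s in sources:
        dx, dy, dz = (s['pos'][i] - point[i] for i in range(3))
        d2 = dx * dx + dy * dy + dz * dz
        total += s['intensity'] * s['cos_incidence'] / max(d2, 1e-9)
    return total
```

The resulting brightness value is what the exposure adjustment step compares against the desired exposure level.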
Step 106, performing error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters, to generate a target scene light source control scheme of the intelligent surgical control system.
Specifically, first, error calculation is performed on the current surgical scene light source control scheme based on the surgical instrument exposure compensation parameters, and the deficiencies in the control scheme are identified and quantified. After the error calculation is completed, the system acquires control parameter error data, performs feature extraction and encoding processing on the data, and generates a control parameter error vector, which is a quantified representation of the effectiveness of the current control scheme. This control parameter error vector is then input into a preset policy optimization network. The structure of this network includes an input layer, a genetic optimization layer, and an output layer, each layer having its own specific function. At the input layer, the control parameter error vector undergoes vector normalization processing to ensure consistency and comparability of the data during subsequent processing. After the normalization process is completed, the standard parameter error vector is fed into the genetic optimization layer. At the genetic optimization layer, the system uses a genetic algorithm to perform optimization analysis of the control strategy. The genetic algorithm is an optimization method based on natural selection and genetic principles; it can explore a large number of solutions through iterative search and finally generate a plurality of candidate scene light source control schemes. These candidate schemes are evaluated and compared to determine which are most effective in reducing errors and improving the overall performance of the control scheme. Finally, control scheme optimization analysis is performed on the candidate scene light source control schemes through the output layer.
The system will evaluate the effectiveness of each candidate, taking into account factors such as illumination uniformity, energy efficiency, and the degree of support to the surgical procedure. Through the series of analysis and optimization, the system finally obtains a target scene light source control scheme of the intelligent operation control system, and the scheme not only can effectively compensate the previous exposure error, but also can provide more accurate and reliable illumination control in future operations, thereby ensuring the safety and efficiency of the operation process.
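The feature extraction and encoding that turns control parameter error data into an error vector can be sketched minimally as below. The choice of elementwise errors and peak-magnitude normalization is an illustrative assumption; the embodiment only requires a standardized vector representation of the errors.

```python
# Sketch: encoding control-parameter errors into a normalized error
# vector. Normalizing by the largest error magnitude is an assumption;
# the patent only asks for a standardized, comparable representation.

def error_vector(target, measured):
    """Elementwise error, then scaled to [-1, 1] by the peak magnitude."""
    errs = [m - t for t, m in zip(target, measured)]
    peak = max(abs(e) for e in errs) or 1.0   # avoid division by zero
    return [e / peak for e in errs]
```

The normalized vector is what the policy optimization network consumes in its input layer.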
In the embodiment of the application, the light source association control rules are constructed by acquiring the target light source distribution information and the illumination requirements, so that accurate control of the surgical scene is realized. This helps to ensure that the surgical field is adequately illuminated, thereby improving the visibility and accuracy of the surgical procedure. By monitoring the ambient light influence factors and the user action influence factors, the surgical lighting control requirements can be predicted. This allows the surgical illumination to be intelligently adjusted according to changes in the actual scene, providing a more desirable illumination effect. Through dynamic tracking and scale self-adaptive processing, a high-quality target operation area image can be generated, and image segmentation and instrument labeling can be performed on the target operation area image. Exposure compensation calculation is performed on the surgical instrument feature map using the light source illumination control change features to ensure the clear visibility of the surgical instruments. This helps to reduce underexposure or overexposure problems in the image and improves image quality. According to the actual operation scene and the light source control requirements, a dynamic cooperative strategy is created. This allows the light sources to cooperate intelligently to meet surgical needs, improving the efficiency and safety of surgical procedures.
By performing error calculation and control strategy optimization based on the surgical instrument exposure compensation parameters, the light source control can be continuously adjusted in real-time operation so as to maintain optimal illumination and image quality, thereby improving the accuracy of the surgical instrument segmentation auxiliary image exposure optimization and the intelligence of the surgical procedure light source control of the intelligent surgical control system.
In a specific embodiment, the process of executing step 101 may specifically include the following steps:
(1) Acquiring target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system;
(2) Determining a first light source position center and a plurality of second light source position centers of the intelligent operation control system according to the target light source distribution information, and constructing a corresponding target light source distribution network by taking the first light source position center and the plurality of second light source position centers as network nodes;
(3) Performing network node clustering analysis on the target light source distribution network to obtain a network node clustering result, and respectively calculating node evaluation coefficients of each network node;
(4) According to the network node clustering result and the node evaluation coefficient, performing node association control analysis on the target light source distribution network to generate a node association control relation;
(5) And analyzing the light source association control rules of the first light source position center and the plurality of second light source position centers based on the node association control relationship to obtain corresponding light source association control rules.
Specifically, first, the distribution information of all the target surgical light sources in the surgical control system is acquired. Such information is typically provided by sensors and monitoring equipment in the operating room that are capable of detecting and recording key parameters such as the position, intensity, and illumination range of the surgical light source. These data not only provide the physical location of the light sources, but also reflect their frequency and importance of use at different stages of surgery. Next, the system determines a first light source location center and a plurality of second light source location centers in the intelligent surgical control system based on the target light source distribution information. The first light source is typically the primary lighting in the operating room, while the second light source includes auxiliary lighting and specialized operating lights. The location centers of these light sources are considered as network nodes, constituting a target light source distribution network. This network is a virtual representation depicting the spatial relationship between the light sources and the underlying control logic. Next, a network node cluster analysis is performed on this target light source distribution network. This analysis helps identify key nodes in the network, i.e., those light sources that have the greatest impact on the overall surgical lighting system. By means of algorithms, such as k-means clustering, the light sources can be divided into different categories or groups, each group representing a specific type of lighting need or function. The clustering results not only reveal interdependencies between the light sources, but also help determine the role of each light source in the surgical procedure. Next, a node evaluation coefficient for each network node is calculated. 
This evaluation coefficient is determined based on the importance of the node, its frequency of use, and its role in the surgical illumination. Nodes with high evaluation coefficients are those of greatest importance in the surgical procedure. Then, based on the network node clustering result and the node evaluation coefficients, the system further performs node association control analysis to generate node association control relations. This determines how the different light sources interact with each other. For example, if two light sources are often used together, the control relationship between them will be set to a linkage so that their brightness and angle can be adjusted simultaneously. Finally, based on the node association control relations, the system analyzes the light source association control rules of the first light source position center and the plurality of second light source position centers. The abstract control relationships are converted into concrete control instructions and rules. For example, if the evaluation coefficient of a certain secondary light source indicates that it is important in a particular type of procedure, the system will adjust its brightness and orientation to maximize support for the procedure. In this way, the light source association control rules can dynamically adjust the light source settings whenever the surgical type or conditions change, to ensure optimal lighting conditions.
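The clustering and node evaluation described above can be sketched with the k-means algorithm the embodiment itself names. The weighting between node importance and usage frequency in the evaluation coefficient is an illustrative assumption; the patent only states that the coefficient combines importance, frequency of use, and role in the illumination.

```python
# Sketch: k-means clustering of light-source position centers (network
# nodes) plus a simple node-evaluation coefficient. The 0.6/0.4 weights
# are illustrative assumptions.
import math
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                      # assign to nearest center
            j = min(range(k), key=lambda i: math.dist(p, centers[i]))
            groups[j].append(p)
        centers = [                           # recompute centroids
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

def node_score(importance, use_freq, w=0.6):
    """Weighted node-evaluation coefficient in [0, 1]."""
    return w * importance + (1 - w) * use_freq
```

Each resulting group then corresponds to one lighting function (e.g. primary versus auxiliary illumination), and high-scoring nodes drive the association control relations.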
In a specific embodiment, the process of executing step 102 may specifically include the following steps:
(1) Acquiring operation scene information through an intelligent operation control system, and monitoring environment light influence factors and user action influence factors corresponding to the operation scene information;
(2) Performing operation illumination control demand prediction through the environmental light influence factors and the user action influence factors to obtain operation illumination control demand information;
(3) Determining a surgical scene classification level of the surgical scene information according to the surgical illumination control requirement information, and determining a surgical light source dynamic cooperative strategy of the surgical scene information according to the light source association control rule;
(4) And creating a surgical scene light source control scheme of the intelligent surgical control system according to the surgical scene classification level and the surgical light source dynamic cooperative strategy.
Specifically, first, surgical scene information is acquired through the intelligent surgical control system. Such information includes the size and shape of the surgical area, and the location and actions of the surgical team. At the same time, changes in ambient light, such as the angle of incidence and intensity of natural light and the lighting conditions inside the operating room, are monitored, and user actions, i.e., the movements and operations of the surgical team members, are tracked in real time. Then, surgical illumination control requirement prediction is performed using the ambient light influence factors and the user action influence factors to obtain surgical illumination control requirement information; this prediction considers not only the current illumination conditions but also anticipates future changes during the operation, such as the illumination requirements when a surgical team member changes position. With this prediction, the system can adjust the illumination settings in advance to ensure that the surgical field remains in the ideal illumination state at all times. Next, a surgical scene classification level of the surgical scene information is determined according to the surgical illumination control requirement information. Surgical scene classification levels are assigned based on the complexity of the surgery, its urgency, and the sensitivity of the lighting requirements. For example, procedures requiring high precision and complex manipulation, such as neurosurgery, are classified at a higher grade because they place greater demands on the accuracy and stability of illumination, while some conventional procedures are classified at a lower grade. Meanwhile, the system determines the dynamic cooperative strategy of the surgical light sources according to the light source association control rules.
This means that the system will not only take into account the control of a single light source, but also how a plurality of light sources cooperate with each other for optimal lighting. Finally, the system creates a surgical scene light source control scheme based on the surgical scene classification level and the surgical light source dynamic collaboration policy. This solution is an overall lighting strategy that includes not only specific settings of the brightness and direction of the light source, but also a plan for dynamic adjustment of the light source. The system can automatically adjust illumination according to real-time change in the operation process, and ensures that the operation area is always in an optimal illumination state.
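The mapping from predicted lighting demand to a scene classification level and a control scheme can be sketched as a small rule-based function. The level thresholds, the weighting of precision versus urgency, and the scheme fields are all illustrative assumptions; the embodiment specifies only that levels follow complexity, urgency, and lighting sensitivity.

```python
# Sketch: scene classification level and light-source control scheme.
# All thresholds and scheme fields are illustrative assumptions.

def classify_scene(precision_demand, urgency):
    """Higher demand or urgency -> higher classification level (1..3)."""
    score = 0.7 * precision_demand + 0.3 * urgency
    return 3 if score > 0.75 else 2 if score > 0.4 else 1

def control_scheme(level, cooperative_pairs):
    """Assemble an overall lighting strategy for the given level."""
    return {
        'level': level,
        'main_brightness': {1: 0.6, 2: 0.8, 3: 1.0}[level],
        'linked_sources': cooperative_pairs,  # from the association rules
        'auto_adjust': level >= 2,            # dynamic re-adjustment plan
    }
```

A hypothetical neurosurgery case with high precision demand would land at level 3 and full main-light brightness, while a routine procedure stays at level 1.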
In a specific embodiment, the process of executing step 103 may specifically include the following steps:
(1) Performing operation process illumination control through an operation scene light source control scheme, and acquiring corresponding light source illumination control change parameters through a sensor array in an intelligent operation control system;
(2) Classifying parameters of the light source illumination control variation parameters to obtain a light source brightness variation parameter set and a light source energy consumption variation parameter set;
(3) Performing curve fitting on the light source brightness change parameter set to obtain a light source brightness change curve, and performing curve fitting on the light source energy consumption change parameter set to obtain a light source energy consumption change curve;
(4) Performing curve change characteristic operation on the light source brightness change curve to obtain a plurality of light source brightness change characteristics, and performing curve change characteristic operation on the light source energy consumption change curve to obtain a plurality of light source energy consumption change characteristics;
(5) And carrying out feature fusion on the brightness change features of the light sources and the energy consumption change features of the light sources to generate a plurality of light source illumination control change features corresponding to each target operation light source.
Specifically, first, surgical procedure lighting control is performed through the surgical scene light source control scheme, which governs the brightness, direction, color, and other properties of the light sources. At the same time, the sensor array in the system acquires, in real time, the variation parameters related to the light source illumination control. These sensors are highly sensitive and can detect small changes in the brightness of the light sources, fluctuations in power consumption, and other important parameters. Next, the system classifies the collected light source illumination control variation parameters to separate out different types of data sets. The light source brightness change parameter set records the change in the brightness of each light source, which helps to ensure that the operation area always maintains proper illumination. The light source energy consumption change parameter set reflects the variation in energy consumption of each light source during the procedure, which helps to evaluate the energy efficiency and sustained operating capability of the system. The system then performs a curve fit on the two parameter sets. Through curve fitting, the system converts the dispersed parameter points into continuous curves, which helps in analyzing and predicting the behavior pattern of each light source. The light source brightness change curve reveals the trend of light source brightness over time, and the light source energy consumption change curve shows the fluctuation of energy consumption over time. These curves not only help in understanding the current lighting state, but also in predicting future problems such as insufficient brightness or excessive power consumption. The system then performs change feature operations on the curves, extracting key feature parameters such as the peak, trough, and average of brightness, and the peak and stability of energy consumption.
The brightness change characteristic of the light source reflects the brightness adjustment capability of the light source in different operation stages, and the energy consumption change characteristic of the light source reflects the efficiency and stability of the light source in continuous use. Finally, the system performs feature fusion, combines the brightness change features and the energy consumption change features of a plurality of light sources, and generates a plurality of corresponding light source illumination control change features for each target operation light source. The result of this fusion is a comprehensive feature set that takes into account not only the lighting effect of the light source, but also its energy efficiency. In this way, the system is able to comprehensively evaluate the performance of each light source and make more accurate and efficient control decisions accordingly.
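The curve fitting, feature operation, and feature fusion steps above can be sketched as follows. A straight-line least-squares fit stands in for the unspecified fitting method, and the peak/trough/mean feature set mirrors the features the embodiment names; both are simplifying assumptions (a real system might fit splines).

```python
# Sketch: fit sampled brightness/energy readings, extract curve-change
# features, and fuse them per light source. The linear fit is a
# simplifying assumption.

def fit_line(ts, ys):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    num = sum((t - mt) * (y - my) for t, y in zip(ts, ys))
    den = sum((t - mt) ** 2 for t in ts) or 1.0
    slope = num / den
    return slope, my - slope * mt

def curve_features(ys):
    """Peak, trough and mean of a fitted curve's samples."""
    return {'peak': max(ys), 'trough': min(ys), 'mean': sum(ys) / len(ys)}

def fuse(brightness_feats, energy_feats):
    """Feature fusion: one flat feature dict per light source."""
    out = {f'brightness_{k}': v for k, v in brightness_feats.items()}
    out.update({f'energy_{k}': v for k, v in energy_feats.items()})
    return out
```

The fused dictionary is one light source's "illumination control change features" as used by the exposure compensation step.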
In a specific embodiment, the process of executing step 104 may specifically include the following steps:
(1) The method comprises the steps that through an image acquisition terminal in an intelligent operation control system, an operation area calibration image is acquired for a target operation area, and an operation area calibration image is obtained;
(2) Performing feature recognition on the operation region calibration image through a preset CNN model to obtain a target operation tracking region;
(3) Dynamically tracking the target operation tracking area, and performing tracking quality evaluation through a preset tracking evaluation function to obtain a tracking quality evaluation index;
(4) Performing scale self-adaptive processing on the target operation tracking area according to the tracking quality evaluation index to obtain a target operation area image;
(5) Performing convolution operation on the target operation area image through a preset U-Net model to obtain a first operation area convolution feature map, and performing cross entropy loss optimization on the first operation area convolution feature map to obtain a second operation area convolution feature map;
(6) And carrying out pixel-level classification and instrument labeling on the convolution feature map of the second operation region to generate a surgical instrument feature map.
Specifically, firstly, an operation area is accurately calibrated and image acquisition is performed through an image acquisition terminal in the system. For example, a high definition camera or other advanced image acquisition device is used to capture detailed visual information of the surgical field. The acquisition of the calibration image of the surgical area is the basis of the whole process, and ensures the precision and reliability of the subsequent processing. These images are then feature identified using a pre-set Convolutional Neural Network (CNN) model, which accurately identifies the target surgical tracking area from the images, which typically includes surgical incisions, instrument positions, and other critical surgical structures. The CNN model can improve accuracy and reliability of recognition by learning a large amount of surgical image data. The system then dynamically tracks the identified target surgical tracking area. Since the target area may change during the procedure due to the surgical procedure or small movements of the patient, dynamic tracking ensures that the system can update the position and status of the target area in real time. While tracking dynamically, the system evaluates the quality of tracking through a preset tracking evaluation function. These evaluation functions typically include indicators of tracking accuracy, stability, and responsiveness to rapid movements, which together form a tracking quality evaluation indicator. And then, the system performs scale self-adaptive processing on the target operation tracking area according to the tracking quality evaluation index. The system automatically adjusts the tracking scale and precision according to the size, shape and dynamic change of the target area so as to ensure that the target area can be accurately tracked all the time. And then, carrying out convolution operation on the target operation area image by using a preset U-Net model. 
U-Net is a convolutional neural network structure suitable for medical image segmentation that can effectively separate regions of interest from complex medical images. By convolving the target surgical field image, the system generates a first surgical field convolution feature map. The system then performs cross entropy loss optimization on this feature map, a technique commonly used to improve image segmentation accuracy. The second operation area convolution characteristic diagram obtained after optimization is finer and more accurate. And finally, carrying out pixel-level classification and instrument labeling on the convolution characteristic image of the second operation region. Image processing techniques are used to identify and mark surgical instruments, tissue types, and other critical surgical features. Pixel level classification ensures that each pixel is accurately assigned to a corresponding class, making details of the surgical instrument and associated tissue more clearly visible in the image. With this precise image processing, the system is able to generate the final surgical instrument feature map, which is the key to subsequent surgical navigation and support.
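The pixel-level classification and cross-entropy optimization of this step can be sketched on a per-pixel logit map, leaving the U-Net that produces the logits out of scope. The three-class labelling (background / tissue / instrument) is an illustrative assumption.

```python
# Sketch: pixel-level classification of a per-pixel logit map and the
# cross-entropy loss used to optimize it. The class set is an
# illustrative assumption; the U-Net producing the logits is omitted.
import math

CLASSES = ['background', 'tissue', 'instrument']

def softmax(logits):
    m = max(logits)                      # shift for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def label_pixels(logit_map):
    """Argmax class name per pixel -> the instrument-labelled map."""
    return [[CLASSES[max(range(len(p)), key=p.__getitem__)] for p in row]
            for row in logit_map]

def cross_entropy(logit_map, target_map):
    """Mean per-pixel cross-entropy against integer class targets."""
    total, n = 0.0, 0
    for lrow, trow in zip(logit_map, target_map):
        for logits, t in zip(lrow, trow):
            total -= math.log(softmax(logits)[t])
            n += 1
    return total / n
```

Minimizing this loss is what sharpens the first convolution feature map into the second; the argmax labelling then yields the surgical instrument feature map.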
In a specific embodiment, the process of executing step 105 may specifically include the following steps:
(1) Calculating the illumination brightness of the operation area according to the illumination control change characteristics of the plurality of light sources to obtain an illumination brightness value of the operation area;
(2) According to the illumination brightness value of the operation area, performing image exposure adjustment calculation on the characteristic diagram of the operation instrument to obtain an adjusted exposure value;
(3) Performing image exposure compensation calculation on the surgical instrument feature map according to the adjusted exposure value to generate initial instrument exposure compensation parameters;
(4) And performing adaptive gain control on the initial instrument exposure compensation parameters to generate surgical instrument exposure compensation parameters.
Specifically, first, the illumination control change features of the plurality of light sources are comprehensively analyzed to calculate the illumination brightness value of the operation area. The calculation process considers not only the brightness and position of each light source, but also their illumination angles and mutual occlusion. In this way, the system can accurately estimate the total brightness of the surgical field and ensure that it meets the visual requirements of the surgery. Then, according to the calculated illumination brightness value of the operation area, the system performs image exposure adjustment calculation on the surgical instrument feature map to ensure that instrument and tissue details in the image are clearly visible under any illumination condition. If the illumination intensity is too high or too low, the system adjusts the exposure value of the image accordingly to prevent overexposure or underexposure. For example, if the illumination suddenly increases during surgery, the system may decrease the exposure value of the image to avoid losing detail due to overexposure. Then, based on the adjusted exposure value, the system performs image exposure compensation calculation on the surgical instrument feature map, generating initial instrument exposure compensation parameters that further optimize the visual effect of the image. This compensation calculation takes into account factors such as the contrast, color saturation, and detail sharpness of the image. For example, if a surgical instrument appears too dark in the image, the system can increase its brightness by adjusting the compensation parameters so that it is more clearly visible. Finally, the system applies adaptive gain control to the initial instrument exposure compensation parameters to generate the final surgical instrument exposure compensation parameters.
Adaptive gain control means that the system will automatically adjust the compensation parameters based on the quality of the real-time image and the change in the surgical conditions. This adaptive mechanism ensures that the quality of the image is always guaranteed regardless of the change in the surgical conditions. For example, if the illumination conditions change during surgery, or the viewing angle of the surgical field is adjusted, the system adjusts the compensation parameters in real time to ensure image continuity and sharpness.
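The initial compensation and the adaptive gain control above can be sketched as a simple feedback step. The target mean brightness, the proportional gain, and the per-frame clamping range are illustrative assumptions; the embodiment describes only "dynamic adjustment based on real-time feedback of the image".

```python
# Sketch: initial exposure compensation plus adaptive gain control.
# target=0.5 mean brightness, gain=0.4 and the 0.25 clamp are
# illustrative assumptions.

def exposure_compensation(mean_brightness, target=0.5):
    """Initial compensation: signed error toward the target level."""
    return target - mean_brightness           # > 0 brighten, < 0 darken

def adaptive_gain(comp, feedback_error, gain=0.4, limit=0.25):
    """Scale the step by recent feedback and clamp the per-frame change."""
    step = gain * comp * (1.0 + abs(feedback_error))
    return max(-limit, min(limit, step))
```

The clamp keeps any single frame's correction small, so the displayed image never jumps visibly even when lighting changes abruptly.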
In a specific embodiment, the process of executing step 106 may specifically include the following steps:
(1) Performing error calculation on a surgical scene light source control scheme based on the surgical instrument exposure compensation parameters to obtain control parameter error data, and performing feature extraction and coding processing on the control parameter error data to generate a control parameter error vector;
(2) Inputting the control parameter error vector into a preset strategy optimization network, wherein the strategy optimization network comprises: an input layer, a genetic optimization layer, and an output layer;
(3) Carrying out vector standardization processing on the control parameter error vector through an input layer to obtain a standard parameter error vector;
(4) Inputting the standard parameter error vector into a genetic optimization layer, and performing control strategy optimization analysis through a genetic algorithm in the genetic optimization layer to generate a plurality of candidate scene light source control schemes;
(5) And carrying out control scheme optimization analysis on the plurality of candidate scene light source control schemes through the output layer to obtain a target scene light source control scheme of the intelligent operation control system.
Specifically, first, an error calculation is performed on the current surgical scene light source control scheme based on the exposure compensation parameters of the surgical instrument, and problems or deficiencies in the existing illumination control scheme are identified, such as certain areas being too bright or too dark. Through this error calculation, the system obtains control parameter error data reflecting the differences between the existing lighting control scheme and the ideal state. Next, feature extraction and encoding processing are performed on the control parameter error data to generate a control parameter error vector. The aim is to convert complex error data into a format better suited to subsequent processing. Through feature extraction, the system identifies the most critical error features, which the encoding process converts into a standardized mathematical representation, the error vector. This control parameter error vector is then input into a preset policy optimization network. This network is a specially designed artificial intelligence system that includes an input layer, a genetic optimization layer, and an output layer. At the input layer, the control parameter error vector is vector-normalized to ensure that the data can be used effectively in the next stage of processing. The normalization process involves adjusting the size and range of the data to accommodate the other parts of the network. Then, the normalized parameter error vector is input into the genetic optimization layer, where the system uses a genetic algorithm to perform control strategy optimization analysis. Genetic algorithms are optimization techniques inspired by biological evolution that iteratively improve solutions by modeling natural selection and genetic mechanisms.
In this process, the system generates a plurality of candidate scene illuminant control schemes and continuously optimizes these schemes through selection, crossover and mutation operations in the genetic algorithm. Finally, the control scheme optimization analysis is carried out on the plurality of candidate scene light source control schemes through the output layer so as to determine a final target scene light source control scheme. By comparing the effects of different schemes, such as illumination uniformity, energy efficiency, support for surgical operation, etc., the system selects the illumination control scheme that best suits the current surgical needs.
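The genetic optimization layer's selection, crossover, and mutation operations can be sketched as below. A candidate scheme is modeled as a vector of per-source brightness values, and the fitness function (penalizing exposure error against the desired levels plus non-uniformity) and all GA hyper-parameters are illustrative assumptions.

```python
# Sketch of the genetic-optimization layer. Fitness terms, population
# size, generation count and mutation rate are illustrative assumptions.
import random

def fitness(scheme, desired):
    """Higher is better: penalize exposure error and non-uniformity."""
    exposure_err = sum(abs(b - d) for b, d in zip(scheme, desired))
    mean = sum(scheme) / len(scheme)
    uniformity = sum((b - mean) ** 2 for b in scheme)
    return -(exposure_err + 0.5 * uniformity)

def optimise(desired, pop=30, gens=40, seed=1):
    rng = random.Random(seed)
    n = len(desired)
    P = [[rng.random() for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda s: fitness(s, desired), reverse=True)
        elite = P[: pop // 2]                          # selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n) if n > 1 else 0  # crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                     # mutation
                i = rng.randrange(n)
                child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        P = elite + children
    return max(P, key=lambda s: fitness(s, desired))
```

Because the top half of each generation is carried over unchanged, the best candidate scheme never regresses; the output layer's role then reduces to ranking the final population by fitness.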
The surgical instrument segmentation auxiliary image exposure method in the embodiment of the present application is described above; the surgical instrument segmentation auxiliary image exposure system in the embodiment of the present application is described below. Referring to fig. 2, an embodiment of the surgical instrument segmentation auxiliary image exposure system in the embodiment of the present application includes:
an acquisition module 201, configured to acquire target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system, and build corresponding light source association control rules according to the target light source distribution information;
a creation module 202, configured to generate, by using the intelligent surgical control system, surgical illumination control requirement information, and create a surgical scene light source control scheme according to the surgical illumination control requirement information and the light source association control rule;
the extraction module 203 is configured to perform operation process illumination control according to the operation scene light source control scheme, collect light source illumination control variation parameters, and perform light source illumination control variation feature extraction on the light source illumination control variation parameters to obtain a plurality of light source illumination control variation features corresponding to each target operation light source;
the processing module 204 is used for dynamically tracking and performing scale self-adaptive processing on a target operation area to obtain a target operation area image, and performing image segmentation and instrument labeling on the target operation area image to generate a surgical instrument feature map;
the calculation module 205 is configured to perform image exposure compensation calculation on the surgical instrument feature map according to the plurality of light source illumination control variation features, and generate surgical instrument exposure compensation parameters;
and the optimization module 206 is used for performing error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters, and generating a target scene light source control scheme of the intelligent surgical control system.
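The division of labor among modules 201-206 can be summarized in code. Every function body below is a placeholder standing in for the corresponding module (the real rule building, segmentation, and optimization are far more involved), so only the data flow between modules is meaningful:

```python
def build_association_rules(distribution):
    # Module 201 (stub): pair each secondary source with the primary center.
    primary, secondaries = distribution[0], distribution[1:]
    return [(primary, s) for s in secondaries]

def create_control_scheme(rules, demand):
    # Module 202 (stub): one brightness setting per associated source.
    return {s: demand for _, s in rules}

def extract_variation_features(scheme):
    # Module 203 (stub): placeholder "variation features" per source.
    return {s: [v, v * 0.9] for s, v in scheme.items()}

def segment_and_label(frame):
    # Module 204 (stub): placeholder instrument mask via a brightness threshold.
    return [[1 if px > 128 else 0 for px in row] for row in frame]

def compensate_exposure(mask, feats):
    # Module 205 (stub): mean brightness feature scaled by instrument coverage.
    coverage = sum(map(sum, mask)) / (len(mask) * len(mask[0]))
    mean_feat = sum(f[0] for f in feats.values()) / len(feats)
    return coverage * mean_feat

def optimize_scheme(scheme, compensation):
    # Module 206 (stub): nudge every setting toward the compensation value.
    return {s: (v + compensation) / 2 for s, v in scheme.items()}

def run_exposure_pipeline(distribution, demand, frame):
    """Wire the six modules together in the order described above."""
    rules = build_association_rules(distribution)          # module 201
    scheme = create_control_scheme(rules, demand)          # module 202
    feats = extract_variation_features(scheme)             # module 203
    mask = segment_and_label(frame)                        # module 204
    compensation = compensate_exposure(mask, feats)        # module 205
    return optimize_scheme(scheme, compensation)           # module 206
```

The point of the sketch is the closed loop: the control scheme created upstream is fed back into the optimization module together with the exposure compensation derived from the segmented image.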
Through the cooperation of the above components, the light source association control rule is constructed by acquiring the target light source distribution information and the illumination requirement, so that accurate control of the operation scene is realized. This helps to ensure that the surgical field is adequately illuminated, thereby improving the visibility and accuracy of the surgical procedure. By monitoring the ambient light influencing factors and the user action influencing factors, the surgical lighting control requirements can be predicted. This allows the surgical illumination to be intelligently adjusted according to changes in the actual scene, providing a more desirable illumination effect. Through dynamic tracking and scale self-adaptive processing, a high-quality target operation area image can be generated, and image segmentation and instrument labeling can be performed on the target operation area image. Exposure compensation calculation is performed on the surgical instrument feature map through the light source illumination control change features to ensure the clear visibility of the surgical instrument. This helps to reduce underexposure or overexposure problems in the image and improves image quality. According to the actual operation scene and the light source control requirement, a dynamic cooperative strategy is created. This allows the light sources to cooperate intelligently to meet surgical needs, improving the efficiency and safety of surgical procedures.
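A toy version of the exposure-compensation step can make the idea concrete: measure mean brightness over the instrument pixels and derive a clamped multiplicative gain. The target brightness and the clamping range below are illustrative assumptions, not parameters from the patent:

```python
def exposure_compensation(feature_map, mask, target=0.5, gain_limit=2.0):
    """Derive a multiplicative exposure gain from the mean brightness of
    the instrument pixels (mask == 1), clamped to [1/gain_limit, gain_limit]."""
    vals = [v for row_v, row_m in zip(feature_map, mask)
            for v, m in zip(row_v, row_m) if m]
    if not vals:
        return 1.0  # no instrument pixels: leave exposure unchanged
    mean = sum(vals) / len(vals)
    gain = target / max(mean, 1e-6)
    # Clamping plays the role of the adaptive gain control mentioned above.
    return min(gain_limit, max(1.0 / gain_limit, gain))
```

An underexposed instrument region (mean brightness 0.25 against a target of 0.5) thus yields a gain of 2.0, brightening the instrument without letting the correction run away.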
By performing error calculation and control strategy optimization based on the surgical instrument exposure compensation parameters, the light source control can be continuously adjusted in real-time operation so as to maintain optimal illumination and image quality, thereby improving the accuracy of the surgical instrument segmentation auxiliary image exposure optimization and the intelligence of the surgical procedure light source control of the intelligent surgical control system.
The present application also provides a computer device including a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, cause the processor to perform the steps of the surgical instrument segmentation auxiliary image exposure method in the above embodiments.
The present application also provides a computer readable storage medium, which may be a non-volatile computer readable storage medium, and may also be a volatile computer readable storage medium, where instructions are stored in the computer readable storage medium, which when executed on a computer, cause the computer to perform the steps of the surgical instrument segmentation auxiliary image exposure method.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not repeated herein.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are merely for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (10)

1. A surgical instrument segmentation auxiliary image exposure method, characterized in that the surgical instrument segmentation auxiliary image exposure method comprises:
acquiring target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system, and building corresponding light source association control rules according to the target light source distribution information;
generating surgical illumination control requirement information through the intelligent surgical control system, and creating a surgical scene light source control scheme according to the surgical illumination control requirement information and the light source association control rule;
performing operation process illumination control and collecting light source illumination control change parameters through the operation scene light source control scheme, and extracting light source illumination control change characteristics of the light source illumination control change parameters to obtain a plurality of light source illumination control change characteristics corresponding to each target operation light source;
performing dynamic tracking and scale self-adaptive processing on a target operation area to obtain a target operation area image, and performing image segmentation and instrument labeling on the target operation area image to generate an operation instrument feature map;
performing image exposure compensation calculation on the surgical instrument feature map according to the plurality of light source illumination control change features to generate surgical instrument exposure compensation parameters;
and performing error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters to generate a target scene light source control scheme of the intelligent surgical control system.
2. The method for exposing a surgical instrument segmentation auxiliary image according to claim 1, wherein the steps of obtaining target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system and building corresponding light source association control rules according to the target light source distribution information include:
acquiring target light source distribution information of a plurality of target surgical light sources in an intelligent surgical control system;
determining a first light source position center and a plurality of second light source position centers of the intelligent operation control system according to the target light source distribution information, and constructing a corresponding target light source distribution network by taking the first light source position center and the plurality of second light source position centers as network nodes;
performing network node clustering analysis on the target light source distribution network to obtain a network node clustering result, and respectively calculating node evaluation coefficients of each network node;
according to the network node clustering result and the node evaluation coefficient, performing node association control analysis on the target light source distribution network to generate a node association control relation;
and analyzing the light source association control rules of the first light source position center and the plurality of second light source position centers based on the node association control relation to obtain corresponding light source association control rules.
3. The surgical instrument segmentation aid image exposure method according to claim 1, wherein the generating, by the intelligent surgical control system, surgical illumination control requirement information and creating a surgical scene light source control scheme according to the surgical illumination control requirement information and the light source association control rule includes:
acquiring operation scene information through the intelligent operation control system, and monitoring environment light influence factors and user action influence factors corresponding to the operation scene information;
performing operation illumination control demand prediction through the environment light influence factors and the user action influence factors to obtain operation illumination control demand information;
determining a surgical scene classification level of the surgical scene information according to the surgical illumination control requirement information, and determining a surgical light source dynamic cooperative strategy of the surgical scene information according to the light source association control rule;
and creating a surgical scene light source control scheme of the intelligent surgical control system according to the surgical scene classification level and the surgical light source dynamic cooperative strategy.
4. The method for exposing a surgical instrument segmentation auxiliary image according to claim 1, wherein the performing surgical procedure illumination control and collecting a light source illumination control variation parameter by the surgical scene light source control scheme, and performing light source illumination control variation feature extraction on the light source illumination control variation parameter to obtain a plurality of light source illumination control variation features corresponding to each target surgical light source, includes:
performing operation process illumination control through the operation scene light source control scheme, and acquiring corresponding light source illumination control change parameters through a sensor array in the intelligent operation control system;
performing parameter classification on the light source illumination control variation parameters to obtain a light source brightness variation parameter set and a light source energy consumption variation parameter set;
performing curve fitting on the light source brightness change parameter set to obtain a light source brightness change curve, and performing curve fitting on the light source energy consumption change parameter set to obtain a light source energy consumption change curve;
performing curve change characteristic operation on the light source brightness change curve to obtain a plurality of light source brightness change characteristics, and performing curve change characteristic operation on the light source energy consumption change curve to obtain a plurality of light source energy consumption change characteristics;
and carrying out feature fusion on the brightness change features of the light sources and the energy consumption change features of the light sources to generate a plurality of light source illumination control change features corresponding to each target operation light source.
5. The method for exposing a surgical instrument segmentation auxiliary image according to claim 1, wherein the dynamically tracking and scale adaptive processing are performed on a target surgical area to obtain a target surgical area image, and the image segmentation and instrument labeling are performed on the target surgical area image to generate a surgical instrument feature map, and the method comprises the steps of:
acquiring an operation area calibration image of a target operation area through an image acquisition terminal in the intelligent operation control system to obtain the operation area calibration image;
performing feature recognition on the operation region calibration image through a preset CNN model to obtain a target operation tracking region;
dynamically tracking the target operation tracking area, and performing tracking quality evaluation through a preset tracking evaluation function to obtain a tracking quality evaluation index;
performing scale self-adaptive processing on the target operation tracking area according to the tracking quality evaluation index to obtain a target operation area image;
performing convolution operation on the target operation area image through a preset U-Net model to obtain a first operation area convolution feature map, and performing cross entropy loss optimization on the first operation area convolution feature map to obtain a second operation area convolution feature map;
and carrying out pixel-level classification and instrument labeling on the second operation region convolution feature image to generate a surgical instrument feature image.
6. The surgical instrument segmentation assist image exposure method as set forth in claim 1, wherein the performing an image exposure compensation calculation on the surgical instrument feature map based on the plurality of light source illumination control variation features to generate surgical instrument exposure compensation parameters includes:
calculating the illumination brightness of the operation area according to the illumination control change characteristics of the light sources to obtain an illumination brightness value of the operation area;
according to the illumination brightness value of the operation area, performing image exposure adjustment calculation on the operation instrument feature map to obtain an adjusted exposure value;
performing image exposure compensation calculation on the surgical instrument feature map according to the adjusted exposure value to generate initial instrument exposure compensation parameters;
and performing adaptive gain control on the initial instrument exposure compensation parameters to generate surgical instrument exposure compensation parameters.
7. The surgical instrument segmentation assist image exposure method as set forth in claim 1, wherein the performing error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameter to generate a target scene light source control scheme of the intelligent surgical control system includes:
performing error calculation on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters to obtain control parameter error data, and performing feature extraction and coding processing on the control parameter error data to generate a control parameter error vector;
inputting the control parameter error vector into a preset strategy optimization network, wherein the strategy optimization network comprises: an input layer, a genetic optimization layer, and an output layer;
carrying out vector normalization processing on the control parameter error vector through the input layer to obtain a standard parameter error vector;
inputting the standard parameter error vector into the genetic optimization layer, and performing control strategy optimization analysis through a genetic algorithm in the genetic optimization layer to generate a plurality of candidate scene light source control schemes;
and carrying out control scheme optimization analysis on the plurality of candidate scene light source control schemes through the output layer to obtain a target scene light source control scheme of the intelligent operation control system.
8. A surgical instrument segmentation auxiliary image exposure system, the surgical instrument segmentation auxiliary image exposure system comprising:
the acquisition module is used for acquiring target light source distribution information of a plurality of target surgical light sources in the intelligent surgical control system and building corresponding light source association control rules according to the target light source distribution information;
the creation module is used for generating operation illumination control requirement information through the intelligent operation control system and creating an operation scene light source control scheme according to the operation illumination control requirement information and the light source association control rule;
the extraction module is used for carrying out operation process illumination control through the operation scene light source control scheme, collecting light source illumination control change parameters, and carrying out light source illumination control change feature extraction on the light source illumination control change parameters to obtain a plurality of light source illumination control change features corresponding to each target operation light source;
the processing module is used for carrying out dynamic tracking and scale self-adaptive processing on a target operation area to obtain a target operation area image, and carrying out image segmentation and instrument marking on the target operation area image to generate a surgical instrument feature map;
the calculation module is used for carrying out image exposure compensation calculation on the surgical instrument feature map according to the plurality of light source illumination control change features to generate surgical instrument exposure compensation parameters;
and the optimization module is used for carrying out error calculation and control strategy optimization on the surgical scene light source control scheme based on the surgical instrument exposure compensation parameters, and generating a target scene light source control scheme of the intelligent surgical control system.
9. A computer device, the computer device comprising: a memory and at least one processor, the memory having instructions stored therein;
the at least one processor invokes the instructions in the memory to cause the computer device to perform the surgical instrument segmentation auxiliary image exposure method of any one of claims 1-7.
10. A computer readable storage medium having instructions stored thereon, which when executed by a processor, implement the surgical instrument segmentation aid image exposure method of any of claims 1-7.
CN202311760315.1A 2023-12-20 2023-12-20 Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium Active CN117440584B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311760315.1A CN117440584B (en) 2023-12-20 2023-12-20 Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311760315.1A CN117440584B (en) 2023-12-20 2023-12-20 Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117440584A CN117440584A (en) 2024-01-23
CN117440584B true CN117440584B (en) 2024-02-20

Family

ID=89553949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311760315.1A Active CN117440584B (en) 2023-12-20 2023-12-20 Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117440584B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118283889B (en) * 2024-06-03 2024-08-23 诚峰智能光环境科技(江苏)有限公司 Self-adaptive control system of shadowless lamp

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104320881A (en) * 2014-10-28 2015-01-28 江苏天语雅思医疗设备有限公司 Intelligent dimming controller in LED shadowless lamp lighting system
CN106456272A (en) * 2014-03-17 2017-02-22 直观外科手术操作公司 Surgical system including a non-white light general illuminator
CN115379128A (en) * 2022-08-15 2022-11-22 Oppo广东移动通信有限公司 Exposure control method and device, computer readable medium and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI285047B (en) * 2005-11-24 2007-08-01 Sunplus Technology Co Ltd Method of automatic exposure control and automatic exposure compensated apparatus
IL262619B (en) * 2018-10-25 2020-05-31 Beyeonics Surgical Ltd System and method to automatically adjust illumination during a microsurgical procedure
US20210258507A1 (en) * 2019-11-14 2021-08-19 Transenterix Surgical, Inc. Method and system for depth-based illumination correction

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106456272A (en) * 2014-03-17 2017-02-22 直观外科手术操作公司 Surgical system including a non-white light general illuminator
CN104320881A (en) * 2014-10-28 2015-01-28 江苏天语雅思医疗设备有限公司 Intelligent dimming controller in LED shadowless lamp lighting system
CN115379128A (en) * 2022-08-15 2022-11-22 Oppo广东移动通信有限公司 Exposure control method and device, computer readable medium and electronic equipment

Also Published As

Publication number Publication date
CN117440584A (en) 2024-01-23

Similar Documents

Publication Publication Date Title
Wang et al. Area determination of diabetic foot ulcer images using a cascaded two-stage SVM-based classification
CN117440584B (en) Surgical instrument segmentation auxiliary image exposure method, system, equipment and storage medium
Fuhl et al. Explainable online validation of machine learning models for practical applications
Sharma et al. Brain tumor segmentation using genetic algorithm and artificial neural network fuzzy inference system (ANFIS)
CN111079833B (en) Image recognition method, image recognition device and computer-readable storage medium
Ebrahimzadeh et al. Recognition of control chart patterns using an intelligent technique
CN113159227A (en) Acne image recognition method, system and device based on neural network
CN111275707B (en) Pneumonia focus segmentation method and device
Pramunendar et al. A Robust Image Enhancement Techniques for Underwater Fish Classification in Marine Environment.
CN115345938A (en) Global-to-local-based head shadow mark point positioning method, equipment and medium
Yadav et al. Computer‐aided diagnosis of cataract severity using retinal fundus images and deep learning
CN113569726B (en) Pedestrian detection method combining automatic data amplification and loss function search
de Chauveron et al. Artificial intelligence for oral squamous cell carcinoma detection based on oral photographs: A comprehensive literature review
Finzi et al. Topographic DCNNs trained on a single self-supervised task capture the functional organization of cortex into visual processing streams
CN118016279A (en) Analysis diagnosis and treatment platform based on artificial intelligence multi-mode technology in breast cancer field
CN112101438B (en) Left-right eye classification method, device, server and storage medium
CN117545122A (en) LED lamp array control method, device, storage medium and equipment
CN116510110A (en) Liquid level data analysis and liquid level control method based on bladder irrigation instrument
US20220319002A1 (en) Tumor cell isolines
Thomas et al. Eisoc with ifodpso and dcnn classifier for diabetic retinopathy recognition system
CN118072378B (en) Dynamic decision image segmentation method based on SAM basic model
EP4202523A1 (en) System, method and computer program for a surgical microscope system and corresponding surgical microscope system
Drazkowska et al. Application of Convolutional Neural Networks to femur tracking in a sequence of X-ray images
CN117953018B (en) Infrared induction screen following method, device, equipment and storage medium
Gojić et al. Overview of Deep Learning Methods for Retinal Vessel Segmentation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant