CN112043299A - Control method and system of medical equipment - Google Patents


Publication number
CN112043299A
CN112043299A
Authority
CN
China
Prior art keywords
image
display screen
medical device
medical
information
Prior art date
Legal status
Pending
Application number
CN202011065636.6A
Other languages
Chinese (zh)
Inventor
钟灿 (Zhong Can)
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd
Priority to CN202011065636.6A
Publication of CN112043299A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 Detecting, measuring or recording for diagnosis involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/032 Transmission computed tomography [CT]
    • A61B 6/035 Mechanical aspects of CT
    • A61B 6/44 Constructional features of apparatus for radiation diagnosis
    • A61B 6/4429 Constructional features related to the mounting of source units and detector units
    • A61B 6/4435 Constructional features in which the source unit and the detector unit are coupled by a rigid structure
    • A61B 6/46 Arrangements for interfacing with the operator or the patient
    • A61B 6/461 Displaying means of special interest
    • A61B 6/54 Control of apparatus or devices for radiation diagnosis
    • A61B 6/56 Details of data transmission or power supply, e.g. use of slip rings
    • A61B 6/566 Details of data transmission involving communication between diagnostic systems


Abstract

The embodiments of this specification provide a control method and a control system for a medical device. The method comprises the following steps: acquiring a first image containing at least a partial region of the medical device, wherein the first image is captured by an image acquisition apparatus arranged on a display screen, and the display screen is in signal connection with the medical device; identifying first image information of the first image; and controlling movement of at least one component of the medical device and/or of the display screen based on the first image information.

Description

Control method and system of medical equipment
Technical Field
The present disclosure relates to the field of medical technology, and in particular, to a method and a system for controlling a medical device.
Background
In recent years, medical imaging techniques have been widely used in clinical examination and medical diagnosis. For example, with the development of X-ray imaging technology, C-arm X-ray imaging systems have become increasingly important in applications such as breast tomography, breast examination, and tumor treatment. C-arm X-ray imaging systems are commonly classified by clinical application range into large, medium, and small C-arm systems. For example, a large C-arm can be used for examination and treatment of the head and neck, chest, abdominal, and limb vascular systems, while a small C-arm can be used for orthopedic diagnosis and treatment (such as bone setting, fracture reduction, and nailing) or for some interventional procedures (such as removal of foreign bodies from the body). During diagnosis and treatment, a user can view the condition of the scanned target subject in real time on a display screen connected to the medical device. Components of the medical device must avoid obstacles while moving during diagnosis and treatment.
Disclosure of Invention
One aspect of the present specification provides a method of controlling a medical device. The method comprises the following steps: acquiring a first image containing at least a partial region of the medical device, wherein the first image is captured by an image acquisition apparatus arranged on a display screen, and the display screen is in signal connection with the medical device; identifying first image information of the first image; and controlling movement of at least one component of the medical device and/or of the display screen based on the first image information.
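For illustration only, the three claimed steps form an acquire-identify-control loop. The Python sketch below uses hypothetical names (`ImageInfo`, `identify`, `control_step`) and a toy frame representation; none of these are part of the claimed method:

```python
from dataclasses import dataclass


@dataclass
class ImageInfo:
    """Hypothetical result of identifying a captured frame."""
    user_position: tuple       # (x, y, z) of the user in room coordinates
    component_position: tuple  # (x, y, z) of a tracked device component


def identify(frame) -> ImageInfo:
    # Placeholder recognizer: a real system would run detection/segmentation
    # on pixel data rather than read positions from a dictionary.
    return ImageInfo(user_position=frame["user"],
                     component_position=frame["component"])


def control_step(frame):
    """One iteration of the acquire -> identify -> control loop."""
    info = identify(frame)
    commands = []
    # Move the display toward the user (the display-control embodiment).
    commands.append(("display", "face", info.user_position))
    # Move (or halt) the device component based on what was recognized.
    commands.append(("component", "track", info.component_position))
    return commands


frame = {"user": (1.0, 2.0, 0.0), "component": (0.5, 0.0, 1.2)}
print(control_step(frame))
```

In a real system the frame would come from the image acquisition apparatus on the display screen, and the returned commands would be sent to the motion controllers of the display screen and the device component.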
In some embodiments, the image capture device is fixedly mounted on the display screen.
In some embodiments, the image capture device is capable of moving and/or rotating relative to the display screen.
In some embodiments, the first image information comprises a user position; and controlling movement of at least one component of the medical device and/or of the display screen based on the first image information comprises: controlling movement of the display screen based on the user position.
In some embodiments, after the movement of the display screen is completed, the display screen faces the user, and/or the distance between the display screen and the user is within a preset range.
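For illustration only, the end conditions of this embodiment (screen facing the user, screen-user distance within a preset range) can be turned into a simple positioning command. The sketch below assumes a 2-D room coordinate system and hypothetical names (`display_command`, `dist_range`); the patent does not prescribe any particular computation:

```python
import math


def display_command(display_pos, user_pos, dist_range=(0.8, 1.5)):
    """Compute a yaw angle so the screen faces the user, plus a radial
    move that brings the screen-user distance into dist_range (in metres).
    All parameter names and the range values are illustrative assumptions."""
    dx = user_pos[0] - display_pos[0]
    dy = user_pos[1] - display_pos[1]
    dist = math.hypot(dx, dy)
    yaw = math.atan2(dy, dx)      # orient the screen normal toward the user
    lo, hi = dist_range
    if dist < lo:
        move = dist - lo          # negative: back the screen away
    elif dist > hi:
        move = dist - hi          # positive: move the screen closer
    else:
        move = 0.0                # already within the preset range
    return yaw, move


yaw, move = display_command((0.0, 0.0), (2.0, 0.0))
print(round(yaw, 3), round(move, 3))  # faces along +x, approaches by 0.5 m
```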
In some embodiments, the method further comprises: acquiring a second image shot by the image acquisition device; identifying second image information of the second image; and controlling the image acquisition device to move based on the first image information and the second image information.
In some embodiments, the first image information comprises information of a first object and information of the at least one component; and controlling movement of at least one component of the medical device and/or of the display screen based on the first image information comprises: controlling the at least one component, based on the information of the first object and the information of the at least one component, so that the at least one component does not collide with the first object during movement.
In some embodiments, the information of the first object comprises at least one of a position, a shape, and a pose of the first object; the information of the at least one component includes at least one of a position, a shape, and a posture of the at least one component.
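For illustration only, a collision check based on the positions and shapes of the first object and the component can be sketched with axis-aligned bounding boxes. The function names and the safety margin below are illustrative assumptions; the embodiments do not prescribe a specific collision model:

```python
def aabb(center, size):
    """Axis-aligned bounding box from a position and a shape extent."""
    return tuple((c - s / 2.0, c + s / 2.0) for c, s in zip(center, size))


def boxes_collide(box_a, box_b, margin=0.05):
    """True if the boxes overlap (within `margin` metres) on every axis."""
    return all(a_lo - margin <= b_hi and b_lo - margin <= a_hi
               for (a_lo, a_hi), (b_lo, b_hi) in zip(box_a, box_b))


def safe_to_move(component_pose, obstacle_pose):
    """Each pose is (center, size); True if the component may move freely."""
    comp = aabb(*component_pose)
    obst = aabb(*obstacle_pose)
    return not boxes_collide(comp, obst)


# C-arm component at (0, 0, 1) sized 0.4 m; obstacle (e.g. a table) at (2, 0, 1).
print(safe_to_move(((0, 0, 1), (0.4, 0.4, 0.4)), ((2, 0, 1), (0.4, 0.4, 0.4))))
```

A real controller would evaluate such a check continuously along the planned trajectory (using pose estimates recognized from the first image) and halt or re-route the component before the margin is violated.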
Another aspect of the present specification provides a control system for a medical device. The system comprises: an acquisition module configured to acquire a first image containing at least a partial region of the medical device, wherein the first image is captured by an image acquisition apparatus arranged on a display screen, and the display screen is in signal connection with the medical device; an image recognition module configured to identify first image information of the first image; and a control module configured to control movement of at least one component of the medical device and/or of the display screen based on the first image information.
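For illustration only, the acquisition, image recognition, and control modules of such a system can be wired together as a small pipeline. The class interfaces below are assumptions, not the actual implementation:

```python
class AcquisitionModule:
    """Wraps a camera callable and returns the first image."""
    def __init__(self, camera):
        self.camera = camera

    def acquire(self):
        return self.camera()


class ImageRecognitionModule:
    """Placeholder recognizer; a real one would locate users/components."""
    def identify(self, image):
        return {"objects": image.get("objects", [])}


class ControlModule:
    """Turns recognized image information into motion commands."""
    def control(self, info):
        return [("move", obj) for obj in info["objects"]]


# Wire the three modules together, mirroring the claimed system structure.
camera = lambda: {"objects": ["display_screen", "c_arm"]}
commands = ControlModule().control(
    ImageRecognitionModule().identify(AcquisitionModule(camera).acquire()))
print(commands)
```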
Another aspect of the present specification provides a control apparatus for a medical device, including a processor for executing the medical device control method as described above.
Another aspect of the present specification provides a computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the medical device control method as described above.
Drawings
The present description will be further explained by way of exemplary embodiments, which are described in detail in the accompanying drawings. These embodiments are not intended to be limiting; in these embodiments, like numerals indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a medical device control system according to some embodiments of the present description;
FIG. 2 is an exemplary flow chart of a medical device control method according to some embodiments of the present description;
FIG. 3 is an exemplary flow chart of a medical device control method according to further embodiments of the present description;
FIG. 4 is a schematic diagram of medical device control, shown in accordance with some embodiments of the present description;
FIG. 5 is a schematic diagram of a display screen control shown in accordance with some embodiments of the present description;
FIG. 6 is a schematic diagram of medical device component control according to some embodiments of the present description;
FIG. 7 is an exemplary block diagram of a medical device control system, shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a way of distinguishing different components, elements, parts, portions or assemblies at different levels. However, these terms may be replaced by other expressions that accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" may include plural referents unless the context clearly dictates otherwise. In general, the terms "comprise" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Although various references are made herein to certain modules or units in a system according to embodiments of the present description, any number of different modules or units may be used and run on the client and/or server. The modules are merely illustrative and different aspects of the systems and methods may use different modules.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order in which they are presented. Rather, the steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an application scenario of a medical device control system according to some embodiments of the present description.
As shown in fig. 1, a medical device control system 100 according to some embodiments of the present application may include a medical device 110, a display screen 120, an image acquisition apparatus 123, at least one terminal device 130, a processing device 140, a storage device 150, and a network 160. The various components of the system 100 may be interconnected by a network 160. For example, the medical device 110 and the at least one terminal device 130 may be connected or communicate via the network 160. As another example, the medical device 110 and the display screen 120 may be connected or in communication via a network 160.
The medical device 110 may generate or provide image data related to a target subject by scanning the target subject. For illustrative purposes, in the present embodiment, the image data of the target subject acquired using the medical device 110 is referred to as medical image data, and the image data of the object in the treatment room acquired using the image acquisition apparatus 123 is referred to as an image (e.g., a first image, a second image). In some embodiments, the target subject may include a biological object and/or a non-biological object. For example, the target subject may include a particular part of the body, such as the head, chest, abdomen, etc., or a combination thereof. As another example, the target subject may be an artificial composition of organic and/or inorganic matter, living or non-living. In some embodiments, objects in a treatment room may include a combination of one or more of a user, a medical device, other objects in the room, and the like. For example, the objects in the treatment room may include equipment components, display screens, etc. of the area to which the medical equipment belongs. Also for example, objects in the treatment room may include medical personnel, target subjects, and the like. In some embodiments, the medical device 110 may include modules and/or components for performing imaging and/or correlation analysis. In some embodiments, the medical image data related to the target subject may include projection data, one or more scan images, etc. of the target subject.
In some embodiments, the medical device 110 may be a non-invasive biomedical imaging apparatus used for disease diagnosis or research purposes. The medical device 110 may include a single-modality scanner and/or a multi-modality scanner. The single-modality scanner may include, for example, an X-ray scanner, a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner, a positron emission tomography (PET) scanner, an optical coherence tomography (OCT) scanner, an ultrasound (US) scanner, an intravascular ultrasound (IVUS) scanner, a near-infrared spectroscopy (NIRS) scanner, a far-infrared (FIR) scanner, or the like, or any combination thereof. The multi-modality scanner may include, for example, an X-ray imaging-magnetic resonance imaging (X-ray-MRI) scanner, a positron emission tomography-X-ray imaging (PET-X-ray) scanner, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) scanner, a positron emission tomography-computed tomography (PET-CT) scanner, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) scanner, or the like. The scanners listed above are for illustration purposes only and are not intended to limit the scope of the present application. As used herein, the term "imaging modality" or "modality" broadly refers to an imaging method or technique that collects, generates, processes, and/or analyzes imaging information of a target subject.
In some embodiments, the medical device 110 may include a gantry, a detector, an examination region, a scanning bed, and a radiation source. The gantry may be used to support the detector and the radiation source. The scanning bed may be used to position the target subject for scanning, such as the examination couch 115 shown in FIGS. 4-6. In some embodiments, the scanning bed may be a device separate from the medical device 110. The target subject may include a patient, a phantom, or another scanned object. The radiation source may emit X-rays toward the target subject to irradiate it, and the detector may receive the X-rays. In some embodiments, the medical device 110 may be or include an X-ray imaging device, such as a digital subtraction angiography (DSA) device, a digital radiography (DR) device, a computed radiography (CR) device, a digital fluoroscopy (DF) device, a CT scanner, a magnetic resonance scanner, a mammography machine, a C-arm device, and the like.
The display screen 120 may be used to view the medical device 110 and/or data information of a target subject scanned by the medical device 110. For example, medical staff may observe lesion information of an examined site, such as the chest, a bone, or a breast of the target subject, on the display screen 120. In some embodiments, the display screen 120 may include a liquid crystal display (LCD), a light emitting diode (LED) based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT), a touch screen, or the like, or combinations thereof. In some embodiments, the display screen 120 may also include output devices such as speakers and printers, and/or input devices such as a keyboard and a mouse.
In some embodiments, the display screen 120 may be mounted in a location convenient for viewing by a user (e.g., a healthcare worker). For example, the display screen 120 may be mounted on the ceiling directly above or beside the examination table by a hanger. As another example, the display screen 120 may be mounted on the medical device 110 frame, or disposed by a stand/cradle beside the examination table or C-arm of the medical device. For another example, the display screen 120 may be installed right in front of the position of the medical staff through a support/hanger, so that the medical staff can clearly view the medical image information of the scanned target subject in time. In some embodiments, the display screen 120 may be fixedly mounted, or slidably and/or rotatably mounted. For example, the display screen 120 may be mounted on or around the medical device by a rail or movable mount.
In some embodiments, the display screen 120 may be part of the medical device or a separate device in signal connection with the medical device. In some embodiments, the display screen 120 may be in signal connection with the medical device 110 through the network 160. In some embodiments, the display screen 120 may have a direct signal connection with the medical device 110 through a cable, a data line, or the like. In other embodiments, when a plurality of medical staff consult together, two or more display screens can be installed so that each member of the medical staff can conveniently view diagnosis and treatment information from his or her own observation angle or focus.
The image capturing device 123 may be used to capture an environmental image of the treatment room. For example, the image acquisition device 123 may capture images containing one or more components of the medical device, such as the table, the C-arm, or the gantry. As another example, the image capturing device 123 may capture images containing the medical device, target subjects, medical personnel, and the like. In some embodiments, the image capture device 123 may be and/or include any suitable device capable of capturing image data of an object within the treatment room. For example, the image capture device 123 may include a camera (e.g., a digital camera, an analog camera, etc.), a red-green-blue (RGB) sensor, an RGB-depth (RGB-D) sensor, or another device that can capture image data of objects within the treatment room. As another example, the image capturing device 123 may be used to obtain point cloud data of an object in the treatment room. The point cloud data may include at least two data points, each of which may represent a physical point on the surface of an object within the treatment room and may be described using characteristic values of that physical point (e.g., values related to its location and/or composition). Exemplary image acquisition devices 123 capable of acquiring point cloud data may include 3D scanners, e.g., 3D laser imaging devices and structured light scanners (e.g., structured light laser scanners). For example only, a structured light scanner may be used to scan objects within the treatment room to acquire point cloud data. During scanning, the structured light scanner may project a pattern of structured light (e.g., structured light spots or a structured light grid) toward the object, and the point cloud data may be recovered from the structured light projected on the object. As yet another example, the image capture device 123 may be used to acquire depth image data of an object within the treatment room.
The depth image data may refer to image data including depth information of each physical point on a body surface of the object, such as a distance from each physical point to a specific point (e.g., an optical center of the image capturing device 123). The depth image data may be acquired by a range sensing device, such as a structured light scanner, a time of flight (TOF) device, a stereo triangulation camera, a laser triangulation device, an interferometric device, a coded aperture device, a stereo matching device, and the like, or any combination thereof.
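For illustration only, depth image data of the kind described above can be back-projected into point cloud data with a pinhole camera model. The intrinsic parameters (`fx`, `fy`, `cx`, `cy`) and the dictionary representation of the depth map below are illustrative assumptions; the patent only states that depth data may come from sensors such as structured light or TOF devices:

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (dict of (u, v) pixel -> depth in metres)
    into 3-D points using a pinhole camera model:
        x = (u - cx) * z / fx,  y = (v - cy) * z / fy."""
    points = []
    for (u, v), z in depth.items():
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        points.append((x, y, z))
    return points


# A two-pixel "depth map": the principal-point pixel and one offset pixel.
depth = {(320, 240): 2.0, (420, 240): 2.0}
pts = depth_to_points(depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(pts)
```

A point cloud recovered this way could feed the collision-avoidance embodiments described earlier, since each point gives the room-space location of a physical point on an object's surface.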
In some embodiments, the image capture device 123 may be fixedly mounted on the display screen 120. For example, the image capturing device 123 may be fixedly mounted on the display screen 120 in an embedded manner or the like. As another example, the image capturing device 123 may be fixedly mounted on the bracket or hanger of the display screen 120 by welding, riveting, screwing, or the like. In some embodiments, the image capture device 123 is slidably mounted on the display screen 120. For example, a slide rail is mounted at the edge of the display screen 120 or at another position that does not affect the display (e.g., on a bracket or hanger of the display screen 120), and the image capturing device 123 is mounted on and can slide along the slide rail. In other embodiments, the image capture device 123 may be rotatably mounted on the display screen 120. For example, the image capturing device 123 may be mounted on the display screen 120 or its support/hanger via a rotary connection (e.g., a bearing or a turntable), and a rotary encoder may be disposed on the rotary connection. In some embodiments, the image capture device 123 is both slidably and rotatably mounted on the display screen 120. For example, a slide rail is installed on the support of the display screen 120, and the image capturing device 123 may be mounted on the slide rail through a rotating connector so that it can slide along the rail and rotate. The mounting position of the image acquisition device on the display screen (for example, an edge, a corner, or another position that does not obstruct the displayed information) and the connection between the image acquisition device and the display screen can be realized in any reasonable manner, which is not limited by the embodiments of the present specification.
Mounting the image acquisition device on the display screen prevents the display screen from blocking the image acquisition device's view when it captures images of objects (such as the medical device) in the treatment room.
In other embodiments, the image acquisition arrangement 123 may be mounted on the medical device 110. For example, the image capturing device 123 may be mounted on a frame or a table of the medical apparatus 110 by fixing, sliding, and/or rotating. In some alternative embodiments, the image acquisition arrangement 123 may be a device separate from the medical device 110 and the display screen 120. For example, the image capturing device 123 may be fixedly installed on the ceiling, at a corner, or the like in the treatment room. For another example, the image capturing device 123 may be slidably and/or rotatably mounted on a ceiling, a floor, or the like of the position where the medical apparatus 110 is located, which is not limited in this specification.
In some embodiments, two or more image capturing devices 123 may be mounted, and the mounting positions and mounting manners of the two or more image capturing devices may be any combination of the above mounting positions and mounting manners. For example, one of the two image acquisition apparatuses is fixedly mounted on the display screen 120, and the other is slidably mounted on the medical device 110. As another example, both image capture devices are slidably and rotatably mounted on the display screen 120. As another example, one of the three image capturing devices is slidably and rotatably mounted on the display screen 120, one is rotatably mounted on the medical device 110, and the other is fixedly mounted on the ceiling of the treatment room.
In some embodiments, the image data captured by the image capture device 123 may be communicated to the processing apparatus 140 for further analysis. Additionally or alternatively, image data captured by image capture device 123 may be sent to a terminal device (e.g., terminal device 130) for display and/or a storage device (e.g., storage device 150) for storage.
In some embodiments, the image acquisition arrangement 123 may continuously or intermittently (e.g., periodically) acquire image data (e.g., a first image, a second image, etc.) within the room before, during, and/or after a scan of the target subject is performed by the medical device 110. In some embodiments, capturing image data by the image capture device 123, transmitting the captured image data to the processing apparatus 140, and analyzing the image data may be performed in substantially real-time, such that the image data may provide information indicative of a substantially real-time status within the treatment room.
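For illustration only, the intermittent (periodic) acquisition with substantially real-time processing described above can be sketched as a fixed-rate loop. The period, frame count, and function names below are illustrative assumptions, not values from the specification:

```python
import time


def acquisition_loop(capture, process, period=0.1, frames=5):
    """Intermittently capture frames and hand each to the processor,
    sleeping out the remainder of each period to hold a steady rate."""
    results = []
    for _ in range(frames):
        t0 = time.monotonic()
        results.append(process(capture()))
        # Keep the loop close to one iteration per `period` seconds.
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
    return results


# Stand-ins for the camera and the processing device's analysis step.
counter = iter(range(100))
out = acquisition_loop(lambda: next(counter), lambda f: f * 2, period=0.01)
print(out)
```

In the system of FIG. 1, `capture` would correspond to the image acquisition device 123 and `process` to the analysis performed by the processing device 140; continuous acquisition would simply replace the fixed frame count with an open-ended loop.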
The terminal device 130 may be in communication and/or connected with the medical device 110, the display screen 120, the image acquisition arrangement 123, the processing device 140 and/or the storage device 150. For example, a user may interact with the medical device 110 through the terminal device 130 to control one or more components of the medical device 110. In some embodiments, the terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, and the like, or any combination thereof. For example, mobile device 131 may include a mobile joystick, a Personal Digital Assistant (PDA), a smart phone, or the like, or any combination thereof.
In some embodiments, terminal device 130 may include input devices, output devices, and the like. The input device may be selected from keyboard input, touch screen (e.g., with tactile or haptic feedback) input, voice input, eye tracking input, gesture tracking input, brain monitoring system input, image input, video input, or any other similar input mechanism. Input information received via the input device may be transmitted, for example, via a bus, to the processing device 140 for further processing. Other types of input devices may include cursor control devices, such as a mouse, a trackball, or cursor direction keys, among others. In some embodiments, an operator (e.g., a medical professional) may input instructions reflecting the medical image category selected by the target subject via an input device. Output devices may include a display, speakers, printer, or the like, or any combination thereof. The output device may be used to output images captured by the image acquisition device 123, and/or medical images scanned by the medical device 110, and/or images determined by the processing device 140, and/or the like. In some embodiments, the terminal device 130 may be part of the processing device 140.
The processing device 140 may process data and/or information obtained from the medical device 110, the image acquisition arrangement 123, the at least one terminal device 130, the storage device 150, or other components of the medical device control system 100. For example, the processing device 140 may acquire medical image data of the target subject from the medical device 110. For another example, the processing device 140 may acquire a captured image of an object in the treatment room from the image capturing apparatus 123. In some embodiments, the processing device 140 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data from the medical device 110, the image acquisition apparatus 123, the storage device 150, and/or the at least one terminal device 130 via the network 160. As another example, the processing device 140 may be directly connected to the medical device 110, the image acquisition apparatus 123, the at least one terminal device 130, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
In some embodiments, the processing device 140 may include one or more processors (e.g., a single-chip processor or a multi-chip processor). By way of example only, the processing device 140 may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
Storage device 150 may store data, instructions, and/or any other information. For example, the storage device 150 may store medical image data of the target subject acquired by the medical device 110, a first image, a second image, and the like captured by the image acquisition apparatus 123. In some embodiments, the storage device 150 may store data obtained from the medical device 110, the image acquisition arrangement 123, the at least one terminal device 130, and/or the processing device 140. In some embodiments, storage device 150 may store data and/or instructions that are used by processing device 140 to perform or use to perform the exemplary methods described in this application. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, the storage device 150 may be implemented on a cloud platform.
In some embodiments, the storage device 150 may be connected to the network 160 to communicate with at least one other component in the medical device control system 100 (e.g., the processing device 140, the image acquisition arrangement 123, the at least one terminal device 130). At least one component in the medical device control system 100 may access data stored in the storage device 150 (e.g., first/second images containing medical device components, medical image data of a target subject, etc.) over the network 160. In some embodiments, the storage device 150 may be part of the processing device 140.
The network 160 may include any suitable network capable of facilitating information and/or data exchange for the medical device control system 100. In some embodiments, at least one component of the medical device control system 100 (e.g., the medical device 110, the image acquisition arrangement 123, the terminal device 130, the processing device 140, the storage device 150) may exchange information and/or data with at least one other component in the medical device control system 100 via the network 160. For example, the processing device 140 may obtain medical image data of the target subject from the medical device 110 via the network 160. The network 160 may include a public network (e.g., the internet), a private network (e.g., a Local Area Network (LAN)), a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a frame relay network, a Virtual Private Network (VPN), a satellite network, a telephone network, a router, a hub, a switch, etc., or any combination thereof. For example, network 160 may include a wireline network, a fiber optic network, a telecommunications network, an intranet, a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, and the like, or any combination thereof. In some embodiments, network 160 may include at least one network access point. For example, the network 160 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which at least one component of the medical device control system 100 may connect to the network 160 to exchange data and/or information.
It should be noted that the foregoing description is provided for illustrative purposes only, and is not intended to limit the scope of the present application. Many variations and modifications will occur to those skilled in the art in light of the teachings herein. The features, structures, methods, and other features of the example embodiments described herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the storage device 150 may be a data storage device comprising a cloud computing platform (e.g., public cloud, private cloud, community and hybrid cloud, etc.). However, such changes and modifications do not depart from the scope of the present application.
In some embodiments, during diagnosis and treatment, components of the medical apparatus (e.g., a C-arm) need to be moved to the detection site of the scanned target subject according to the different conditions of different target subjects, so that the medical staff can perform the examination or treatment. However, components of the medical device may collide with other components or objects during the movement, affecting the procedure. In some embodiments, the medical treatment space where the medical equipment is located can be scanned by a depth camera installed in the treatment room. However, an image obtained in this way covers a large shooting range with low accuracy and is prone to severe visual obstruction. In some embodiments, the medical staff can monitor the condition of the scanned target subject in real time through a display screen. However, the medical staff may move during the diagnosis and treatment process, and if the display screen is fixed, the medical staff may not be able to view the content of the display screen clearly due to an uncomfortable viewing angle, a long viewing distance, and the like.
Therefore, some embodiments of the present application provide a method for controlling a medical device, which may acquire an image of at least a partial region of the medical device through an image acquisition apparatus mounted on a display screen, and further analyze information such as a position of the medical device, a user, the display screen, and the like based on the acquired image to control a motion of at least one component of the medical device and/or the display screen. In some embodiments, by controlling the movement of at least one component of the medical apparatus based on the image information, collision of the components in the movement, even damage to the components due to collision, can be avoided, thereby ensuring smooth performance of diagnosis and treatment. In some embodiments, by controlling the movement of the display screen, the medical staff can be helped to see the content of the display screen clearly all the time in the diagnosis and treatment process.
FIG. 2 is an exemplary flow chart of a medical device control method according to some embodiments of the present description.
In particular, the medical device control method 200 may be performed by the medical device control system 100 (e.g., the processing device 140). For example, the medical device control method 200 may be stored in a storage means (e.g., the storage device 150) in the form of a program or instructions that, when executed by the medical device control system 100 (e.g., the processing device 140), implement the medical device control method 200. In some embodiments, the medical device control method 200 may be performed by the medical device control system 700.
In step 210, a first image of at least a partial region of a medical device is acquired, said first image being captured by an image capture device mounted on a display screen. In some embodiments, step 210 may be performed by acquisition module 710.
The at least partial region of the medical device may comprise a region in which some components of the medical device are located, a region in which all components of the medical device are located, or a part of such a region. For example, the at least partial region of the medical device may be a region containing the examination table or the C-arm of the medical device. For another example, the at least partial region of the medical device may be a region containing all of the components of the medical device, or one third of that region, three quarters of that region, etc. In some embodiments, the first image may include the medical device (e.g., components of the medical device) and objects external to the medical device. In some embodiments, the first image may include one or more biological objects and/or one or more non-biological objects. For example, the first image may include a target subject to be examined or operated on, and/or a medical device used to scan or treat the target subject, or a portion of the medical device (e.g., a gantry, a scanning table), and/or a medical professional examining or treating the target subject. As another example, the first image may include a roof, floor, wall, surgical instrument, protective curtain, cable, etc. within the treatment room, or any combination thereof. In some embodiments, the objects contained in the first image may be movable objects and/or stationary objects.
In some embodiments, the image acquisition device may be fixedly mounted on the display screen and move with the movement of the display screen. In some embodiments, the image acquisition device may slide and/or rotate relative to the display screen. For example, the image acquisition device may be mounted on the display screen via a rail and/or a rotational connection. In some embodiments, a first image containing at least a partial region of the medical device may be acquired by adjusting the position and/or angle of the image acquisition device. For example, the processing device may rotate and/or translate the image acquisition device such that its position and angle place one or more target objects, such as the examination table, medical staff, or medical device components, within its field of view.
In some embodiments, the processing device may acquire the first image captured by the image acquisition device in real time or periodically. For example, the processing device 140 may acquire a first image containing at least a partial region of the medical device from the image acquisition device 123 at intervals (e.g., 1 second, 3 seconds, 10 seconds, 30 seconds, 1 minute, 5 minutes, 20 minutes, etc.). For another example, the processing device 140 may acquire the first image captured by the image acquisition device 123 in real time (e.g., 10 frames per second, 24 frames per second, etc.) while the medical device diagnoses the target subject.
In some embodiments, the display screen on which the image acquisition device is mounted has a signal connection with the medical device. When a medical device (e.g., medical device 110) scans a target subject, a medical image of the target subject may be transmitted to a display screen (e.g., display screen 120) in real time for presentation, so that medical staff or other users can view it. For more details about the image acquisition device, the display screen, and the medical equipment, reference may be made to FIG. 1 and its related description, which are not repeated here.
Step 220, identifying first image information of the first image. In some embodiments, step 220 may be performed by image recognition module 720.
The first image information may reflect one or more types of information including a position, a shape, a posture, and the like of the object in the first image. In some embodiments, the object position may include a combination of one or more of a coordinate position of the object in the image, a spatial coordinate position in the treatment room, a relative coordinate position with another object, and the like. In some embodiments, the pose of the object may include a combination of one or more of orientation, motion, etc. of the object. In some embodiments, the first image information may include a user location. The user may include, but is not limited to, an operator of the medical device (e.g., a healthcare worker), a detector of the medical device (e.g., a target subject), or other relevant user. In some embodiments, the first image information may also include user characteristics (e.g., user height, whether the user is wearing glasses, user five-sense-organ ratio, etc.).
In some embodiments, the processing device may identify the first image information of the first image by an image recognition algorithm. For example, image recognition algorithms may include, but are not limited to, statistical pattern recognition, structural pattern recognition, fuzzy pattern recognition, and the like. In some embodiments, the first image information of the first image may be identified by a trained image recognition model. For example, the image recognition model may include, but is not limited to, one or any combination of Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), RCNNs (regions with CNNs), Fast-RCNNs, BP Neural Networks, K-nearest neighbor algorithms (KNNs), Support Vector Machines (SVMs), and the like.
Step 230, controlling the display screen to move based on the first image information. In some embodiments, step 230 may be performed by control module 730.
In some embodiments, the processing device may control the display screen movement based on the user position in the first image information. For example, the processing device may move the display screen to a position diagonally above, or directly in front of, the user's position. In some embodiments, after the movement of the display screen is completed, the display screen faces the user, and/or the distance between the display screen and the user is within a preset range. For example, at least a portion of the display interface of the moved display screen is opposite the user. As another example, the relative distance between the display interface of the moved display screen and the user (e.g., the user's face) is within the range of 100-200 cm. In some embodiments, the preset range may be any reasonable value that makes it convenient for the user to view the display screen, and this specification does not limit this.
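As an illustrative sketch only (not part of the original disclosure), the preset-range check described above might look as follows, assuming 3D positions in centimeters and a 100-200 cm range:

```python
import math

def screen_in_preset_range(user_pos, screen_pos, min_cm=100.0, max_cm=200.0):
    """Check whether the screen-to-user distance lies within the preset range."""
    distance = math.dist(user_pos, screen_pos)  # Euclidean distance (Python 3.8+)
    return min_cm <= distance <= max_cm

# A screen 150 cm in front of the user is within the assumed 100-200 cm range.
print(screen_in_preset_range((0, 0, 0), (0, 0, 150)))  # True
print(screen_in_preset_range((0, 0, 0), (0, 0, 250)))  # False
```

When the check fails, the processing device would command a move toward the user until the check passes.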
In some embodiments, the processing device may control the display screen motion based on the position of the user's face (or head). For example, the processing device may control the display screen movement based on the relative position (e.g., distance, angle, etc.) of the center point of the user's face (or head) with respect to the center point of the display screen. By way of example only, the processing device may calculate the spatial coordinate position of the user's face (or head) based on its coordinate position in the first image and the spatial coordinate position of the image acquisition device in the treatment room. Combined with the spatial coordinate position of the display screen in the treatment room, the processing device may then obtain the relative distance between the center point of the user's face (or head) and the center point of the display screen. When the relative distance is greater than a preset distance threshold, the processing device may control the display screen to move such that the distance between the center point of the moved display screen and the center point of the user's face (or head) is less than or equal to the preset distance threshold. In some embodiments, the line connecting the center point of the moved display screen and the center point of the user's face (or head) is perpendicular or approximately perpendicular to the display screen.
In some embodiments, the processing device may control the display screen movement based on the user position. For example, when the coordinate position of the user's face (or head) in the first image is biased to the left (or right) of the vertical center line of the first image, the processing device may calculate a first distance between the center point of the user's face (or head) and the center point of the display screen based on the image acquisition device position and the display screen position. In response to the first distance being greater than a preset distance threshold, the processing device may control the display screen to move horizontally to the right (or left) by a certain distance (e.g., by a distance equal to the square root of the difference between the squares of the first distance and the preset distance threshold, or by a distance such that the first distance falls within a certain range). In some embodiments, the processing device may control the display screen to move in one or more directions respectively, so that the distance between the moved display screen and the user is within a preset range. For example, the processing device may control the display screen to move leftward and downward by corresponding distances based on the position of the user's face (or head), so that the distance between the center point of the moved display screen and the center point of the user's face (or head) is within the range of 100-120 cm.
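The horizontal movement distance mentioned above, reconstructed here as the square root of the difference of the squares (a Pythagorean decomposition of the offset), might be sketched as follows; the function name and units are illustrative assumptions:

```python
import math

def horizontal_move_distance(first_distance, threshold):
    """Horizontal distance to move the screen so that the screen-to-face
    distance drops to the preset threshold, assuming the excess offset is
    purely horizontal (Pythagorean decomposition)."""
    if first_distance <= threshold:
        return 0.0  # already within the threshold; no horizontal move needed
    return math.sqrt(first_distance ** 2 - threshold ** 2)

# 130 cm actual vs. 120 cm threshold: a 50 cm horizontal move (5-12-13 triangle).
print(horizontal_move_distance(130.0, 120.0))  # 50.0
```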
In some embodiments, the processing device may control the display screen to rotate based on the user position. For example, the processing device may calculate, based on the coordinate values of the user's face (or head), the included angle between the line connecting the center point of the user's face (or head) to the center point of the display screen and the perpendicular (normal) of the display screen, and, when the included angle is not 0, control the display screen to rotate in the corresponding direction by an angle equal to the included angle. In some embodiments, the processing device may control the display screen to rotate in one or more directions, respectively. For example, the processing device may control the display screen to rotate by corresponding angles to the left and downward, respectively, based on the position of the user's face (or head).
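A minimal sketch of the included-angle computation described above, assuming the screen's perpendicular is given as a normal vector (all names and coordinates are illustrative):

```python
import math

def rotation_angle_deg(face_center, screen_center, screen_normal):
    """Angle (degrees) between the face-to-screen line and the screen's
    normal vector; rotating the screen by this angle makes the line
    perpendicular to the display surface."""
    line = tuple(f - s for f, s in zip(face_center, screen_center))
    dot = sum(a * b for a, b in zip(line, screen_normal))
    cos_a = dot / (math.hypot(*line) * math.hypot(*screen_normal))
    cos_a = max(-1.0, min(1.0, cos_a))  # guard against rounding drift
    return math.degrees(math.acos(cos_a))

# Face directly along the normal: included angle is 0, no rotation needed.
print(rotation_angle_deg((0, 0, 100), (0, 0, 0), (0, 0, 1)))  # 0.0
# Face offset 45 degrees in the horizontal plane.
print(round(rotation_angle_deg((100, 0, 100), (0, 0, 0), (0, 0, 1))))  # 45
```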
In some embodiments, the processing device may control the display screen to rotate and move based on the user position. In some embodiments, the movement and/or rotation of the display screen may be controlled by one or more motors by which the processing device may control the movement and/or rotation of the display screen.
In some embodiments, after controlling the movement of the display screen, the processing device may again acquire an image captured by the image acquisition device to determine whether the display screen has moved into position. For example, the processing device may obtain a third image and determine whether the display screen has moved into position based on user information in the third image. By way of example only, assuming that the image acquisition device is mounted at the midpoint of the upper/lower edge of the display screen, whether the display screen has moved into position may be determined based on the position of the user in the third image relative to the center point of the third image. For example, if the user's position in the third image coincides with the center point of the third image, or the deviation is within a certain range (e.g., 0-5 cm), the display screen is considered to have moved into position. In some embodiments, when the moved display screen deviates from the target position, the processing device may again control its movement to adjust it. It is to be understood that the foregoing description is provided for illustrative purposes only, and is not intended to limit the scope of the present application.
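The in-position check on the third image might be sketched as follows, assuming the user position and image center are expressed in a common coordinate system and using a 0-5 cm tolerance (illustrative assumptions):

```python
def display_in_position(user_xy, center_xy, tolerance=5.0):
    """Whether the user's position in the verification image deviates from
    the image center point by no more than the tolerance."""
    dx = user_xy[0] - center_xy[0]
    dy = user_xy[1] - center_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance

print(display_in_position((2.0, 3.0), (0.0, 0.0)))   # True  (deviation ~3.6)
print(display_in_position((10.0, 0.0), (0.0, 0.0)))  # False
```

If the check returns False, the processing device would issue a further corrective movement and repeat.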
In some embodiments, the user may manually adjust the angle and/or position of the display screen. For example, an adjusting button is arranged at a position convenient for a user to operate, and the user can adjust and control the orientation of the display screen at any time through the adjusting button. In some embodiments, after the user manually adjusts the angle and/or position of the display screen, the system may record user-preferred parameters and adjust the position and/or angle of the display screen according to the user-preferred parameters as the user moves.
For convenience of understanding, the control process of the display screen will be described below with reference to FIGS. 4 and 5, taking a C-arm device as an example of the medical device. The original position of the display screen is shown in FIG. 4, and the moved position of the display screen is shown in FIG. 5. In FIGS. 4 and 5, A denotes a medical staff member, B denotes the target subject to be examined, 115 denotes the examination couch, 117 denotes the C-arm of the medical apparatus, and F denotes the shooting region of the image acquisition device 123. In some embodiments, the couch 115 may be one of the components of a medical device (e.g., medical device 110). In some embodiments, the couch 115 may be a component separate from the medical device.
As shown in FIG. 4, in the initial state, medical staff A is directly opposite the display interface of display screen 120. In this state, medical staff A can clearly view the contents of the display screen. Medical staff A may move to the position shown in FIG. 5 in the direction of C-arm 117 during the diagnosis process, and if display screen 120 remains in the initial state, medical staff A may not be able to clearly view the contents of the display screen due to the viewing angle, distance, and the like. In some embodiments provided herein, the processing device may obtain a first image captured by the image acquisition device 123 that includes medical staff A, the target subject B, the examination couch 115, and the C-arm 117, identify the first image information in the first image, and control the movement of the display screen based on the current position of medical staff A in the first image information. As shown in FIG. 5, the moved display screen is oriented toward medical staff A, and the display screen 120 may remain directly facing medical staff A. By controlling the movement of the display screen based on the user information in the image captured by the image acquisition device, the user can still clearly view the contents of the display screen after the user's position changes, which improves the user's operating efficiency and ensures a smooth diagnosis and treatment process.
Step 240, acquiring a second image captured by the image acquisition device. In some embodiments, step 240 may be performed by acquisition module 710.
In some embodiments, the processing device may control the movement of the image capture device based on the moved display screen to maintain stability of the image captured by the image capture device. In some embodiments, the processing device may acquire the second image captured by the image capture device after controlling the display screen to move. The second image may contain the medical device, and/or other objects outside of the medical device. In some embodiments, the second image may contain content that is identical, partially identical, or completely different from the content contained in the first image. For example, the first image and the second image may each contain a C-arm of a medical device, a medical professional, an examination couch, or the like. As another example, the first image may comprise a C-arm of a medical device, a medical person, an examination couch, and the second image may comprise an examination couch. In some embodiments, when no motion of the display screen occurs (e.g., step 230 is not performed), the processing device may also acquire a second image captured by the image capture device and perform subsequent steps (e.g., steps 250 and 260).
Step 250, identifying second image information of the second image. In some embodiments, step 250 may be performed by image recognition module 720.
The second image information may reflect information such as the pose, shape, and position of the medical device, and/or other objects within the treatment room. The content of the second image information and the identification method thereof are similar to the first image information, and further details can be found in step 220, which is not described herein again.
Step 260, controlling the motion of the image acquisition device based on the first image information and the second image information. In some embodiments, step 260 may be performed by control module 730.
In some embodiments, the processing device may control the movement of the image acquisition device based on a difference between the first image information and the second image information. By way of example only, when the objects contained in the first and second images are the same (e.g., both include a healthcare worker, a couch, and a C-arm), the processing device may compare the position information of the objects in the first image with the position information of the objects in the second image, and control the movement of the image acquisition device based on the change in position of one or more of the objects between the two images. For example, after the movement, the objects and their positions in the image captured by the image acquisition device are the same as or substantially the same as the objects and their positions in the first image (e.g., both are located at the center of the image, or within a certain circular area around the image center). In some embodiments, the processing device may automatically plan a motion trajectory of the image acquisition device based on the difference between the second image information and the first image information, and control the motion of the image acquisition device based on the motion trajectory. For example, the processing device may determine the movement distance, movement direction, and the like of the image acquisition device by comparing the displacement changes of corresponding objects in the first image information and the second image information.
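A minimal sketch of comparing object positions between the first and second image information to derive a camera correction; the dictionary layout and 2D image coordinates are illustrative assumptions, not from the original disclosure:

```python
def camera_correction(first_info, second_info):
    """Average shift (first minus second) of objects present in both images;
    applying this shift to the current framing restores the original view."""
    common = first_info.keys() & second_info.keys()
    if not common:
        return (0.0, 0.0)  # nothing to compare; leave the camera as-is
    dx = sum(first_info[k][0] - second_info[k][0] for k in common) / len(common)
    dy = sum(first_info[k][1] - second_info[k][1] for k in common) / len(common)
    return (dx, dy)

first = {"c_arm": (50.0, 40.0), "couch": (20.0, 60.0)}
second = {"c_arm": (60.0, 40.0), "couch": (30.0, 60.0)}  # everything shifted +10 in x
print(camera_correction(first, second))  # (-10.0, 0.0)
```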
In some embodiments, the processing device may control the movement of the image acquisition device based on a motion trajectory of the display screen. For example, the processing device may control the image acquisition device to rotate by the same angle in the opposite direction and/or to move by the same distance in the opposite direction based on the rotation angle and/or the movement distance of the display screen. For example, after the display screen 120 moves with the image acquisition device toward medical staff A, the processing device may control the image acquisition device 123 to move in reverse based on the movement trajectory of the display screen 120, such that the image acquisition device 123 is still able to acquire images of components of the medical device, such as the C-arm 117. It will be appreciated that when the image acquisition device is mounted on the display screen, the image acquisition device moves with the movement of the display screen. By adjusting the position and/or angle of the image acquisition device based on the motion of the display screen, the shooting range of the image acquisition device can always cover one or more fixed objects, which helps better monitor the environment around the medical equipment and prevents medical equipment components from colliding during motion.
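The reverse-motion compensation described above might be sketched as follows; representing the screen motion as a single rotation angle plus a translation vector is an illustrative simplification:

```python
def compensate_camera(screen_rotation_deg, screen_translation):
    """Inverse motion for the screen-mounted camera: rotate by the same
    angle in the opposite direction and translate by the same distance in
    the opposite direction, keeping the camera's field of view fixed."""
    return (-screen_rotation_deg, tuple(-t for t in screen_translation))

# Screen rotated 30 degrees and moved (20, 5, 10) cm; the camera reverses both.
print(compensate_camera(30.0, (20.0, 5.0, 10.0)))  # (-30.0, (-20.0, -5.0, -10.0))
```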
In some embodiments, after controlling the movement of the image capture device, the processing device may again acquire an image (e.g., a fourth image) captured by the image capture device to calibrate the position of the image capture device. If the position of the image acquisition device after movement still has deviation, the processing equipment can further adjust the image acquisition device. In some embodiments, the user may also manually (e.g., via an associated adjustment button, etc.) adjust the angle and/or position of the image capture device.
It should be noted that the above description of method 200 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present description. Various modifications and alterations to method 200 will be apparent to those skilled in the art in light of the present description. However, such modifications and variations are intended to be within the scope of the present description. For example, the position adjustment of the display screen and the image acquisition device can be automatically controlled, and can also be manually regulated and controlled by a user according to needs. For another example, the processing device may control the movement of the image capturing device based on the movement of the medical device such that the range captured by the image capturing device always covers the medical device. For another example, the images captured by the image capturing device may be displayed on the display screen at the same time.
FIG. 3 is an exemplary flow chart of a medical device control method according to further embodiments of the present description.
The medical device control method 300 may be performed by a medical device control system 100, such as the processing device 140. For example, the medical device control method 300 may be stored in a storage means (e.g., the storage device 150) in the form of a program or instructions that when executed by the medical device control system 100 (e.g., the processing device 140) may implement the medical device control method 300. In some embodiments, the medical device control method 300 may be performed by the medical device control system 700. As shown in fig. 3, a medical device control system (e.g., system 100, system 700) may control at least one component of a medical device based on images captured by an image capture device 123 mounted on a display screen.
In step 310, the processing device may acquire a first image containing at least a partial region of the medical device, the first image being captured by an image acquisition device mounted on a display screen. Based on this first image, the components of the medical device can be controlled so as to avoid collisions with other components of the medical device or with other objects in the treatment room, which would affect diagnosis and treatment. For more details of the first image, reference may be made to step 210, which is not repeated here.
At step 320, the processing device may identify first image information for the first image. In some embodiments, step 320 may be performed by image recognition module 720.
In some embodiments, the first image information may include information of the first object and information of at least one component of the medical device. The first object may include one or more biological objects and/or one or more non-biological objects in a medical treatment room. For example, the first object may include medical equipment, medical personnel, accompanying personnel, a target subject (e.g., a patient), a display screen, an examination bed, a roof, a floor, a light fixture, a wall, a surgical drape, a lead curtain, a cable, a stand, a connector, an ancillary instrument, or other people, things, etc., that are temporarily present, or any combination thereof. The information of the first object may include a position, a shape, a posture, etc., or any combination thereof, of the first object. In some embodiments, the at least one component of the medical device may include one or more components that may move or rotate during scanning of the target subject. For example, the at least one component of the medical device may include one or more of a gantry, radiation source, scanning bed, detector, etc. of the medical device. The information of the at least one component of the medical device may include a position, a shape, a status, etc., or any combination thereof, of the at least one component. In some embodiments, the first object may be a moving or stationary object and the at least one component of the medical device may be a moving or stationary component. In some embodiments, at least one component of the medical device is movable.
In some embodiments, the first image information may include information of one or more combinations of a first object and at least one component of the medical device. For each combination, the processing device may determine whether a collision is likely to occur between the first object and the at least one component in that combination. For example, the processing device may select, from at least two candidate combinations of a first object and at least one component of the medical device, the combination to be monitored for collision avoidance.
In some embodiments, the processing device may select the combination of the first object and the at least one component of the medical device according to one or more selection rules. For example, a collision between two objects may be disregarded if it would have no significant effect on the treatment or scan being performed or to be performed. By way of example only, an object with a soft or flexible texture, such as a surgical drape, a lead curtain, or a cable, may not be selected as the first object. As another example, at least one of the selected first object and the at least one component of the medical device may be required to be movable, since a collision is unlikely to occur between two stationary objects. As yet another example, a first object (such as a healthcare worker) may be omitted from collision detection if its distance to the target subject or the medical device exceeds a threshold distance.
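The patent does not prescribe an implementation of these selection rules; a minimal Python sketch, in which all object names, the `soft` flag, and the distance threshold are invented for illustration, might filter candidate pairs like this:

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class SceneObject:
    name: str
    movable: bool                # can this object move during the scan?
    soft: bool                   # soft/flexible texture, e.g. drape or lead curtain
    distance_to_device: float    # metres from the medical device

def select_pairs(objects, max_distance=3.0):
    """Apply the selection rules described above: drop soft objects, drop
    objects beyond a threshold distance, and keep only pairs in which at
    least one member is movable (two stationary objects cannot collide)."""
    candidates = [o for o in objects
                  if not o.soft and o.distance_to_device <= max_distance]
    return [(a, b) for a, b in combinations(candidates, 2)
            if a.movable or b.movable]

objects = [
    SceneObject("C-arm", True, False, 0.0),
    SceneObject("exam table", True, False, 0.5),
    SceneObject("lead curtain", False, True, 0.4),   # soft -> excluded
    SceneObject("wall", False, False, 2.0),
]
pairs = select_pairs(objects)
```

The soft lead curtain is excluded outright, and the stationary wall survives only when paired with a movable object such as the C-arm or the table.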
In some embodiments, the processing device may identify the first object, the at least one component of the medical device, and their information based on the first image. For example, the processing device may identify the first object and the at least one component of the medical device in the first image using an object detection algorithm. Exemplary object detection algorithms include the region-based convolutional neural network (R-CNN) algorithm, the single-shot multi-box detector (SSD) algorithm, the You Only Look Once (YOLO) network, and the like. The processing device may also acquire characteristic information of each detected object. Exemplary characteristic information of an object may include its position, shape, state, texture, whether it is movable, a tag indicating whether it (e.g., a patient, the medical device) needs to be monitored to prevent collisions, or the like, or any combination thereof. The processing device may then select, from the detected objects and based on the characteristic information, the first object to be monitored and the at least one component of the medical device. For example, the processing device may select the examination table as the first object and the C-arm as the medical device component based on the characteristic information of all objects in the treatment room. In some embodiments, the processing device may adjust the position and/or angle of the image acquisition device based on the selected first object and the at least one component of the medical device, such that the shooting range of the image acquisition device covers both. In a medical device control method for collision avoidance (or collision detection), all objects in the treatment room may be monitored to prevent collisions.
In step 330, the processing device may control the at least one component, based on the information of the first object and the information of the at least one component, so that it does not collide with the first object during motion. In some embodiments, step 330 may be performed by control module 730.
In some embodiments, the processing device may determine a relative positional relationship between the first object and the at least one component of the medical device, based on their respective information, in order to determine whether the at least one component may collide with the first object during motion. For example only, if the first object is an examination bed and the component of the medical device is a C-arm, the processing device may determine whether the C-arm would collide with the bed during movement (e.g., up-and-down movement) based on the positions, shapes, and postures of the bed and the C-arm identified in the first image. If the C-arm may collide with the examination bed, the processing device may control the C-arm to change its motion trajectory so as to avoid the collision; if no collision is predicted, the processing device may control the C-arm to move along its original trajectory. In another example, the first object may be an object other than the medical device (e.g., a storage shelf), and the at least one component may include a robotic arm and a C-arm of the medical device. The processing device may determine, for each of the robotic arm and the C-arm, whether it may collide with the shelf based on the positions, shapes, and postures of the shelf, the robotic arm, and the C-arm identified in the first image. If a collision with the first object is possible, the processing device may control the robotic arm and/or the C-arm to change its motion trajectory to avoid the collision; if neither the robotic arm nor the C-arm may collide with the first object, the processing device may control them to move along their original trajectories.
In some embodiments, the processing device may adjust the motion trajectory of the medical device component to re-plan a new motion trajectory that would enable the moving component to bypass the obstacle. For example, the processing device may compare a minimum distance of the first object from a motion trajectory of at least one component of the medical device with a preset threshold, plan a new motion trajectory for the at least one component of the medical device if the minimum distance is less than or close to the preset threshold, and slow down the motion speed in the area at risk of collision. In some embodiments, the motion trajectory of at least one component of the medical device may be defined by one or more motion parameters. For example, the motion parameters may include, but are not limited to, a current location, a distance of motion, a direction of motion, a speed of motion, and the like, or any combination thereof.
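The threshold comparison described above can be sketched in a few lines of Python. This is an illustrative simplification (a point obstacle, a sampled trajectory, and invented threshold values), not the patent's implementation:

```python
import math

def min_distance_to_trajectory(obstacle, trajectory):
    """Minimum Euclidean distance from a point obstacle to a sampled trajectory."""
    return min(math.dist(obstacle, p) for p in trajectory)

def check_and_slow(obstacle, trajectory, threshold=0.3):
    """Return (needs_replan, speed_scale): re-plan when the obstacle comes
    within the preset threshold; slow down when it is merely close."""
    d = min_distance_to_trajectory(obstacle, trajectory)
    if d < threshold:
        return True, 0.0      # abandon the trajectory and plan a new one
    if d < 2 * threshold:
        return False, 0.5     # keep the trajectory but halve the speed
    return False, 1.0

# a straight vertical C-arm path passing 0.1 m from an obstacle
path = [(0.0, 0.0, z / 10) for z in range(11)]
replan, scale = check_and_slow((0.1, 0.0, 0.5), path)
```

In a real system the distance would be computed between volumetric models of the component and the obstacle rather than between a point and a polyline, but the decision logic is the same.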
In some alternative embodiments, the processing device may control, based on at least two first images, the at least one component of the medical device so that it does not collide with the corresponding first object during movement. For example, the processing device may determine motion trajectories of the first object and/or the at least one component of the medical device based on the changes in their positions across the at least two first images, determine based on those trajectories whether a collision may occur, and, when a collision is likely, control at least one of the first object and the at least one component to change its motion trajectory. In some embodiments, the at least two first images may comprise a set of image data acquired by the image acquisition device at two specific points in time. Alternatively, the at least two first images may comprise at least one set of image data acquired at a series of time points, each image corresponding to one of the time points. The time interval between each pair of successive time points may or may not be fixed.
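Estimating trajectories from two timestamped observations and checking them for a future near-approach can be sketched as follows. The linear-extrapolation model, coordinates, and clearance value are assumptions made for illustration only:

```python
import math

def extrapolate(p0, p1, dt, horizon, step):
    """Linearly extrapolate future positions from two timestamped
    observations (an object's position in two successive first images)."""
    v = tuple((b - a) / dt for a, b in zip(p0, p1))   # estimated velocity
    n = round(horizon / step) + 1
    return [tuple(b + vi * k * step for b, vi in zip(p1, v)) for k in range(n)]

def collision_predicted(track_a, track_b, clearance):
    """A collision is predicted if the two tracks ever come within `clearance`."""
    return any(math.dist(a, b) < clearance for a, b in zip(track_a, track_b))

# a component moving in +x and a first object moving in -x toward it
comp = extrapolate((0.0, 0.0), (0.1, 0.0), dt=0.1, horizon=2.0, step=0.1)
obj = extrapolate((2.0, 0.0), (1.9, 0.0), dt=0.1, horizon=2.0, step=0.1)
risky = collision_predicted(comp, obj, clearance=0.2)
```

With more than two frames, the velocity estimate could be smoothed (e.g., by a Kalman filter) instead of taken from a single pair of images.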
In some embodiments, the processing device may acquire at least one image containing the first object and at least one image containing at least one component of the medical device, respectively. The processing device may control the at least one component of the medical device not to collide with the first object during the movement based on information of the first object and the at least one component of the medical device in the image corresponding to the first object and the image corresponding to the at least one component of the medical device. For example, the processing device may acquire the position of the first object and the at least one component of the medical device and/or the motion trajectory of the first object and the at least one component of the medical device from the at least one image corresponding to the first object and the at least one image corresponding to the at least one component of the medical device, respectively, and control the at least one component to move based on the position of the first object and the at least one component of the medical device and/or the motion trajectory thereof.
In some embodiments, an image acquisition device (e.g., image acquisition device 123) may acquire multiple sets of image data in time sequence during a scan (e.g., a DSA scan or a roadmap scan) of the target subject performed by a medical device (e.g., medical device 110). Based on these sets of image data of two or more objects, such as the first object and the at least one component of the medical device, the processing device may continuously or intermittently (e.g., periodically) monitor the motion of the objects. When a possible collision between two or more objects is detected, the processing device may control at least one of them to change its motion trajectory in real time to avoid the collision.
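A periodic monitoring loop over a time-ordered sequence of frames might look like the sketch below. The frame format (a dict of object names to 2-D positions) and the clearance value are illustrative assumptions; a real system would obtain positions from the object detector discussed earlier:

```python
import math

def monitor_frames(frames, clearance):
    """Scan a time-ordered sequence of frames (each a dict mapping object
    names to 2-D positions) and return the index of the first frame in
    which any pair of monitored objects is closer than `clearance`,
    or None if no frame poses a collision risk."""
    for i, positions in enumerate(frames):
        names = sorted(positions)
        for j in range(len(names)):
            for k in range(j + 1, len(names)):
                if math.dist(positions[names[j]], positions[names[k]]) < clearance:
                    return i
    return None

# synthetic frames: the C-arm approaches the examination table over time
frames = [
    {"c_arm": (0.0, 0.0), "table": (1.0, 0.0)},
    {"c_arm": (0.5, 0.0), "table": (1.0, 0.0)},
    {"c_arm": (0.9, 0.0), "table": (1.0, 0.0)},
]
first_risky = monitor_frames(frames, clearance=0.2)
```

Returning the frame index at which the risk first appears lets the controller react (slow down, re-plan, or alert) while the scan is still in progress.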
For ease of understanding, the control process for the medical device components is described below with reference to fig. 6, taking a C-arm device as an example of the medical device. In fig. 6, A denotes a medical staff member, 115 denotes an examination couch, S1 denotes the original motion trajectory of the C-arm device, S2 denotes the motion trajectory of the C-arm device updated based on the first image, Z denotes the first object, and F denotes the shooting region of the image acquisition device 123.
As shown in fig. 6, the image acquisition device 123 may acquire a first image including the examination table in the photographing region F thereof, the first object Z, and a C-arm device (not shown in the figure) moving along the trajectory S1. Based on this first image, the processing device may determine that a collision with the first object Z may occur if the C-arm device continues to move along its original movement trajectory S1. At this time, the processing apparatus may adjust the trajectory S1 to the obstacle avoidance trajectory S2 based on the position of the first object Z to bypass the first object Z and avoid collision therebetween.
In some alternative embodiments, upon determining based on the first image that the at least one component of the medical device and the first object are likely to collide, the processing device may send alert information to a user, and the user may manually control the at least one component or the first object based on the alert information to avoid the collision. For example, the alert information may include the position and shape of the obstacle. In some embodiments, the alert may take the form of text, voice, image, video, tactile feedback, or the like, or any combination thereof.
It should be noted that the above description of method 300 is for purposes of example and illustration only and is not intended to limit the scope of the present description. Various modifications and alterations to method 300 may occur to those skilled in the art in light of the present description; such modifications and variations remain within its scope. For example, the first image may be captured continuously or intermittently: it may be captured periodically and automatically by the image acquisition device, or captured manually by an operator at any time. As another example, an additional image acquisition device may be mounted on the medical device, and the at least one component of the medical device may be controlled based on images captured by both the device on the medical device and the device on the display screen. In some alternative embodiments, method 300 may also be applied to control the motion trajectories of other devices in the treatment room; for example, for some omni-directional automatic disinfection devices, the motion trajectory of the disinfection nozzle may be controlled.
FIG. 7 is an exemplary block diagram of a medical device control system, shown in accordance with some embodiments of the present description.
As shown in fig. 7, the medical device control system 700 may include an acquisition module 710, an image recognition module 720, and a control module 730. In some embodiments, the medical device control system 700 may be implemented by the medical device control system 100 (e.g., the processing device 140) shown in fig. 1.
The acquisition module 710 may be configured to acquire a first image comprising at least a portion of a region of a medical device. The first image is captured by an image capture device mounted on a display screen having a signal connection with the medical device. In some embodiments, the acquisition module 710 may acquire other images taken by the image acquisition device. For example, the acquisition module 710 may acquire a second image taken by the image acquisition apparatus after the display screen is moved, or an image taken by the image acquisition apparatus containing the first object and/or an image containing at least one component of the medical device, and the like.
The image recognition module 720 may be used to identify the first image information of the first image acquired by the acquisition module 710. In some embodiments, the image recognition module 720 may identify second image information of the second image. In some embodiments, the image recognition module 720 may identify image information in other images; for example, it may identify image information in an image containing at least one component of the medical device, in an image containing the first object, and the like.
The control module 730 may be used to control movement of at least one component of the medical device and/or the display screen. For example, the control module 730 may control at least one component of the medical device 110 and/or the display screen 120 to move based on the first image information identified by the image recognition module 720. In some embodiments, the control module 730 may control the image acquisition device to move based on the second image information identified by the image recognition module 720. In some embodiments, the control module 730 may also be used to control other components, such as an examination couch independent of the medical device, without limitation.
It should be understood that the system and its modules shown in FIG. 7 may be implemented in a variety of ways. For example, in some embodiments, system 700 and its modules may be implemented in hardware, software, or a combination of the two. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or processor control code, provided, for example, on a carrier medium such as a disk, CD- or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system 700 and its modules is merely for convenience of description and should not be construed as limiting the present disclosure to the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, modules may be combined in various ways or connected to other modules as sub-systems without departing from these principles. In some embodiments, the acquisition module 710, the image recognition module 720, and the control module 730 may be different modules in one system, or a single module may implement the functions of two or more of them. In some embodiments, the modules may share one storage module, or each module may have its own storage module. In some embodiments, the number of control modules 730 may equal the number of medical device components, with each component controlled independently; alternatively, one control module 730 may be shared by all medical device components. Such variations are within the scope of the present disclosure.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) based on images acquired by the image acquisition device arranged on the display screen, a medical device component can be controlled accurately and prevented from colliding with other objects; (2) the position of the display screen can be adjusted based on the position of the user, providing the operator with a more comfortable viewing angle and making it convenient to observe the condition of the target subject (e.g., lesion information of the examined subject) during diagnosis and treatment; (3) the position of the image acquisition device can be adjusted based on the position of the display screen, ensuring image stability and enabling continuous monitoring and control of the medical device components. It should be noted that different embodiments may produce different advantages; in different embodiments, any one or a combination of the above advantages, or any other advantage, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components, attributes, and the like; it should be understood that such numbers may in some instances be modified by the terms "about", "approximately", or "substantially". Unless otherwise indicated, "about", "approximately", or "substantially" indicates that a variation of ±20% in the stated number is allowed. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending on the desired properties of individual embodiments. In some embodiments, numerical parameters should be read in light of the specified significant digits and interpreted using ordinary rounding techniques. Although the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, the numerical values set forth in the specific examples are reported as precisely as possible.
The entire contents of each patent, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this specification are hereby incorporated by reference, except for any application history documents that are inconsistent with or in conflict with the contents of this specification, and any documents that limit the broadest scope of the claims of this specification (whether currently or later appended). It is to be understood that if the descriptions, definitions, and/or uses of terms in the materials accompanying this specification are inconsistent with or contrary to the contents of this specification, the descriptions, definitions, and/or uses of terms in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (11)

1. A method of controlling a medical device, the method comprising:
acquiring a first image containing at least a partial region of a medical device, wherein the first image is captured by an image acquisition device mounted on a display screen, and the display screen is in signal connection with the medical device;
identifying first image information of the first image;
controlling movement of at least one component of the medical device and/or the display screen based on the first image information.
2. The method of claim 1, wherein the image capture device is fixedly mounted on the display screen.
3. The method according to claim 1, characterized in that the image acquisition device is movable and/or rotatable relative to the display screen.
4. The method of claim 1, wherein the first image information comprises a user location;
the controlling movement of at least one component of the medical device and/or the display screen based on the first image information comprises:
controlling the display screen motion based on the user position.
5. The method of claim 4, wherein, after the movement of the display screen is completed, the display screen is oriented toward the user and/or a distance between the display screen and the user is within a predetermined range.
6. The method of claim 1, further comprising:
acquiring a second image captured by the image acquisition device;
identifying second image information of the second image;
and controlling the image acquisition device to move based on the first image information and the second image information.
7. The method of claim 1, wherein the first image information comprises information of a first object and information of the at least one component;
the controlling movement of at least one component of the medical device and/or the display screen based on the first image information comprises:
controlling the at least one component not to collide with the first object during the movement based on the information of the first object and the information of the at least one component.
8. The method of claim 7, wherein the information of the first object comprises at least one of a position, a shape, and a pose of the first object;
the information of the at least one component includes at least one of a position, a shape, and a posture of the at least one component.
9. A control system for a medical device, the system comprising:
an acquisition module, configured to acquire a first image containing at least a partial region of the medical device, wherein the first image is captured by an image acquisition device mounted on a display screen, and the display screen is in signal connection with the medical device;
an image recognition module, configured to identify first image information of the first image; and
a control module, configured to control movement of at least one component of the medical device and/or the display screen based on the first image information.
10. A control apparatus for a medical device comprising a processor for performing the method of any one of claims 1 to 8.
11. A computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of any one of claims 1 to 8.
CN202011065636.6A 2020-09-30 2020-09-30 Control method and system of medical equipment Pending CN112043299A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011065636.6A CN112043299A (en) 2020-09-30 2020-09-30 Control method and system of medical equipment


Publications (1)

Publication Number Publication Date
CN112043299A true CN112043299A (en) 2020-12-08

Family

ID=73605518

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011065636.6A Pending CN112043299A (en) 2020-09-30 2020-09-30 Control method and system of medical equipment

Country Status (1)

Country Link
CN (1) CN112043299A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102165492A (en) * 2008-09-29 2011-08-24 修复型机器人公司 Tracking of hair follicles
US20130072787A1 (en) * 2011-09-16 2013-03-21 Translucent Medical, Inc. System and method for virtually tracking a surgical tool on a movable display
CN103027699A (en) * 2011-09-30 2013-04-10 西门子公司 Method for controlling the movement of an x-ray apparatus and x-ray system
US20140357984A1 (en) * 2013-05-30 2014-12-04 Translucent Medical, Inc. System and method for displaying anatomy and devices on a movable display
CN106354161A (en) * 2016-09-26 2017-01-25 湖南晖龙股份有限公司 Robot motion path planning method
US20190117318A1 (en) * 2017-10-25 2019-04-25 Luc Gilles Charron Surgical imaging sensor and display unit, and surgical navigation system associated therewith
CN109998674A (en) * 2017-11-24 2019-07-12 西门子医疗有限公司 Medical imaging computed tomography apparatus and the method intervened based on imaging
CN110561399A (en) * 2019-09-16 2019-12-13 腾讯科技(深圳)有限公司 Auxiliary shooting device for dyskinesia condition analysis, control method and device
CN110604579A (en) * 2019-09-11 2019-12-24 腾讯科技(深圳)有限公司 Data acquisition method, device, terminal and storage medium
EP3639782A1 (en) * 2018-10-16 2020-04-22 Karl Storz SE & Co. KG Control arrangement for controlling a movement of a robot arm and treatment device comprising a control arrangement
CN111182847A (en) * 2017-09-05 2020-05-19 柯惠Lp公司 Robotic surgical system and method and computer readable medium for control thereof
CN111432744A (en) * 2017-09-21 2020-07-17 德普伊新特斯产品公司 Surgical instrument with display system


Similar Documents

Publication Publication Date Title
CN111789607B (en) Imaging system and method
CN108968996B (en) Apparatus, method and storage medium providing motion-gated medical imaging
CN110009709B (en) Medical image imaging method and system
CN111374675A (en) System and method for detecting patient state in medical imaging session
US10827999B2 (en) Dynamic analysis apparatus and system for measuring temporal changes in blood vessels
US10321886B2 (en) Workstation, medical imaging apparatus having the same and method for controlling thereof
CN113647967A (en) Control method, device and system of medical scanning equipment
US11717243B2 (en) Medical information processing apparatus, medical diagnostic apparatus, medical information processing system, medical information processing method, medical imaging apparatus, and medical treatment apparatus
CN112716509B (en) Motion control method and system for medical equipment
EP4014875B1 (en) Method for controlling a medical imaging examination of a subject, medical imaging system and computer-readable data storage medium
CN113081013B (en) Spacer scanning method, device and system
CN112043299A (en) Control method and system of medical equipment
WO2023036243A1 (en) Medical devices, methods and systems for monitoring the medical devices
WO2022022723A1 (en) Method and system for determining parameter related to medical operation
WO2024067629A1 (en) Methods, systems, and mediums for scanning
WO2022028439A1 (en) Medical device control method and system
EP3597104A1 (en) Gestural scan parameter setting
CN115147430A (en) Scanning parameter setting device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination