CN114077243B - Motion control method and system for medical auxiliary equipment

Motion control method and system for medical auxiliary equipment

Info

Publication number: CN114077243B
Authority: CN (China)
Prior art keywords: medical, information, target, medical auxiliary, auxiliary equipment
Legal status: Active (granted)
Application number: CN202010789591.0A
Other languages: Chinese (zh)
Other versions: CN114077243A
Inventors: 冯娟, 王振玮, 陆学飞, 杨乐
Current assignee: Shanghai United Imaging Healthcare Co Ltd
Original assignee: Shanghai United Imaging Healthcare Co Ltd
Filing and publication events:
Application filed by Shanghai United Imaging Healthcare Co Ltd
Priority application: CN202010789591.0A
Related PCT application: PCT/CN2021/092905 (published as WO2021228057A1)
Publication of application: CN114077243A
Application granted; publication of grant: CN114077243B
Legal status: Active

Classifications

    • G05D1/0214: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0238: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using obstacle or wall sensors
    • G05D1/0251: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using a video camera in combination with image processing means, extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision


Abstract

Embodiments of the present application disclose a motion control method and system for a medical auxiliary device. The motion control method includes: acquiring scene information of the medical auxiliary device within a preset space; determining a target position of the medical auxiliary device based at least on the scene information; planning a motion path of the medical auxiliary device according to the target position and the scene information; acquiring indication information that instructs the medical auxiliary device to move; and controlling the medical auxiliary device to move along the corresponding motion path according to the indication information. The method can automatically acquire scene information within the preset space to determine the target position of the medical auxiliary device, plan a motion path from the determined target position, and control motion along that path, avoiding the limitations of manual operation and improving the safety and efficiency of clinical operations.

Description

Motion control method and system for medical auxiliary equipment
Technical Field
The application relates to the field of medical auxiliary equipment, in particular to a motion control method and a motion control system of medical auxiliary equipment.
Background
In practical applications, a medical auxiliary device often needs to be moved to different stations to suit different clinical operations; for example, a mobile C-arm device must be moved to different positions when used for procedures or examinations such as orthopedic surgery, urinary tract imaging, or angiography. Adjusting the medical auxiliary device to the corresponding position accurately and safely greatly improves both the working efficiency of medical staff and the safety of clinical operations.
Accordingly, there is a need for a motion control method and system for medical auxiliary devices that reduces the workload of medical personnel, improves operating efficiency, and improves the safety of clinical procedures.
Disclosure of Invention
One embodiment of the present application provides a motion control method for a medical auxiliary device, the method including: acquiring scene information of the medical auxiliary device within a preset space; determining a target position of the medical auxiliary device based at least on the scene information; planning a motion path of the medical auxiliary device according to the target position and the scene information; acquiring indication information that instructs the medical auxiliary device to move; and controlling the medical auxiliary device to move along the corresponding motion path according to the indication information.
One embodiment of the present application provides a motion control system for a medical auxiliary device, the system comprising: a scene information acquisition module for acquiring scene information of the medical auxiliary device within a preset space; a target location determination module for determining a target position of the medical auxiliary device based at least on the scene information; a motion path planning module for planning a motion path of the medical auxiliary device according to the target position and the scene information; an indication information acquisition module for acquiring indication information that instructs the medical auxiliary device to move; and a motion control module for controlling the medical auxiliary device to move along the corresponding motion path according to the indication information.
One embodiment of the present application also provides a motion control apparatus for a medical auxiliary device, including at least one processing device and at least one storage device; the at least one storage device is configured to store computer instructions; the at least one processing device is configured to execute at least some of the computer instructions to perform the operations described above.
One of the embodiments of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, perform the operations described above.
Drawings
The application is further described by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in the drawings, like numerals represent like structures, wherein:
FIG. 1 is a schematic illustration of an application scenario of a motion control system of a medical assistance device according to some embodiments of the present application;
FIG. 2 is an exemplary flow chart of a method of motion control of a medical assistance device according to some embodiments of the application;
FIG. 3 is an exemplary block diagram of a motion control system of a medical assistance device according to some embodiments of the application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. The drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art can apply the present application to other similar situations according to these drawings without inventive effort. Unless otherwise apparent from the context or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," and/or "module" as used herein is one way of distinguishing between different components, elements, parts, portions, or assemblies at different levels. The words may be replaced by other expressions that achieve the same purpose.
As used in the specification and in the claims, the terms "a," "an," and/or "the" are not specific to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
Flowcharts are used in the present application to describe the operations performed by a system according to embodiments of the present application. It should be appreciated that the operations are not necessarily performed precisely in order; rather, steps may be processed in reverse order or simultaneously, and other operations may be added to, or removed from, these processes.
Fig. 1 is a schematic view of an application scenario of a motion control system of a medical assistance device according to some embodiments of the present application. The medical assistance device motion control system 100 may include a medical assistance device 110, a network 120, at least one terminal 130, a processing device 140, a storage device 150, and a medical bed 160. The various components in the system 100 may be interconnected by a network 120. For example, the medical assistance device 110 and the at least one terminal 130 may be connected or in communication via the network 120.
In some embodiments, the medical assistance device 110 may include imaging devices, analysis devices, other medical devices (e.g., surgical assistance devices), and the like. For example, the medical auxiliary device 110 may include a digital radiography (DR) device, a computed radiography (CR) device, a digital fluorography (DF) device, a biochemical immunoassay analyzer, a CT scanner, a magnetic resonance scanner, an electrocardiograph, a B-mode ultrasound device, an endoscope and its scrubber-sterilizer, a shadowless lamp, a mobile C-arm, a medical ceiling pendant, etc. In some embodiments, the medical assistance device 110 may be a C-arm device (or C-arm) that may include a gantry, a detector, a detection region, and a radiation generating apparatus (e.g., a radiation source). The gantry may be used to support the detector and the radiation generating apparatus. The medical subject may be placed on a medical bed 160 for scanning. The medical subject may include a patient, a phantom, or another scanned object. The bed surface of the medical bed 160 may be parallel to the ground. The radiation generating apparatus may emit X-rays toward the medical subject. By scanning the medical subject, the medical assistance device 110 may acquire scan data to generate (or reconstruct) an image.
Network 120 may include any suitable network capable of facilitating the exchange of information and/or data for the medical auxiliary device motion control system 100. In some embodiments, at least one component of the medical auxiliary device motion control system 100 (e.g., the medical auxiliary device 110, the processing device 140, the storage device 150, the at least one terminal 130) may exchange information and/or data with at least one other component of the system via the network 120. For example, the processing device 140 may obtain, through the network 120, information from the medical auxiliary device 110 and/or from a camera device at a fixed location in the preset space (e.g., surrounding environment information, scene information of the preset space, etc.). For another example, the processing device 140 may obtain indication information of a medical practitioner (e.g., a doctor) from at least one terminal 130 via the network 120. Network 120 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN)), a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. For example, network 120 may include a wired network, a fiber optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, etc., or any combination thereof. In some embodiments, network 120 may include at least one network access point. For example, network 120 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which at least one component of the medical auxiliary device motion control system 100 may connect to network 120 to exchange data and/or information.
The at least one terminal 130 may be in communication and/or connected with the medical assistance device 110, the processing device 140, and/or the storage device 150. For example, a healthcare worker may enter indication information (e.g., image information, text information, etc.) via at least one terminal 130 to control movement of the medical assistance device 110. For another example, a healthcare worker may enter information related to a medical plan (e.g., protocol information of a medical task, etc.) through at least one terminal 130, and the processing device 140 may store it in the storage device 150. For another example, the target position and the motion path determined by the processing device 140 may be displayed on the terminal 130. In some embodiments, the at least one terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof. For example, the mobile device 131 may include a mobile control handle, a personal digital assistant (PDA), a smart phone, or the like, or any combination thereof.
In some embodiments, at least one terminal 130 may include an input device, an output device, and the like. Input may be provided via keyboard, touch screen (e.g., with haptic or tactile feedback), voice, eye tracking, gesture tracking, a brain monitoring system, image, video, or any other similar mechanism. Input information received via the input device may be transferred, for example via a bus, to the processing device 140 for further processing. Other types of input devices may include cursor control devices, such as a mouse, a trackball, or cursor direction keys. In some embodiments, the healthcare worker may enter relevant information of a medical plan via the input device. In some embodiments, the healthcare worker may also enter indication information (e.g., image information, text information, instruction selection information, etc.) via the input device to control movement of the medical assistance device 110; for example, the healthcare worker may direct the medical assistance device 110 to scan the chest by entering a picture of the medical subject's chest via the input device. The output device may include a display, a speaker, a printer, etc., or any combination thereof. The output device may be used to output the target position, the motion path, etc., as determined by the processing device 140. In some embodiments, at least one terminal 130 may be part of the processing device 140.
In other embodiments, the motion control system may not include the terminal 130. For example, after the processing device 140 acquires the scene information of the preset space, it determines the target position from the scene information, and once the motion path is determined, it directly instructs the medical auxiliary device 110 to move, without control by medical staff through the terminal 130. For another example, a medical staff member may instruct the medical assistance device 110 to move through limb operation indication information, which may be captured by a camera device at a fixed position in the preset space and/or on the medical assistance device 110; in this way, too, the terminal 130 is not required to instruct the medical auxiliary device to move.
The processing device 140 may process data and/or information obtained from the medical assistance device 110, the storage device 150, the at least one terminal 130, or other components of the medical assistance device motion control system 100. For example, processing device 140 may retrieve information from storage device 150 regarding the target location of medical assistance device 110. In some embodiments, the processing device 140 may be a single server or a group of servers. The server farm may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data from the medical auxiliary device 110, the storage device 150, and/or the at least one terminal 130 via the network 120. As another example, the processing device 140 may be directly connected to the medical assistance device 110, the at least one terminal 130, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
Storage device 150 may store data, instructions, and/or any other information, for example, scene information of a preset space, protocol information of a medical task, a target position of the medical assistance device 110, a motion path, and the like. In some embodiments, the storage device 150 may store data obtained from the medical auxiliary device 110, the at least one terminal 130, and/or the processing device 140. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 executes or uses to accomplish the exemplary methods described herein. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, storage device 150 may be implemented on a cloud platform.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with at least one other component (e.g., the processing device 140, the at least one terminal 130) in the medical auxiliary device motion control system 100. At least one component of the medical accessory motion control system 100 may access data (e.g., target locations, motion paths, etc., as determined by the processing device 140) or instructions (e.g., various types of instructional information, etc.) stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be part of the processing device 140.
It should be noted that the foregoing description is provided for the purpose of illustration only and is not intended to limit the scope of the present application. Many variations and modifications will be apparent to those of ordinary skill in the art given the benefit of this disclosure. The features, structures, methods, and other characteristics of the described exemplary embodiments may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 150 may be a data storage device on a cloud computing platform, such as a public cloud, a private cloud, a community cloud, a hybrid cloud, and the like. However, such changes and modifications do not depart from the scope of the present application.
Currently, adjustment of the position of the medical auxiliary device 110 is mainly controlled manually, i.e., it requires hands-on operation by medical staff, as with current mobile C-arm devices. Specifically, when a mobile C-arm device is used for procedures or examinations such as orthopedic surgery, urinary tract imaging, or cardiovascular angiography, medical staff usually steer it to the corresponding station for clinical operations (e.g., surgery, scanning, etc.). Because space in an operating or examination room is limited and many devices are in use during a procedure, medical staff cannot fully and accurately observe the environment within their limited field of view, and their proficiency varies across different medical auxiliary devices 110; some devices (e.g., a mobile C-arm) are large, heavy, and not very maneuverable. Under these conditions, the medical auxiliary device 110 may collide with other equipment or with medical subjects (e.g., medical staff, patients, etc.), or the station it reaches may not meet the requirements of the examination or operation.
In some embodiments, the motion control method for the medical auxiliary device 110 provided by the present application employs techniques such as automatic scene information acquisition, intelligent motion path planning, and intelligent obstacle avoidance, so that the medical auxiliary device 110 is better adapted to the clinical operating environment, avoiding the safety risks and inefficiency caused by the limitations of manual operation by medical staff.
Fig. 2 is an exemplary flow chart of a method of motion control of a medical assistance device according to some embodiments of the application. In particular, the motion control method 200 of the medical assistance device 110 may be performed by the processing device 140. For example, the motion control method 200 of the medical assistance device 110 may be stored in a storage means (e.g., the storage device 150) in the form of a program or instructions that, when executed by the motion control system 100 (e.g., the processing device 140) of the medical assistance device 110, may implement the motion control method 200 of the medical assistance device 110.
Step 210, acquiring scene information of the medical auxiliary equipment in a preset space. In some embodiments, step 210 is performed by scene information acquisition module 310.
In step 210, the processing device 140 may acquire scene information of the medical assistance device 110 within a preset space. In particular, step 210 may be performed by a scene information acquisition module 310 in the processing device 140.
The preset space may be a range of areas in which the medical assistance device 110 can move. In some embodiments, the preset space may be an area of the medical subject undergoing surgery and/or examination, e.g., an operating room, a radiological examination room, etc.
The scene information may be understood as environmental information within the preset space. For example, the scene information may include information on medical subjects (e.g., patients, examinees, healthcare workers, etc.) and non-medical subjects (e.g., medical assistance device 110 information, obstacle information, etc.) located in the preset space. Further description of obstacle information may be found in other embodiments of the present application and is not repeated here. In some embodiments, the scene information may include position information of one or more subjects in the preset space and time information corresponding to the position information; for example, the scene information may include the specific position of a medical subject in the preset space at a certain time. In some embodiments, the position information of the one or more subjects may be represented by coordinates. Specifically, taking a subject fixed with respect to the preset space (for example, the medical bed 160) as the origin, a rectangular spatial coordinate system may be established, and the position of each subject in the preset space may be represented by its distances from the origin along the X, Y, and Z axes, which serve as that subject's coordinates.
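As a minimal illustration of the coordinate representation just described, the following Python sketch (not part of the patent; the subject names, units, and fields are assumptions) pairs one subject's position, measured from a bed-fixed origin, with the observation time:

```python
from dataclasses import dataclass
import time

@dataclass
class SceneRecord:
    """Position of one subject, relative to an origin fixed to the medical bed."""
    subject_id: str   # e.g. "patient", "c_arm" (names are assumptions)
    x: float          # distance from the origin along the X axis, in meters
    y: float          # distance along the Y axis
    z: float          # distance along the Z axis
    timestamp: float  # time at which the position was observed

# Example: a device observed 2.5 m along X from the bed origin.
record = SceneRecord("c_arm", 2.5, 0.0, 1.1, time.time())
print(record)
```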
In some embodiments, the scene information may be acquired by a camera device, which may be arranged at a fixed location within the preset space or on the medical auxiliary device 110. The camera device may acquire image information of all subjects (both medical and non-medical) in the preset space, for example video images and/or still pictures, and transmit the image information together with its corresponding time information to the processing device 140. In some embodiments, the processing device 140 may create a stereoscopic image (e.g., a three-dimensional map, etc.) of the preset space by analyzing the image information and obtain the scene information corresponding to each subject. Specifically, the stereoscopic image of the preset space includes a plurality of pixels, each subject in the preset space being formed by a plurality of corresponding pixels; the processing device 140 assigns a value to each pixel according to the result of analyzing the image information, thereby obtaining the scene information corresponding to each subject.
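The per-pixel assignment could look roughly like the sketch below, assuming the preset space is discretized into a coarse occupancy grid; the room size, resolution, and labels are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Label values for grid cells; the categories are assumptions.
FREE, BED, PATIENT, C_ARM = 0, 1, 2, 3

RES = 0.1  # grid resolution: 0.1 m per cell
# A 6 m x 6 m x 3 m room discretized at 0.1 m -> 60 x 60 x 30 cells.
grid = np.zeros((60, 60, 30), dtype=np.uint8)

def assign(grid, label, x_range, y_range, z_range, res=RES):
    """Assign a subject label to every cell inside its bounding box."""
    xs = slice(int(x_range[0] / res), int(x_range[1] / res))
    ys = slice(int(y_range[0] / res), int(y_range[1] / res))
    zs = slice(int(z_range[0] / res), int(z_range[1] / res))
    grid[xs, ys, zs] = label

# Medical bed: 2.0 m long, 0.8 m wide, 0.9 m high, near the room center.
assign(grid, BED, (2.0, 4.0), (2.6, 3.4), (0.0, 0.9))
print("occupied cells:", int((grid > 0).sum()))
```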
In some embodiments, the camera device may comprise a plurality of cameras, arranged according to a certain rule. For example, the separation distance between adjacent cameras may be the same. For another example, the cameras may be arranged in a line array or a plane array. Cameras may be provided both at fixed positions in the preset space and on the medical auxiliary equipment 110 to perform different functions; for example, a camera at a fixed position may acquire scene information farther from the medical auxiliary equipment 110, while a camera on the medical auxiliary equipment 110 may acquire scene information closer to it (e.g., surrounding environment information). In some embodiments, the cameras may capture images periodically or in real time to obtain scene information, for example at fixed intervals of 100 milliseconds, 500 milliseconds, 1 second, 2 seconds, etc. In some embodiments, the processing device 140 may adjust parameters of the camera device based on the acquired environmental information so as to acquire more comprehensive and accurate scene information; exemplary parameters include the camera's position, facing direction, angle, aperture, focal length, sensitivity, white balance, and the like.
In some embodiments, in addition to obtaining scene information through the camera device, the processing device 140 may combine other technologies, such as infrared recognition, laser recognition, or ultrasonic recognition, to obtain more accurate and comprehensive scene information.
Step 220, determining a target location of the medical assistance device based at least on the scene information. In some embodiments, step 220 is performed by the target location determination module 320.
In step 220, the processing device 140 may determine a target position of the medical assistance device 110 based at least on the scene information. In particular, step 220 may be performed by the target location determination module 320 in the processing device 140. In some embodiments, the processing device 140 may determine the target position of the medical assistance device 110 based on the scene information alone; details may be found in step 220a. In other embodiments, the processing device 140 may determine the target position according to both the protocol information of the medical task and the scene information; details may be found in step 220b.
Step 220a, determining a target position of the medical auxiliary device 110 according to the scene information.
In step 220a, the processing device 140 may determine a target position of the medical assistance device 110 from the scene information. In particular, step 220a may be performed by the target location determination module 320 in the processing device 140.
The target position may be understood as the end point of a particular movement of the medical assistance device 110 while performing a medical task. For example, when a chest scan is required, the medical assistance device 110 may start scanning once it has moved to a certain designated position; likewise, when a medical task is completed, the medical assistance device 110 may move to a certain designated position to wait (e.g., stop working, await the next clinical operation, etc.). That designated position is the target position.
In some embodiments, the target position includes one or more target sub-positions and/or target remote positions. A target sub-position is a position to which the medical assistance device 110 must be moved or adjusted before performing a clinical operation (e.g., assisting a suturing operation, a chest scan, etc.) so that the operation can be carried out; the target sub-position may thus be understood as the position of the medical assistance device 110 while performing a clinical operation. It will be appreciated that a medical task may include one or more clinical operations, and that a medical task may be performed by different medical auxiliary devices 110 or by the same device; for example, a C-arm may first perform an X-ray fluoroscopy of a medical subject and then assist a surgical procedure (e.g., assist a suturing operation). In some embodiments, the position of the medical assistance device 110 varies as different clinical operations are performed, so the target position may include multiple target sub-positions. For example, while the medical subject lies flat on the medical bed 160 for a chest scan, the medical assistance device 110 needs to be located near the chest of the medical subject, which is one target sub-position; for another example, when assisting a leg suturing operation, the medical auxiliary device 110 (e.g., a shadowless lamp, etc.) needs to be positioned near the leg of the medical subject, which is another target sub-position.
A target remote position is an idle position, away from the medical subject or the medical bed 160, where the medical assistance device 110 stands by when a clinical operation or medical task does not require it. For example, after a scan is completed and the medical auxiliary device 110 is no longer needed, it moves to a free area (for example, a corner) to stand by; that corner is the target remote position. In some embodiments, there may be multiple target remote positions, e.g., a corner, a doorway, etc.
In some embodiments, the processing device 140 may preset the target position based on the scene information. "Preset" here means a target position determined before the medical task is performed or before related information of the medical task (e.g., protocol information, etc.) is acquired. In some embodiments, determining the target position from the scene information may include the processing device 140 presetting one or more target sub-positions and/or target remote positions according to the scene information. In this embodiment, after the scene information is acquired, the processing device 140 may preset one or more target sub-positions and/or target remote positions suitable for most, or the more common, clinical operations. For example, the acquired scene information may include relevant information of the medical bed 160 (e.g., its specific position, length, width, height, etc.) or of a human body or phantom, and one or more target sub-positions may be set directly from the information of the medical bed 160. Exemplary target sub-positions include a first station for head scanning at the head section of the medical bed 160 (the end where the medical subject's head lies when lying on the bed); a second station for chest scanning at the middle section of the medical bed 160 (the position of the chest when the medical subject lies flat on the bed); and a third station for assisting leg suturing operations at the tail section of the medical bed 160 (the end where the medical subject's feet lie). When the medical assistance device 110 moves to a preset station, the corresponding clinical operation may begin (e.g., a chest scan may be performed after moving to the second station). For another example, after the scene information is acquired, a free area of the preset space may be taken directly as the target remote position. Preset target sub-positions suit most clinical operations and are therefore broadly applicable; when the clinical operation to be performed is common (such as an assisted leg suturing operation or a chest scan), presetting the target position removes the step of determining it and improves the execution efficiency of medical tasks. For another example, when the scene information includes human body or phantom information and the corresponding body part is identified from it, the station may be determined directly from that part.
In some embodiments, the target remote position may also be preset, for example fixed in a certain area of the preset space (such as a particular corner). The medical assistance device 110 can then automatically return to the target remote position after each medical task, or each clinical operation within a medical task, without the target position being determined repeatedly, which effectively improves the efficiency of clinical operations.
In some embodiments, determining the target position of the medical device from the scene information may further comprise: determining a target position corresponding to the station for the current operation type according to the current scene information and a preset spatial threshold. A preset target position suits most clinical operations, but in some cases it may be occupied by other equipment or unsuitable for the current clinical operation. The processing device 140 may therefore also apply a preset spatial threshold when determining the target position, further improving the efficiency of the clinical operation. The preset spatial threshold refers to a preset spatial volume capable of accommodating the medical auxiliary device 110, for example, 1 cubic meter, 2 cubic meters, 3 cubic meters, etc. In some embodiments, the spatial threshold may be determined from the volume of the medical assistance device 110 and should be at least equal to that volume, so that any location in the preset space that reaches the threshold is guaranteed to accommodate the device.
In some embodiments, the processing device 140 may determine the target position corresponding to the station for the current surgical operation type based on the current scene information and the preset spatial threshold. Specifically, after acquiring the current scene information and the preset spatial threshold (for example, 2 cubic meters), the processing device 140 may first screen out the positions whose free volume reaches the threshold (for example, the first station, the second station, the third station, etc.), and then select from these the one or more positions corresponding to the current operation type (for example, the third station, or a second station close to it, when the current operation is leg suturing) as the target position.
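This screening step might be sketched as follows, under assumed data: each candidate station carries the free volume currently around it, stations below the preset spatial threshold are discarded, and the remainder are matched against the current operation type. The station names, volumes, and operation labels are assumptions for illustration:

```python
# Candidate stations with the free volume currently around them (assumed data).
stations = [
    {"name": "first_station",  "free_volume_m3": 2.5, "operations": {"head_scan"}},
    {"name": "second_station", "free_volume_m3": 3.0, "operations": {"chest_scan", "leg_suture"}},
    {"name": "third_station",  "free_volume_m3": 1.2, "operations": {"leg_suture"}},
]

# The threshold should be at least the volume of the medical auxiliary device.
SPACE_THRESHOLD_M3 = 2.0

def candidate_targets(stations, operation):
    """Keep stations roomy enough for the device, then match the operation type."""
    roomy = [s for s in stations if s["free_volume_m3"] >= SPACE_THRESHOLD_M3]
    return [s["name"] for s in roomy if operation in s["operations"]]

# The third station is too cramped, so a nearby second station is chosen instead.
print(candidate_targets(stations, "leg_suture"))  # ['second_station']
```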
Step 220b, determining the target position of the medical auxiliary equipment according to the protocol information and the scene information. In some embodiments, step 220b may be performed by the target location determination module 320.
In step 220b, the processing device 140 may determine a target position of the medical assistance device 110 based on the protocol information and the scene information. In particular, step 220b may be performed by the target location determination module 320 in the processing device 140.
In some embodiments, the processing device 140 may obtain protocol information of the medical task and determine the target position based on the protocol information and the scene information. In some embodiments, the protocol information reflects one or more clinical operating stations in the medical task. For example, for a medical task requiring a chest scan followed by a leg suturing operation, the protocol information may include "the medical auxiliary device 110 performs a chest scan at the chest position of the medical subject" and "the medical auxiliary device 110 performs a leg scan at the leg position of the medical subject and assists the leg suturing operation". Such protocol information reflects several clinical operating stations in the medical task, e.g., the "chest position" and the "leg position". In some embodiments, the processing device 140 may derive the target position corresponding to each clinical operating station from the stations identified in the protocol information together with the scene information; the corresponding clinical operation may begin once the medical assistance device 110 has moved to the target position. For example, on analyzing the clinical operating station "chest position", the processing device 140 determines that the medical auxiliary device 110 needs to move to the middle section of the medical bed 160, so that section becomes the target position; alternatively, the corresponding part of the human body may serve as the target position of the imaging assembly, and the medical assistance device 110 moves to that position. For another example, on analyzing the "leg position", the medical assistance device 110 needs to move to the tail section of the medical bed 160, so that section becomes the target position. In some embodiments, the protocol information may be retrieved by the processing device 140 directly from medical tasks stored in the storage device 150, or may be entered manually by a healthcare worker through the terminal 130.
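A hypothetical resolution of protocol stations to target positions is sketched below; the station names and bed-section identifiers are assumptions chosen to mirror the examples above:

```python
# Hypothetical mapping from stations named in protocol information
# to target positions expressed as bed sections.
PROTOCOL_TO_TARGET = {
    "head position":  "bed_head_section",
    "chest position": "bed_middle_section",
    "leg position":   "bed_tail_section",
}

def targets_from_protocol(protocol_stations):
    """Resolve each clinical operating station to a target position, in task order."""
    return [PROTOCOL_TO_TARGET[s] for s in protocol_stations if s in PROTOCOL_TO_TARGET]

print(targets_from_protocol(["chest position", "leg position"]))
# ['bed_middle_section', 'bed_tail_section']
```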
In some embodiments, the processing device 140 may determine different target positions based on the acquired protocol information, since the clinical operations included in each medical task differ and the corresponding protocol information may differ as well. For example, a first medical task and a second medical task may both include a leg scan and a suturing operation, but the leg scan of the first task targets the sole of the foot while that of the second task targets the lower leg; the protocol information of the first task may therefore read "sole position" and that of the second task "lower leg position", so the processing device 140 determines different target positions when analyzing the protocol information of the two tasks.
It should be noted that the above description of target-position determination in steps 220a and 220b is only a specific example and should not be considered the only viable embodiment. It will be apparent to those skilled in the art that various modifications and changes in the form and details of determining the target position may be made without departing from this principle, while remaining within the scope of the above description. For example, the processing device 140 may determine the target position through step 220a alone, through step 220b alone, or through both steps, selecting the optimal target position by analyzing and comparing the positions the two steps produce. Such variations are within the scope of the application.
Step 230, planning a motion path of the medical auxiliary device 110 according to the target position and the scene information. In some embodiments, step 230 may be performed by motion path planning module 330.
In step 230, the processing device 140 may plan a motion path for the medical assistance device 110 based on the target position and the scene information. In particular, step 230 may be performed by the motion path planning module 330 in the processing device 140.
The motion path of the medical assistance device 110 indicates the route or trajectory of the device from its starting position to the target position. The processing device 140 may determine the path pointing from the device's starting position to the target position as its motion path. In some embodiments, the starting position of one motion path may be the target position of the previous one, i.e., the target position of a first movement serves as the starting position of a second. For example, suppose the processing device 140 determines a first station for a brain scan and a second station for a leg scan, and according to the medical task the medical auxiliary device 110 performs the brain scan first and then the leg scan; after the brain scan is completed, the device must move from the first station to the second, and the first station is then the starting position of its motion path.
In some embodiments, the motion path includes one or more motion sub-paths between target sub-positions and/or between a target sub-position and a target remote position. A motion sub-path between target sub-positions may refer to a route or trajectory of the medical assistance device 110 between clinical operating stations; for example, the route from the first station (the head section of the medical bed 160) to the second station (the middle section). A motion sub-path between a target sub-position and a target remote position may be the route from a clinical operating station to the standby position, traveled when the medical assistance device 110 must wait, after completing a clinical operation, for the next medical task or clinical operation. For example, after the medical assistance device 110 completes a brain scan at the first station and is no longer needed, it moves to a free area (e.g., a corner) to stand by; the corner is the target remote position, and the route or trajectory from the first station to the corner is likewise a motion sub-path.
In some embodiments, the number of motion sub-paths is related to the number of target positions (target sub-positions and/or target remote positions). For example, when there is only one target position (e.g., a single target sub-position or a single target remote position), there is typically only one motion sub-path; when there are multiple target positions (e.g., two or more target sub-positions), there are typically multiple motion sub-paths.
In some embodiments, the processing device 140 may preset the one or more motion sub-paths according to the scene information and the preset target sub-positions and/or target remote positions. In this embodiment, the processing device 140 may determine the motion sub-paths from one or more preset target positions (for example, the first, second, and third stations); before a medical task is acquired or performed, it may directly derive one or more motion sub-paths by taking the preset target positions as the end points of particular movements of the medical auxiliary device 110. Target positions (including target sub-positions and target remote positions) and motion paths (including motion sub-paths) preset by the processing device 140 are generally suitable for most clinical operations and most medical tasks; they are broadly applicable, save the path-planning step, and improve the efficiency of clinical operations.
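Chaining of motion sub-paths, where each target position becomes the starting position of the next movement, could be sketched as follows; the station names are illustrative, and planning between the waypoints themselves would use the obstacle-aware logic described later:

```python
def chain_sub_paths(start, targets):
    """Return (start, end) pairs for consecutive motion sub-paths."""
    sub_paths, current = [], start
    for target in targets:
        sub_paths.append((current, target))
        current = target  # the last target becomes the next starting position
    return sub_paths

# Brain scan at the first station, leg scan at the second, then back to standby.
print(chain_sub_paths("target_remote_position",
                      ["first_station", "second_station", "target_remote_position"]))
```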
Step 240, obtaining indication information indicative of the motion of the medical assistance device 110. In some embodiments, step 240 may be performed by the indication information acquisition module 340.
In step 240, the processing device 140 may obtain indication information that instructs the medical assistance device 110 to move. In particular, step 240 may be performed by the indication information acquisition module 340 in the processing device 140.
The indication information may be instructions for instructing the medical assistance device 110 to perform a movement. In some embodiments, the indication information may be issued by a healthcare worker, for example, instructions or programs entered by the healthcare worker, or the like.
In some embodiments, the indication information includes input information from a healthcare worker; the input information includes at least one of voice information, text information, instruction selection information, image information, and limb operation indication information. Voice information may be speech, spoken commands, dialogue, and the like issued by medical staff; exemplary voice information is "move to the first station", "return to the standby position", or "start chest scan". Text information may be text entered by the healthcare worker into the terminal 130 (e.g., a control handle, a computer, etc.); exemplary text information is "move to the first station and then to the second station" or "perform suturing operation". Instruction selection information may be one or more instructions displayed on the terminal 130 for the healthcare worker to select, different instructions corresponding to different clinical operations; for example, instruction one may correspond to moving to the first station and instruction two to a chest scan. Image information may be a picture or image entered into the terminal 130 by a healthcare worker; for example, entering an image of the chest may instruct the medical assistance device 110 to X-ray scan the chest. Limb operation indication information is information conveyed by the healthcare worker's body movements, different movements corresponding to different instructions; for example, the healthcare worker pointing toward the tail section (or third station) of the medical bed 160 indicates that the medical assistance device 110 should move there, and the healthcare worker putting down the surgical tool indicates that the operation is over and the device should move to the standby position (e.g., the target remote position). In some embodiments, the meaning of the indication information is related to the protocol information, and the same indication may mean different things under different protocol information. For example, when the hand of the medical staff points toward the tail section of the medical bed 160, the target position of the movement differs depending on whether the first protocol information specifies a leg suturing operation or the second a hand suturing operation; likewise, the healthcare worker putting down the surgical tool may indicate either that the operation is completed and the device should return to the standby position, or merely that the surgical tool is being changed. In some embodiments, the healthcare worker may use voice information, text information, instruction selection information, image information, and limb operation indication information simultaneously (for example, voice together with limb gestures) to instruct the medical auxiliary device 110 and improve clinical operating efficiency.
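Purely for illustration, the dispatch of different indication-information types to motion commands might look like the sketch below; the keywords, gesture labels, and command names are assumptions, not an interface defined by the patent:

```python
def interpret_indication(kind, payload):
    """Map one piece of indication information to a motion command."""
    if kind == "voice" and "first station" in payload:
        return ("move_to", "first_station")
    if kind == "text" and "standby" in payload:
        return ("move_to", "target_remote_position")
    if kind == "gesture" and payload == "point_to_bed_tail":
        return ("move_to", "bed_tail_section")
    if kind == "gesture" and payload == "put_down_surgical_tool":
        # Under some protocols this could instead mean "change tool".
        return ("move_to", "target_remote_position")
    return ("no_op", None)

print(interpret_indication("voice", "move to the first station"))
print(interpret_indication("gesture", "put_down_surgical_tool"))
```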
In some embodiments, the processing device 140 may determine whether the current surgery-type operation is complete based on the healthcare worker's limb operation indication information, and then determine the next target position according to the protocol information and the current scene information. When the processing device 140 acquires the medical task, clinical operation information corresponding to the limb operation indication information (e.g., surgical operation type, clinical operating station, etc.) may be determined from the protocol information of the task and stored in the storage device 150 for retrieval at any time. Specifically, after the processing device 140 acquires the corresponding protocol information, the position of the lower leg when the medical subject lies on the medical bed 160 is determined as the target position of a lower-leg suturing operation; the processing device 140 can judge from the limb operation indication information issued by the medical staff whether the lower-leg suturing is finished, and when it is, determine the target position of the following thigh suturing operation. For example, the medical practitioner may point to the thigh of the medical subject, lay down the suturing tool, or make an "OK" gesture; the limb operation indication information is captured by the camera device and transmitted to the processing device 140 for analysis, which determines that the lower-leg suturing is completed and a thigh suturing operation is required. The processing device 140 may then determine the thigh position of the medical subject lying on the medical bed 160 as the target position for the thigh suturing operation, so that the medical auxiliary device 110 can smoothly assist the next operation. In some embodiments, the meaning of limb operation indication information may be customized by the healthcare worker; for example, "lower the suturing tool for 10 seconds" may be set as an indication that the current surgery-type operation has ended. In other embodiments, the meaning of limb operation indication information may also be determined by the processing device 140 after analyzing historical limb operation indication information stored in the storage device.
In some embodiments, the indication information may further include instructions or information automatically issued by the processing device 140 according to the medical task, for example, the processing device 140 may determine the target position and the movement path after acquiring the protocol information of the medical task, and when the target position and the movement path are determined, the processing device 140 may issue the indication information to the medical auxiliary device 110 to instruct the medical auxiliary device 110 to move to the target position corresponding to the clinical operation station without the medical staff issuing the indication information.
Step 250, controlling the medical auxiliary device 110 to move along the corresponding movement path according to the indication information. In some embodiments, step 250 may be performed by motion control module 350.
In step 250, the processing device 140 may control the medical auxiliary device 110 to move along the corresponding movement path according to the indication information. In particular, step 250 may be performed by a motion control module 350 in the processing device 140.
The medical auxiliary device 110 may include a drive mechanism, and the processing device 140 may send the indication information to the drive mechanism of the medical auxiliary device 110. In some embodiments, the drive mechanism may, based on the received indication information, drive a movement mechanism (e.g., wheels, tracks, mechanical feet, etc.) of the medical auxiliary device 110 to move the device along the movement path to the target position. In some embodiments, the processing device 140 may determine the speed at which the medical auxiliary device 110 moves. For example, the processing device 140 may control the medical auxiliary device 110 to move to the target position along the movement path at a constant speed (e.g., 1 m/s, 2 m/s, etc.). In some embodiments, during movement of the medical auxiliary device 110 to the target position, the processing device 140 may determine the speed of the medical auxiliary device 110 based on its surrounding environment information. For example, there may be an obstacle around the medical auxiliary device 110; the closer the obstacle is to the medical auxiliary device 110, the slower the medical auxiliary device 110 moves.
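A minimal sketch of such distance-dependent speed control is given below: the closer the nearest obstacle, the slower the device moves. The linear ramp and the specific radii are illustrative assumptions, not values taken from the disclosure.

def movement_speed(nearest_obstacle_m,
                   max_speed=1.0,    # e.g., 1 m/s constant cruise speed
                   slow_radius=2.0,  # begin slowing inside this radius (assumed)
                   stop_radius=0.5): # stop entirely inside this radius (assumed)
    """Scale speed linearly from max_speed down to 0 as an obstacle nears."""
    if nearest_obstacle_m >= slow_radius:
        return max_speed
    if nearest_obstacle_m <= stop_radius:
        return 0.0  # stop and wait, e.g., when the obstacle is a person
    fraction = (nearest_obstacle_m - stop_radius) / (slow_radius - stop_radius)
    return max_speed * fraction

for d in (3.0, 1.5, 0.75, 0.4):
    print(f"obstacle at {d} m -> speed {movement_speed(d):.2f} m/s")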
The surrounding environment information may be environment information within a space centered on the medical auxiliary device 110 with a radius equal to a first threshold (e.g., 2 m, 3 m, 5 m, etc.), and may include obstacle information and non-obstacle information (e.g., road surface information, etc.) in the direction in which the medical auxiliary device 110 moves toward the target position. In some embodiments, the processing device 140 may obtain the surrounding environment information of the medical auxiliary device 110 and determine the motion path of the medical auxiliary device 110 based on that information, where the surrounding environment information includes obstacle information in the direction of movement toward the target position. In some embodiments, the surrounding environment information is obtained by an identification device disposed on the medical auxiliary device 110; the identification device may include an infrared identification device, a laser identification device, or an image identification device. For example, the infrared identification device may include an infrared scanner, an infrared range finder, or the like; the laser identification device may include a laser scanner or the like; and the image identification device may include an imaging device (e.g., a camera) or the like. In some embodiments, one or more cameras may be provided on the medical auxiliary device 110 to acquire images of its surroundings, and the processing device 140 may analyze the acquired images to identify the surrounding environment information of the medical auxiliary device 110, for example, to identify whether a subject is an obstacle.
In some embodiments, an obstacle may refer to a subject whose distance from the medical auxiliary device 110 is within a second threshold (e.g., 0.5 m, 1 m, 1.5 m, etc.); that is, a subject that comes within the second threshold of the medical auxiliary device 110 during its movement toward the target position may be referred to as an obstacle. In some embodiments, the obstacle may be a medical subject (e.g., a person), a non-medical subject (e.g., another medical auxiliary device), or the like. In some embodiments, whether a subject is an obstacle may also be determined by whether the subject moves within a time threshold; for example, if the subject is within the second threshold (e.g., 0.5 m, 1 m, 1.5 m, etc.) of the medical auxiliary device 110 and does not move within the time threshold (e.g., 5 s, 10 s, 15 s, etc.), the subject may be considered an obstacle. In some embodiments, the obstacle information may include whether an obstacle is present, the category of the obstacle (e.g., person, wall, medical bed 160, door, etc.), the height of the obstacle (e.g., 3 m, 1 m, 0.5 m, 0.2 m, 0.05 m), etc., or any combination thereof. In some embodiments, when it is determined that there is an obstacle in the direction of movement of the medical auxiliary device 110 toward the target position, the processing device 140 may determine whether the movement path needs to be updated based on the height and/or category of the obstacle: if so, the processing device 140 may plan an obstacle avoidance path; if not, the device continues along the original movement path. In some embodiments, the processing device 140 may update the movement path of the medical auxiliary device 110 to an obstacle avoidance path. For example, when the height of the obstacle is above a certain threshold (e.g., 0.05 m), the processing device 140 determines that the medical auxiliary device 110 cannot pass over the obstacle and that the movement path needs to be updated, and may determine an obstacle avoidance path. For another example, when the height of the obstacle is below the threshold (e.g., 0.05 m), the processing device 140 determines that the medical auxiliary device 110 can pass over the obstacle and may leave the movement path unchanged; in this case, the processing device 140 may increase the drive torque applied by the drive mechanism to the medical auxiliary device 110. For another example, when the obstacle is determined to be a person, the medical auxiliary device 110 may stop and wait, then move toward the target position along the original movement path after the person ahead has left. In some embodiments, when an obstacle is determined to be a person, the medical auxiliary device 110 may issue an alarm prompt to alert a healthcare worker that it has encountered the obstacle or is trapped at a certain location. Exemplary alarm prompts may be a combination of one or more of voice, flashing lights, vibration, etc.
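The following minimal sketch illustrates this decision procedure. The 0.05 m height threshold follows the example in the text; the category labels and action names are illustrative assumptions only.

HEIGHT_THRESHOLD_M = 0.05  # above this, device 110 cannot pass over the obstacle

def handle_obstacle(category, height_m):
    """Decide how the device reacts to a detected obstacle."""
    if category == "person":
        # Stop, wait for the person to leave, and issue an alarm prompt
        # (voice, flashing, vibration) so a healthcare worker is alerted.
        return "stop_wait_and_alert"
    if height_m <= HEIGHT_THRESHOLD_M:
        # Small enough to traverse: keep the original path and increase
        # the drive torque applied by the drive mechanism.
        return "continue_with_increased_torque"
    # Insurmountable non-person obstacle: update the movement path.
    return "plan_obstacle_avoidance_path"

print(handle_obstacle("person", 1.7))        # stop_wait_and_alert
print(handle_obstacle("cable_cover", 0.03))  # continue_with_increased_torque
print(handle_obstacle("medical_bed", 0.9))   # plan_obstacle_avoidance_path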
In some embodiments, the processing device 140 may plan an obstacle avoidance path for the medical auxiliary device 110 based on the surrounding environment information (e.g., obstacle information, etc.). For example, during movement along the original path toward the target position, when there is an insurmountable obstacle in the direction of movement, the processing device 140 may control the medical auxiliary device 110 to change its direction of movement; if another insurmountable obstacle is encountered on the way, the medical auxiliary device 110 may change direction again, until the target position is eventually reached. In some embodiments, the processing device 140 may obtain the distance between the medical auxiliary device 110 and the obstacle and plan a new obstacle avoidance path based on that distance. For example, when the obstacle is 1 m from the medical auxiliary device 110, the device may move 0.5 m toward the obstacle, bypass it along the circumference of a circle centered on the obstacle with a radius of 0.5 m, and then continue toward the target position. For another example, when the obstacle is directly ahead at a distance of 1 m, the medical auxiliary device 110 may move 0.5 m toward the obstacle, then 0.5 m to the left, proceed until the obstacle is bypassed, move 0.5 m back to the right, and then continue toward the target position. In some embodiments, when an obstacle (e.g., a person) appears directly ahead in the direction of motion, the processing device 140 may control the medical auxiliary device 110 to sound a turn alert, warning the person near the medical auxiliary device 110 so as to avoid a collision.
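A minimal sketch of the rectangular detour from the second example above is given below: approach a frontal obstacle 1 m away, side-step 0.5 m, pass it, and return to the original line toward the target. Coordinates are in a frame where the device starts at the origin facing +x; apart from the distances quoted in the text, the parameters (e.g., obstacle_depth) are illustrative assumptions.

def rectangular_detour(obstacle_x=1.0,     # obstacle 1 m ahead on the +x axis
                       approach=0.5,       # move 0.5 m toward the obstacle
                       side_step=0.5,      # step 0.5 m to the left
                       obstacle_depth=0.5):# assumed extent of the obstacle
    """Return waypoints (x, y) that bypass an obstacle on the +x axis."""
    return [
        (approach, 0.0),                           # advance toward the obstacle
        (approach, side_step),                     # side-step to the left
        (obstacle_x + obstacle_depth, side_step),  # pass alongside the obstacle
        (obstacle_x + obstacle_depth, 0.0),        # step back to the right
    ]  # then continue along +x toward the target position

print(rectangular_detour())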
It should be noted that the above description of the process 200 is for purposes of illustration only and is not intended to limit the scope of the present application. Those skilled in the art may make various modifications and changes to the process 200 in light of the teachings of the present application, and such modifications and changes remain within the scope of the present application. For example, step 240 may precede step 220 (steps 220a, 220b); that is, the processing device 140 may first obtain indication information indicating movement of the medical auxiliary device 110 and then determine the target position of the medical auxiliary device 110. For example, after a chest scan is completed, the healthcare worker may issue a voice command to the medical auxiliary device 110: "scan ended". Based on the protocol information, the processing device 140 determines that this voice information indicates that the current clinical operation has ended and that the medical auxiliary device 110 needs to perform the next clinical operation (e.g., a leg scan, assisting in a leg surgery, etc.) or go to standby. Since the voice information does not include the target position corresponding to the next clinical operation or to standby, the processing device 140 needs to acquire the current scene information and determine the target position based on it. Such variations are within the scope of the application.
Fig. 3 is an exemplary block diagram of a motion control system of a medical auxiliary device according to some embodiments of the present application. As shown in fig. 3, the motion control system 300 of the medical auxiliary device 110 may include a scene information acquisition module 310, a target position determination module 320, a motion path planning module 330, an indication information acquisition module 340, and a motion control module 350.
The scene information acquisition module 310 may be disposed at a fixed location in the preset space and/or on the medical auxiliary device 110, and may be used to acquire scene-related information in the preset space, where the scene-related information may include scene information and surrounding environment information. In some embodiments, the scene information acquisition module 310 may also be used to acquire protocol information of a medical task, which can reflect one or more clinical operation stations in the medical task.
The target position determination module 320 may be used to determine a target position of the medical auxiliary device 110 based on the scene information. In some embodiments, the target position determination module 320 may also be configured to preset one or more target sub-positions and/or target far-away positions based on the scene information. In some embodiments, the target position determination module 320 may be further configured to determine the target position corresponding to the current surgical operation type according to the current scene information and the preset positions. In some embodiments, the target position determination module 320 may also be configured to determine the target position of the medical auxiliary device 110 based on the protocol information and the scene information.
The motion path planning module 330 may be used to plan a motion path of the medical auxiliary device 110 based on the target position and the scene information. In some embodiments, the motion path planning module 330 may be further configured to preset one or more motion sub-paths according to the scene information and the preset one or more target sub-positions and/or target far-away positions.
The indication information acquisition module 340 may be configured to acquire indication information indicating movement of the medical auxiliary device 110. In some embodiments, the indication information acquisition module 340 may be further configured to determine whether the operation of the current surgical type is complete according to the limb operation indication information of the medical staff, and to determine the next target position according to the protocol information and the current scene information.
The motion control module 350 may be used to control the medical auxiliary device 110 to move along the corresponding motion path according to the indication information. In some embodiments, the motion control module 350 may also be configured to determine the motion path of the medical auxiliary device 110 based on its surrounding environment information, where the surrounding environment information includes obstacle information in the direction of movement of the medical auxiliary device 110 toward the target position.
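For illustration only, the following minimal sketch shows how the five modules of system 300 might be wired into a single control cycle. The class, field, and function names mirror the module names above but are otherwise assumptions, not the disclosed implementation.

from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class MotionControlSystem:
    acquire_scene: Callable[[], Any]        # scene information acquisition module 310
    determine_target: Callable[[Any], Any]  # target position determination module 320
    plan_path: Callable[[Any, Any], Any]    # motion path planning module 330
    acquire_indication: Callable[[], bool]  # indication information acquisition module 340
    move_along: Callable[[Any], None]       # motion control module 350

    def run_once(self):
        scene = self.acquire_scene()             # scene info of the preset space
        target = self.determine_target(scene)    # current clinical operation station
        path = self.plan_path(target, scene)     # may reuse preset motion sub-paths
        if self.acquire_indication():            # e.g., voice or gesture input
            self.move_along(path)                # drive device 110 along the path

# Wiring with trivial stand-ins to show the data flow:
system = MotionControlSystem(
    acquire_scene=lambda: {"free_positions": [(0, 0), (2, 1)]},
    determine_target=lambda scene: scene["free_positions"][-1],
    plan_path=lambda target, scene: [(0, 0), target],
    acquire_indication=lambda: True,
    move_along=lambda path: print("moving along", path),
)
system.run_once()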
In other embodiments of the present application, a motion control apparatus for a medical auxiliary device is provided, comprising at least one processing device 140 and at least one storage device 150; the at least one storage device 150 is configured to store computer instructions, and the at least one processing device 140 is configured to execute at least some of the computer instructions to implement the motion control method of the medical auxiliary device 110 described above.
In still further embodiments of the present application, a computer-readable storage medium for motion control of a medical auxiliary device 110 is provided, the storage medium storing computer instructions which, when executed by a processing device 140, implement the motion control method of the medical auxiliary device 110 described above.
It should be noted that the above description of the motion control system of the medical auxiliary device 110 and its apparatus/modules is for descriptive convenience only and is not intended to limit the present application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various devices/modules may be combined arbitrarily, or a subsystem may be constructed in connection with other devices/modules, without departing from those principles. For example, in some embodiments, the scene information acquisition module 310, the target position determination module 320, and the motion path planning module 330 disclosed in fig. 3 may be different modules in one apparatus (e.g., the processing device 140), or one module may implement the functions of two or more of the modules described above. For example, the target position determination module 320 and the motion path planning module 330 may be two modules, or may be one module having the functions of both receiving signals and processing information. For another example, each device may have its own memory module, or the devices may share one memory module. Such variations are within the scope of the application.
The possible beneficial effects of the embodiments of the present application include, but are not limited to: (1) the scene information of the preset space can be automatically acquired and a movement path to the target position can be planned, avoiding the limitations of manual operation and improving the safety and efficiency of clinical operation; (2) during movement of the medical auxiliary device to the target position, the obstacle avoidance path can be updated in real time according to the surrounding environment information, further improving the safety of clinical operation. It should be noted that different embodiments may produce different advantages; in any given embodiment, the advantages may be any one or a combination of the above, or any other advantage that may be obtained.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated herein, various modifications, improvements, and adaptations of the present application may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested within this disclosure and remain within the spirit and scope of its exemplary embodiments.
Meanwhile, the present application uses specific words to describe its embodiments. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the application. Thus, it should be emphasized that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the application may be combined as appropriate.
Furthermore, unless expressly recited in the claims, the order of processing elements and sequences, the use of alphanumeric labels, or the use of other designations in the present application is not intended to limit the order of the processes and methods herein. While the foregoing disclosure discusses, by way of example, certain embodiments currently considered useful, it should be understood that such details are for illustration only, and that the appended claims are not limited to the disclosed embodiments but are intended to cover all modifications and equivalent arrangements falling within the spirit and scope of the embodiments of the application. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that, in order to simplify the description of the present disclosure and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are recited in the claims. Indeed, claimed subject matter may lie in less than all of the features of a single embodiment disclosed above.
In some embodiments, numbers are used to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are, in some examples, modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and employ ordinary rounding to retain the appropriate number of digits. Although the numerical ranges and parameters set forth herein are approximations in some embodiments, in specific examples the numerical values are reported as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as an article, book, specification, publication, or document, cited herein is hereby incorporated by reference in its entirety, excepting any prosecution history inconsistent with or conflicting with this disclosure, and any document (now or later attached to this disclosure) that would limit the broadest scope of the claims of this disclosure. If there is any inconsistency or conflict between the description, definition, and/or use of a term in the materials accompanying this application and that set forth herein, the description, definition, and/or use of the term in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the application. Thus, by way of example, and not limitation, alternative configurations of embodiments of the application may be considered in keeping with the teachings of the application. Accordingly, the embodiments of the present application are not limited to the embodiments explicitly described and depicted herein.

Claims (14)

1. A motion control method for medical auxiliary equipment, the method comprising:
acquiring scene information of medical auxiliary equipment in a preset space;
determining a target position of the medical auxiliary equipment corresponding to a current clinical operation station at least according to the current scene information and a space threshold value determined based on the volume of the medical auxiliary equipment, which comprises:
screening out a plurality of positions whose spatial volume reaches the space threshold value based on the scene information and the space threshold value; and
screening out at least one target position corresponding to the current clinical operation station based on the plurality of positions, wherein the target position comprises at least one target sub-position and at least one target far-away position;
planning a motion path of the medical auxiliary equipment according to the target position and the scene information, wherein the motion path comprises a path between the target sub-position and the target far-away position;
acquiring indication information indicating the movement of the medical auxiliary equipment;
and controlling the medical auxiliary equipment to move along the corresponding movement path according to the indication information.
2. The method according to claim 1, wherein the scene information is acquired by an imaging device, which is arranged at a fixed position within the preset space or on the medical auxiliary equipment.
3. The method of claim 1, wherein the motion path comprises one or more motion sub-paths between the target sub-positions.
4. The method of claim 3, wherein the determining of a target position of the medical auxiliary equipment corresponding to a current clinical operation station according to the current scene information and a space threshold value determined based on the volume of the medical auxiliary equipment comprises: presetting one or more target sub-positions and target far-away positions according to the scene information.
5. The method of claim 4, wherein the planning of a motion path of the medical auxiliary equipment according to the target position and the scene information comprises:
before the medical task is acquired or executed, presetting one or more motion sub-paths according to the scene information and the preset one or more target sub-positions and/or target far-away positions.
6. The method of claim 1, wherein the indication information comprises input information from a healthcare worker; the input information includes at least one of voice information, text information, instruction selection information, image information, and limb operation instruction information.
7. The method according to claim 1, wherein the method further comprises:
acquiring protocol information of a medical task; the protocol information can reflect one or more clinical operation stations in the medical task;
determining a target position of medical auxiliary equipment corresponding to the current clinical operation station according to the protocol information and the current scene information;
and comparing the target position determined according to the protocol information with the target position determined according to the current scene information and the space threshold value, and selecting the optimal target position.
8. The method of claim 1 or 7, wherein the obtaining of indication information indicating the movement of the medical auxiliary equipment comprises:
and determining whether the operation of the current surgical type is finished according to the limb operation indication information of the medical staff.
9. The method of claim 1, wherein the controlling of the medical auxiliary equipment to move along a corresponding movement path according to the indication information comprises:
determining a motion path of the medical auxiliary equipment according to the surrounding environment information of the medical auxiliary equipment; wherein the surrounding environment information includes obstacle information in a movement direction of the medical auxiliary equipment toward the target position.
10. The method of claim 9, wherein the surrounding environment information is obtained by an identification device provided on the medical auxiliary equipment; the identification device comprises an infrared identification device, a laser identification device, or an image identification device.
11. The method of claim 1, wherein the medical auxiliary equipment comprises a C-arm device.
12. A motion control system for medical auxiliary equipment, the system comprising:
a scene information acquisition module, configured to acquire scene information of the medical auxiliary equipment in a preset space;
a target position determination module, configured to determine a target position of the medical auxiliary equipment corresponding to a current clinical operation station at least according to the current scene information and a space threshold value determined based on the volume of the medical auxiliary equipment, by:
screening out a plurality of positions whose spatial volume reaches the space threshold value based on the scene information and the space threshold value; and
screening out at least one target position corresponding to the current clinical operation station based on the plurality of positions, wherein the target position comprises at least one target sub-position and at least one target far-away position;
a motion path planning module, configured to plan a motion path of the medical auxiliary equipment according to the target position and the scene information, wherein the motion path comprises a path between the target sub-position and the target far-away position;
an indication information acquisition module, configured to acquire indication information indicating the movement of the medical auxiliary equipment;
and a motion control module, configured to control the medical auxiliary equipment to move along the corresponding motion path according to the indication information.
13. A motion control apparatus for medical auxiliary equipment, the apparatus comprising at least one processor and at least one memory;
the at least one memory is configured to store computer instructions;
the at least one processor is configured to execute at least some of the computer instructions to implement the operations of any one of claims 1 to 11.
14. A computer-readable storage medium storing computer instructions which, when executed by a processor, perform the operations of any one of claims 1 to 11.
CN202010789591.0A 2020-05-09 2020-08-07 Motion control method and system for medical auxiliary equipment Active CN114077243B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010789591.0A CN114077243B (en) 2020-08-07 2020-08-07 Motion control method and system for medical auxiliary equipment
PCT/CN2021/092905 WO2021228057A1 (en) 2020-05-09 2021-05-10 Motion control method and system for medical device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010789591.0A CN114077243B (en) 2020-08-07 2020-08-07 Motion control method and system for medical auxiliary equipment

Publications (2)

Publication Number Publication Date
CN114077243A (en) 2022-02-22
CN114077243B (en) 2023-12-05

Family

ID=80279652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010789591.0A Active CN114077243B (en) 2020-05-09 2020-08-07 Motion control method and system for medical auxiliary equipment

Country Status (1)

Country Link
CN (1) CN114077243B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023186066A1 (en) * 2022-03-31 2023-10-05 Shanghai United Imaging Healthcare Co., Ltd. Mobile medical devices

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4860209A (en) * 1983-11-24 1989-08-22 Kabushiki Kaisha Toyota Chuo Kenkyusho Running command system for unmanned vehicle
CN101518447A (en) * 2009-03-02 2009-09-02 北京积水潭医院 Precision improving method for spinal surgery computer navigation system
CN105117000A (en) * 2015-07-29 2015-12-02 青岛海信医疗设备股份有限公司 Method and device for processing medical three-dimensional image
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment
CN109363872A (en) * 2018-12-17 2019-02-22 上海联影医疗科技有限公司 Medical imaging system, scanning bed control method, device and storage medium
CN109389620A (en) * 2017-08-08 2019-02-26 西门子保健有限责任公司 For tracking the method and tracking system of medical object
CN109946715A (en) * 2019-04-09 2019-06-28 云鲸智能科技(东莞)有限公司 Detection method, device, mobile robot and storage medium
CN111053565A (en) * 2019-12-28 2020-04-24 上海联影医疗科技有限公司 Positioning method, positioning device, C-arm system and medium
CN111053564A (en) * 2019-12-26 2020-04-24 上海联影医疗科技有限公司 Medical equipment movement control method and medical equipment
CN111298439A (en) * 2020-01-21 2020-06-19 腾讯科技(深圳)有限公司 Data processing method, device, medium and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160072033A (en) * 2014-12-12 2016-06-22 한국항공우주연구원 Apparatus and method of controlling moving object, and computer readable recodring medium recording program for implementing the method
US11247074B2 (en) * 2018-04-26 2022-02-15 Rodney Wiersma Systems and methods for determining 6DoF correction trajectories in compensating movement of a treatment target in a patient

Also Published As

Publication number Publication date
CN114077243A (en) 2022-02-22

Similar Documents

Publication Publication Date Title
US11925423B2 (en) Guidance for positioning a patient and surgical robot
US9642584B2 (en) Method for controlling the movement of an x-ray apparatus and x-ray system
RU2640566C2 (en) Personal and automatic correction of x-ray system based on optical detection and interpretation of three-dimensional scene
US20170209110A1 (en) Apparatus and methods for a projection display device on x-ray imaging devices
CN117959116A (en) Medical imaging system, medical imaging device, and storage medium
CN110507337B (en) Medical equipment control system, medical equipment control method and device
CN113116365A (en) Image acquisition method, device and system and storage medium
WO2020028704A1 (en) Improved imaging systems and methods
JP6345471B2 (en) X-ray diagnostic imaging equipment
CN111870268A (en) Method and system for determining target position information of beam limiting device
CN114077243B (en) Motion control method and system for medical auxiliary equipment
JP2022508276A (en) Positioning of medical X-ray imaging equipment
US20230414185A1 (en) Imaging systems and methods
WO2021228057A1 (en) Motion control method and system for medical device
CN111815514A (en) Image acquisition method and device, readable storage medium and image acquisition equipment
WO2023183854A1 (en) Improved imaging systems and methods
CN112716509B (en) Motion control method and system for medical equipment
CN113081013B (en) Spacer scanning method, device and system
EP3970619A1 (en) Method to improve a radiography acquisition workflow
CN114374784A (en) Intelligent medical live broadcast control method, system and storage medium
CN114067994A (en) Target part orientation marking method and system
JP6916925B1 (en) Shooting range setting device, medical device, and program
US20240090864A1 (en) Radiographic imaging support system, radiographic imaging support method, and recording medium
WO2023141800A1 (en) Mobile x-ray positioning system
CN118141394A (en) Image scanning method and system for composite operating room

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant