CN114077243A - Motion control method and system for medical auxiliary equipment - Google Patents

Info

Publication number
CN114077243A
CN114077243A (application CN202010789591.0A; granted as CN114077243B)
Authority
CN
China
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010789591.0A
Other languages
Chinese (zh)
Other versions
CN114077243B (en)
Inventor
冯娟
王振玮
陆学飞
杨乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Application filed by Shanghai United Imaging Healthcare Co Ltd
Priority to CN202010789591.0A
Priority to PCT/CN2021/092905 (published as WO2021228057A1)
Publication of CN114077243A
Application granted
Publication of CN114077243B
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/0246: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision

Abstract

The embodiments of the present application disclose a motion control method and system for a medical auxiliary device. The motion control method includes: acquiring scene information of the medical auxiliary device within a preset space; determining a target position of the medical auxiliary device at least according to the scene information; planning a motion path of the medical auxiliary device according to the target position and the scene information; acquiring indication information that indicates movement of the medical auxiliary device; and controlling the medical auxiliary device to move along the corresponding motion path according to the indication information. The method automatically acquires scene information in the preset space to determine the target position of the medical auxiliary device, plans a motion path from the determined target position, and controls the device to move along that path, avoiding the limitations of manual operation and improving the safety and efficiency of clinical operations.

Description

Motion control method and system for medical auxiliary equipment
Technical Field
The present application relates to the field of medical auxiliary devices, and in particular, to a method and a system for controlling a motion of a medical auxiliary device.
Background
In practical applications of medical auxiliary equipment, the equipment often needs to be moved to different stations to adapt to different clinical operations. For example, when a mobile C-arm device is used for operations or examinations such as orthopedic surgery, urological surgery, or cardiovascular angiography, the device must be moved to different positions. Adjusting the medical auxiliary equipment to the corresponding position accurately and safely greatly improves the working efficiency of medical staff and the safety of clinical operations.
Therefore, there is a need for a method and system for controlling the motion of a medical auxiliary device that reduce the workload of medical staff, improve operating efficiency, and improve the safety of clinical operations.
Disclosure of Invention
One embodiment of the present application provides a motion control method for a medical auxiliary device, the method including: acquiring scene information of the medical auxiliary device within a preset space; determining a target position of the medical auxiliary device at least according to the scene information; planning a motion path of the medical auxiliary device according to the target position and the scene information; acquiring indication information that indicates movement of the medical auxiliary device; and controlling the medical auxiliary device to move along the corresponding motion path according to the indication information.
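The five claimed steps can be sketched as a single control routine. This is an illustrative sketch only: the class and helper names (`Position`, `determine_target_position`, `plan_motion_path`, `control_motion`) are hypothetical, and the target-selection rule and path planner are trivial placeholders, not the patented method.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass(frozen=True)
class Position:
    x: float
    y: float

def determine_target_position(scene: Dict[str, Position]) -> Position:
    # Hypothetical rule: park one meter beside the hospital bed.
    bed = scene["bed"]
    return Position(bed.x + 1.0, bed.y)

def plan_motion_path(start: Position, target: Position) -> List[Position]:
    # Placeholder planner: a direct start -> target segment.
    return [start, target]

def control_motion(scene: Dict[str, Position], start: Position,
                   indication: str) -> List[Position]:
    """The five claimed steps: scene info -> target position -> motion
    path -> indication info -> motion along the path."""
    target = determine_target_position(scene)   # step 2: target position
    path = plan_motion_path(start, target)      # step 3: motion path
    if indication == "move":                    # step 4: indication info
        return path                             # step 5: execute the motion
    return [start]                              # no indication: stay put

scene = {"bed": Position(0.0, 0.0)}
route = control_motion(scene, Position(5.0, 5.0), "move")
```

In a real system each helper would be driven by camera data and protocol information, as the later embodiments describe; here they only mark where those steps plug in.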
One of the embodiments of the present application provides a motion control system of a medical auxiliary device, the system including: the scene information acquisition module is used for acquiring scene information of the medical auxiliary equipment in a preset space; a target position determination module for determining a target position of the medical auxiliary device at least according to the scene information; the movement path planning module is used for planning a movement path of the medical auxiliary equipment according to the target position and the scene information; the indication information acquisition module is used for acquiring indication information indicating the movement of the medical auxiliary equipment; and the motion control module is used for controlling the medical auxiliary equipment to move along the corresponding motion path according to the indication information.
One of the embodiments of the present application further provides a motion control apparatus for a medical auxiliary device, which includes at least one processing device and at least one storage device; the at least one storage device is configured to store computer instructions, and the at least one processing device is configured to execute at least some of the computer instructions to implement the operations described above.
One of the embodiments of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, implement the operations as described above.
Drawings
The present application is further explained by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting, and in these embodiments like numerals indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a motion control system of a medical assistance device according to some embodiments of the present application;
FIG. 2 is an exemplary flow chart of a method of motion control of a medical assistance device according to some embodiments of the present application;
FIG. 3 is an exemplary block diagram of a motion control system of a medical assistance device according to some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are only examples or embodiments of the application, based on which a person of ordinary skill in the art can apply the application to other similar scenarios without inventive effort. Unless it is apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system," "device," and/or "module" as used herein are terms for distinguishing different components, elements, parts, or assemblies at different levels. These terms may be replaced by other expressions that accomplish the same purpose.
As used in this application and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
Flow charts are used in this application to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed exactly in the order shown; the steps may instead be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic view of an application scenario of a motion control system of a medical assistance device according to some embodiments of the present application. The medical assistance device motion control system 100 may include a medical assistance device 110, a network 120, at least one terminal 130, a processing device 140, a storage device 150, and a hospital bed 160. The various components of the system 100 may be interconnected by a network 120. For example, the medical assistance device 110 and the at least one terminal 130 may be connected or in communication via the network 120.
In some embodiments, the medical assistance device 110 may include an imaging device, an analysis device, other medical devices (e.g., surgical assistance devices), and the like. For example, the medical auxiliary device 110 may include a digital radiography (DR) device, a computed radiography (CR) device, a digital fluoroscopy (DF) device, an immunoassay analyzer, a CT scanner, a magnetic resonance scanner, an electrocardiograph, a B-mode ultrasound scanner, an endoscope and its washer-disinfector, a shadowless lamp, a mobile C-arm, a medical pendant, and the like. In some embodiments, the medical auxiliary device 110 may be a C-arm device (or C-arm) that includes a gantry, a detector, a detection region, and a radiation generating device (e.g., a radiation source). The gantry supports the detector and the radiation generating device. A medical subject may be placed on the hospital bed 160 for scanning; the medical subject may be a patient, a phantom, or another scanned object. The couch top of the hospital bed 160 may be parallel to the ground. The radiation generating device may emit X-rays toward the medical subject, and by scanning the medical subject the medical auxiliary device 110 can acquire scan data to generate (or reconstruct) an image.
The network 120 may include any suitable network capable of facilitating the exchange of information and/or data for the medical assistance device motion control system 100. In some embodiments, at least one component of the medical assistance device motion control system 100 (e.g., the medical assistance device 110, the processing device 140, the storage device 150, the at least one terminal 130) may exchange information and/or data with at least one other component of the system via the network 120. For example, the processing device 140 may obtain images of the surroundings (e.g., surrounding environment information, scene information of the preset space, etc.) from the medical assistance device 110 and/or a camera device at a fixed position in the preset space through the network 120. As another example, the processing device 140 may obtain indication information of a medical professional (e.g., a doctor) from the at least one terminal 130 via the network 120. The network 120 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN)), a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof. For example, the network 120 may include a wired network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 120 may include at least one network access point.
For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which at least one component of the medical assistance device motion control system 100 may connect to the network 120 to exchange data and/or information.
The at least one terminal 130 may communicate with and/or connect to the medical assistance device 110, the processing device 140, and/or the storage device 150. For example, a healthcare worker may enter indication information (e.g., image information, text information, etc.) through the at least one terminal 130 to control the movement of the medical assistance device 110. As another example, a healthcare worker may enter information related to a medical plan (e.g., protocol information for a medical task) via the at least one terminal 130, and the processing device 140 may store it in the storage device 150. As another example, the target position and the motion path determined by the processing device 140 may be displayed on the terminal 130. In some embodiments, the at least one terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof. For example, the mobile device 131 may include a mobile joystick, a personal digital assistant (PDA), a smartphone, or the like, or any combination thereof.
In some embodiments, the at least one terminal 130 may include an input device, an output device, and the like. The input device may use keyboard input, touch-screen input (e.g., with tactile or haptic feedback), voice input, eye-tracking input, gesture-tracking input, brain-monitoring-system input, image input, video input, or any other similar input mechanism. Input information received via the input device may be transmitted, for example via a bus, to the processing device 140 for further processing. Other types of input devices may include cursor control devices, such as a mouse, a trackball, or cursor direction keys. In some embodiments, a healthcare worker may enter information related to a medical plan via the input device. In some embodiments, medical staff may control the medical assistance device 110 by entering indication information (e.g., image information, text information, instruction selection information, etc.) through the input device; for example, medical staff may control the medical assistance device 110 to scan the chest by inputting a picture of the medical subject's chest through the input device. Output devices may include a display, a speaker, a printer, or the like, or any combination thereof. The output device may be used to output the target position, the motion path, etc. determined by the processing device 140. In some embodiments, the at least one terminal 130 may be part of the processing device 140.
In other embodiments, the motion control system 100 may operate without the terminal 130. For example, after the processing device 140 obtains the scene information of the preset space, it determines the target position according to the scene information and then determines the motion path; once the motion path is determined, the processing device 140 directly instructs the medical auxiliary device 110 to move, without control by medical staff through the terminal 130. As another example, medical staff may instruct the medical auxiliary device 110 to move through body-gesture indication information, which may be captured by a camera device disposed at a fixed position in the preset space and/or on the medical auxiliary device 110; in this way, the terminal 130 is likewise not needed to instruct the medical auxiliary device to move.
The processing device 140 may process data and/or information obtained from the medical assistance device 110, the storage device 150, the at least one terminal 130, or other components of the medical assistance device motion control system 100. For example, the processing device 140 may retrieve information related to the target location of the medical assistance device 110 from the storage device 150. In some embodiments, the processing device 140 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data from the medical assistance device 110, the storage device 150, and/or the at least one terminal 130 via the network 120. As another example, the processing device 140 may be directly connected to the medical assistance device 110, the at least one terminal 130, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
The storage device 150 may store data, instructions, and/or any other information, for example, scene information of the preset space, protocol information of medical tasks, the target position of the medical assistance apparatus 110, motion paths, and the like. In some embodiments, the storage device 150 may store data obtained from the medical assistance device 110, the at least one terminal 130, and/or the processing device 140. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 executes or uses to perform the exemplary methods described in this application. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. In some embodiments, the storage device 150 may be implemented on a cloud platform.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with at least one other component (e.g., the processing device 140, the at least one terminal 130) in the medical assistance device motion control system 100. At least one component of the medical assistance device motion control system 100 may access data (e.g., target location, motion path, etc. determined by the processing device 140) or instructions (e.g., various types of indication information, etc.) stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be part of the processing device 140.
It should be noted that the foregoing description is provided for illustrative purposes only and is not intended to limit the scope of the present application. Many variations and modifications will occur to those skilled in the art in light of the teachings herein. The features, structures, methods, and other characteristics of the example embodiments described herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the storage device 150 may be a data storage device on a cloud computing platform, such as a public cloud, a private cloud, a community cloud, or a hybrid cloud. However, such changes and modifications do not depart from the scope of the present application.
Currently, the position of the medical assistance device 110 is mainly adjusted manually, i.e., it requires manual manipulation by medical staff; current mobile C-arm devices are an example. Specifically, when a mobile C-arm device is used for an operation or examination such as orthopedic surgery, urological surgery, or cardiovascular angiography, the medical staff usually moves the device manually to the corresponding station for the clinical operation (e.g., surgery, scanning, etc.). Because the space of an operating or examination room is limited, many devices are used during the operation or examination, and the field of vision of the medical staff is limited, the staff cannot observe the environment comprehensively and accurately. In addition, medical staff differ in their proficiency with different medical auxiliary devices 110, and some devices (e.g., a mobile C-arm) are large, heavy, and not very maneuverable. In such cases, the medical auxiliary device 110 may collide with other devices or with medical subjects (e.g., medical staff, patients, etc.), or the station where it stops may fail to meet the examination or surgical requirements.
In some embodiments, the motion control method for the medical auxiliary device 110 provided by this application may employ related technologies such as automatic scene-information acquisition, intelligent motion-path planning, and intelligent obstacle avoidance, so that the medical auxiliary device 110 is better suited to the clinical environments described above, avoiding the safety problems and inefficiency caused by the limitations of manual operation.
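One plausible realization of the intelligent path planning with obstacle avoidance mentioned above is a shortest-path search over an occupancy grid built from the scene information. The patent does not name a particular algorithm; the breadth-first search below is only an illustrative stand-in, with the grid values invented for the example.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free, 1 = obstacle).
    Returns a shortest list of (row, col) cells from start to goal,
    or [] if the goal cannot be reached."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # doubles as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the route by backtracking
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return []

# A toy examination room: the 1-cells stand for an obstacle (e.g. a bed).
room = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
route = plan_path(room, (0, 0), (2, 3))
```

A production planner would work in three dimensions, respect the device's footprint and turning constraints, and replan as the cameras report moving obstacles; the grid search only shows the obstacle-avoidance idea.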
Fig. 2 is an exemplary flow chart of a method for motion control of a medical assistance device according to some embodiments of the present application. In particular, the motion control method 200 of the medical assistance device 110 may be performed by the processing device 140. For example, the motion control method 200 of the medical assistance apparatus 110 may be stored in a storage device (e.g., the storage apparatus 150) in the form of a program or instructions that, when executed by the motion control system 100 (e.g., the processing apparatus 140) of the medical assistance apparatus 110, may implement the motion control method 200 of the medical assistance apparatus 110.
Step 210: acquire scene information of the medical auxiliary device within a preset space.
In step 210, the processing device 140 may acquire scene information of the medical assistance device 110 within a preset space. Specifically, step 210 may be performed by the scene information acquisition module 310 in the processing device 140.
The preset space is the area within which the medical auxiliary device 110 is able to move. In some embodiments, the preset space may be an area where a medical subject undergoes surgery and/or examination, e.g., an operating room or a radiology examination room.
The scene information may be understood as environment information within the preset space. For example, the scene information may include information about medical subjects (e.g., a patient, an examinee, medical staff, etc.) and non-medical subjects (e.g., the medical assistance apparatus 110, obstacles, etc.) located in the preset space. Further description of obstacle information may be found in other embodiments of the present application and is not repeated here. In some embodiments, the scene information may include position information of one or more subjects in the preset space and the time information corresponding to that position information; for example, the scene information may include the specific position of a medical subject in the preset space at a certain time. In some embodiments, the position information of one or more subjects may be represented by coordinates. Specifically, a spatial rectangular coordinate system may be established with a subject that is fixed relative to the preset space (e.g., the hospital bed 160) as the origin, and the position of each subject in the preset space may then be represented by its distances from the origin along the X, Y, and Z axes.
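The coordinate representation described above, with the hospital bed taken as the origin of the spatial rectangular coordinate system, can be sketched as follows. The numeric positions are invented for illustration, and the sensor frame they come from is an assumption.

```python
def to_bed_coordinates(subject_xyz, bed_xyz):
    """Express a subject's position as (X, Y, Z) offsets from the
    hospital-bed origin of the room's rectangular coordinate system."""
    return tuple(s - b for s, b in zip(subject_xyz, bed_xyz))

bed = (2.0, 3.0, 0.0)        # bed location in the raw sensor frame (assumed)
c_arm = (5.0, 3.0, 0.0)      # C-arm location in the same frame (assumed)
offset = to_bed_coordinates(c_arm, bed)
```

Here `offset` places the C-arm three units along the X axis from the bed; every subject in the preset space would be stored in this bed-relative form together with its timestamp.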
In some embodiments, the scene information may be obtained by a camera device disposed at a fixed position in the preset space or on the medical auxiliary device 110. The camera device may acquire image information, for example video images and/or still pictures, of all subjects (both medical and non-medical) located within the preset space, and transmit the image information together with its corresponding time information to the processing device 140. In some embodiments, the processing device 140 may create a stereoscopic image (e.g., a three-dimensional map) of the preset space by analyzing the image information and obtain the scene information corresponding to each subject. Specifically, the stereoscopic image of the preset space comprises a plurality of pixel points, each subject in the preset space is composed of a number of corresponding pixel points, and the processing device 140 assigns a value to each pixel point according to the result of analyzing the image information, thereby obtaining the scene information corresponding to each subject.
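The per-pixel assignment described above can be sketched as a labeled three-dimensional occupancy map: each voxel of the preset space receives the label of the subject occupying it. The shape, labels, and voxel sets below are invented for illustration.

```python
def build_scene_map(shape, subjects):
    """Assign each voxel of the preset space a subject label (0 = empty).
    `subjects` maps a label to the set of (x, y, z) voxel indices that
    the corresponding body occupies."""
    nx, ny, nz = shape
    scene = [[[0] * nz for _ in range(ny)] for _ in range(nx)]
    for label, voxels in subjects.items():
        for x, y, z in voxels:
            scene[x][y][z] = label
    return scene

# Toy 4 x 4 x 2 room: label 1 marks the bed, label 2 marks the C-arm.
scene = build_scene_map((4, 4, 2), {
    1: {(0, 0, 0), (0, 1, 0)},
    2: {(3, 3, 1)},
})
```

A real system would derive the voxel sets from stereo reconstruction or depth sensing rather than hand-written coordinates; the map structure is what the later path planning consumes.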
In some embodiments, there may be a plurality of camera devices, arranged according to a certain rule; for example, the spacing between adjacent camera devices may be uniform, or the camera devices may be arranged in a line array or an area array. Camera devices may be disposed both at fixed positions in the preset space and on the medical auxiliary apparatus 110, performing different functions: those at fixed positions acquire scene information far from the medical auxiliary apparatus 110, while those on the medical auxiliary apparatus 110 acquire scene information near it (e.g., surrounding environment information). In some embodiments, the camera may capture images periodically or in real time to acquire scene information; for example, it may capture images at fixed intervals of 100 milliseconds, 500 milliseconds, 1 second, 2 seconds, and so on. In some embodiments, the processing device 140 may adjust parameters of the camera device based on the acquired environment information to obtain more comprehensive and accurate scene information; exemplary parameters include the position and orientation of the lens, the viewing angle, the aperture, the focal length, the sensitivity, the white balance, and the like.
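Periodic acquisition at a fixed interval, with every frame paired with its timestamp as the scene information requires, might look like the following sketch. The capture function is a stand-in for a real camera driver, and the interval is arbitrary.

```python
import time

def capture_periodically(capture_fn, interval_s, n_frames):
    """Poll a (hypothetical) camera every `interval_s` seconds and return
    (timestamp, image) pairs, so position info stays paired with time."""
    frames = []
    for _ in range(n_frames):
        frames.append((time.monotonic(), capture_fn()))
        time.sleep(interval_s)
    return frames

# Stand-in capture function; a real driver would return image data.
frames = capture_periodically(lambda: "image", 0.01, 3)
```

A monotonic clock is used so that the recorded intervals are unaffected by wall-clock adjustments; in practice the loop would run on its own thread or timer rather than blocking.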
In some embodiments, besides acquiring scene information through the camera device, the processing device 140 may combine other technologies, such as infrared recognition, laser recognition, and ultrasonic recognition, to acquire more accurate and comprehensive scene information.
Step 220: determine the target position of the medical auxiliary device at least according to the scene information. In some embodiments, step 220 is performed by the target position determination module 320.
In step 220, the processing device 140 may determine a target position of the medical assistance device 110 based at least on the scene information. In particular, step 220 may be performed by the target position determination module 320 in the processing device 140. In some embodiments, the processing device 140 may determine the target position of the medical assistance device 110 according to the scene information, as described in detail in step 220a. In other embodiments, the processing device 140 may further determine the target position of the medical assistance device 110 according to the protocol information of the medical task together with the scene information, as described in detail in step 220b.
Step 220a: determine the target position of the medical auxiliary device 110 according to the scene information.
In step 220a, the processing device 140 may determine a target position of the medical assistance device 110 from the scene information. In particular, step 220a may be performed by the target position determination module 320 in the processing device 140.
The target position may be understood as the end position of a particular movement of the medical assistance apparatus 110 when performing a medical task. For example, when a chest scan needs to be performed, the medical assistance apparatus 110 moves to a designated position to start scanning; that position is the target position. As another example, when the medical task is completed, the medical assistance apparatus 110 moves to a designated position to stand by (e.g., stop working and wait for the next clinical operation); that designated position is the target position.
In some embodiments, the target position comprises one or more target sub-positions and/or target remote positions. A target sub-position refers to the specific position to which the medical assistance device 110 must be moved or adjusted before a certain clinical operation (e.g., assisting a suture operation, performing a chest scan, etc.) can be performed; it may thus be understood as the position of the medical assistance device 110 while that clinical operation is performed. It is understood that a medical task may include one or more clinical operations, which may be performed by different medical assistance devices 110 or by the same one; for example, a C-arm may perform an X-ray fluoroscopy operation on a medical subject and then an auxiliary operation (e.g., assisting a suture operation). In some embodiments, the position of the medical assistance device 110 varies across clinical operations, so the target position may include a plurality of target sub-positions. For example, when a brain scan is performed, the medical subject lies on the medical bed 160 and the medical assistance device 110 needs to be located near the head of the medical subject; that is one target sub-position. As another example, when a leg suture operation is performed, the medical assistance device 110 (e.g., a shadowless lamp) needs to be located near the leg of the medical subject; that is another target sub-position.
The target remote position may be understood as an idle position, remote from the medical subject or the medical bed 160, where the medical assistance device 110 stands by after a clinical operation or medical task. For example, after one scan is completed and the medical assistance device 110 is no longer needed, it moves to a free area (e.g., a corner) to stand by; that area is a target remote position. In some embodiments, there may also be a plurality of target remote positions, e.g., a corner, a doorway, and the like.
In some embodiments, the processing device 140 may preset the target position according to the scene information. "Preset" here means a target position determined before a medical task is performed or before information related to the medical task (e.g., protocol information) is acquired. In some embodiments, determining the target position from the scene information includes: the processing device 140 presets one or more target sub-positions and/or target remote positions according to the scene information. These preset positions may be suitable for most or the more common clinical operations. For example, the acquired scene information may include information related to the medical bed 160 (e.g., its specific position, length, width, height, etc.) or information related to a human body or human body model, and one or more target sub-positions are set directly according to that information. Exemplary target sub-positions may include a first workstation for head scanning at the first section of the medical bed 160 (the end where the head lies when the medical subject lies flat on the bed); a second workstation for chest scanning at the middle section of the medical bed 160 (the position of the chest when the medical subject lies flat on the bed); and a third workstation for assisting a leg suture operation at the tail section of the medical bed 160 (the end where the feet lie when the medical subject lies flat on the bed).
When the medical assistance device 110 moves to a preset corresponding workstation, the corresponding clinical operation can start (e.g., after moving to the second workstation, a chest scan can be performed). As another example, after the scene information is acquired, a free area in the preset space may directly serve as the target remote position. Preset target sub-positions suit most clinical operations and are therefore highly general: when the clinical operation to be performed is a common one (e.g., assisting a leg suture operation, performing a chest scan, etc.), a preset target position reduces the steps needed to determine the target position and improves the execution efficiency of the medical task. As yet another example, when the scene information includes human body or human body model information, the corresponding body part can be identified from that information and the workstation determined directly from the identified part.
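The presetting of workstations from medical-bed information described above can be sketched as follows. This is a minimal one-dimensional illustration; the station names and the fractional offsets along the bed are assumptions for illustration, not values from the patent.

```python
# Sketch: preset target sub-positions (workstations) from the medical bed
# information contained in the scene information. All offsets are assumed.

def preset_stations(bed_origin, bed_length):
    """Return preset workstation coordinates along the bed's long axis.

    bed_origin: coordinate of the head end of the medical bed.
    bed_length: length of the medical bed in meters.
    """
    # Fractions of bed length: head (first section), chest (middle
    # section), and legs (tail section), mirroring the three example
    # workstations in the text.
    offsets = {
        "first_station_head_scan": 0.0,
        "second_station_chest_scan": 0.5,
        "third_station_leg_suture": 1.0,
    }
    return {name: bed_origin + f * bed_length for name, f in offsets.items()}

stations = preset_stations(bed_origin=0.0, bed_length=2.0)
```

Once computed, such a table lets the device start a clinical operation immediately after moving to the named workstation, without re-deriving the position each time.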
In some embodiments, the target remote position may also be preset, e.g., fixed in a certain spatial region (e.g., a corner) of the preset space. The medical assistance device 110 can then automatically return to the target remote position every time a medical task, or a clinical operation within it, is completed; the target position need not be determined repeatedly, which effectively improves the efficiency of clinical operations.
In some embodiments, determining the target position of the medical assistance device 110 according to the scene information may further include: determining the target position corresponding to the workstation of the current operation type according to the current scene information and a preset spatial threshold. A preset target position may suit most clinical operations, but in some cases it may be occupied by other devices or be unsuitable for the current clinical operation. The processing device 140 may therefore also incorporate a preset spatial threshold when determining the target position, further improving the efficiency of the clinical operation. The preset spatial threshold refers to a preset volume of space capable of accommodating the medical assistance device 110, e.g., 1 cubic meter, 2 cubic meters, 3 cubic meters, and the like. In some embodiments, the spatial threshold may be determined from the volume of the medical assistance device 110 and should be at least equal to that volume, so that any position in the preset space meeting the threshold is guaranteed to accommodate the device.
In some embodiments, the processing device 140 may determine the target position corresponding to the workstation of the current surgical operation type according to the current scene information and a preset spatial threshold. Specifically, after obtaining the current scene information and the preset spatial threshold (e.g., 2 cubic meters), the processing device 140 may screen out the positions whose free spatial volume reaches the threshold (e.g., a first position, a second position, a third position, etc.), and then select from them the position or positions corresponding to the workstation of the current operation type (e.g., for a leg suture operation, the third position, or a second position close to it) as the target position.
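The two-stage screening just described — first by spatial threshold, then by proximity to the current operation's workstation — can be sketched as below. The candidate list, coordinates, and volumes are illustrative assumptions.

```python
# Sketch: screen candidate positions whose free volume reaches the preset
# spatial threshold, then pick the qualifying position closest to the
# current clinical operation workstation. 1-D coordinates for simplicity.

def pick_target(candidates, threshold, station):
    """candidates: list of (position, free_volume_m3) pairs.
    threshold: preset spatial threshold in cubic meters.
    station: coordinate of the current operation's workstation.
    """
    # Stage 1: keep only positions whose free volume reaches the threshold.
    fits = [(pos, vol) for pos, vol in candidates if vol >= threshold]
    if not fits:
        return None  # no position can accommodate the device
    # Stage 2: of those, choose the one nearest the workstation.
    return min(fits, key=lambda pv: abs(pv[0] - station))[0]

# Three candidate positions; only the last two reach the 2 m^3 threshold,
# and the one at 4.0 is closest to the (assumed) leg-suture workstation.
target = pick_target([(1.0, 1.5), (2.0, 2.5), (4.0, 3.0)],
                     threshold=2.0, station=3.5)
```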
Step 220b, determining the target position of the medical assistance device 110 according to the protocol information and the scene information. In some embodiments, step 220b may be performed by the target position determination module 320.
In step 220b, the processing device 140 may determine the target position of the medical assistance device 110 according to the protocol information and the scene information. Specifically, step 220b may be performed by the target position determination module 320 in the processing device 140.
In some embodiments, the processing device 140 may obtain the protocol information of the medical task and determine the target position based on the protocol information and the scene information. In some embodiments, the protocol information reflects one or more clinical operation workstations in the medical task. For example, for a medical task requiring a chest scan, a leg scan, and a leg suture operation, the protocol information may include "the medical assistance device 110 performs a chest scan at the chest position of the medical subject" and "the medical assistance device 110 performs a leg scan at the leg position of the medical subject and assists in performing a leg suture operation". This protocol information reflects several clinical operation workstations, such as the "chest position" and the "leg position". In some embodiments, the processing device 140 may obtain the target position corresponding to each clinical operation workstation from the workstations parsed out of the protocol information together with the scene information; once the medical assistance device 110 has moved to the target position, the corresponding clinical operation may be performed. For example, parsing the "chest position" workstation, the processing device 140 determines that the medical assistance device 110 needs to move to the middle section of the medical bed 160, which therefore becomes the target position; alternatively, the corresponding part of the human body may serve as the target position of an imaging assembly, with the medical assistance device 110 moved to that position. As another example, parsing the "leg position", the medical assistance device 110 needs to move to the tail section of the medical bed 160, which therefore becomes the target position.
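The resolution of protocol workstations to concrete positions via scene information can be sketched as a simple lookup. The workstation keys, bed-section names, and coordinates are illustrative assumptions.

```python
# Sketch: map clinical operation workstations parsed from the protocol
# information to target positions using the bed layout from the scene
# information. All names and coordinates are assumed for illustration.

# Which section of the medical bed each protocol workstation refers to.
BED_SECTIONS = {
    "head position": "first_section",
    "chest position": "mid_section",
    "leg position": "tail_section",
}

def resolve_targets(protocol_stations, scene):
    """scene maps bed-section names to coordinates from the camera."""
    return [scene[BED_SECTIONS[s]]
            for s in protocol_stations if s in BED_SECTIONS]

scene = {"first_section": 0.0, "mid_section": 1.0, "tail_section": 2.0}
targets = resolve_targets(["chest position", "leg position"], scene)
```

Moving the device through `targets` in order then realizes the sequence of clinical operations the protocol describes.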
In some embodiments, the protocol information may be retrieved by the processing device 140 directly from the medical tasks stored in the storage device 150, or entered manually by medical staff via the terminal 130.
In some embodiments, the processing device 140 determines the target position based on the acquired protocol information. Since each medical task includes different clinical operations, the corresponding protocol information may also differ. For example, a first medical task and a second medical task may both include a leg scan and a suture operation, but the leg scan of the first medical task targets the sole of the foot while that of the second targets the lower leg; the protocol information of the first medical task may then be the "sole position" and that of the second the "lower leg position". Accordingly, when the processing device 140 parses the protocol information of the two tasks, the determined target positions differ.
It should be noted that the above description of target-position determination in steps 220a and 220b is only a specific example and should not be considered the only feasible embodiment. For those skilled in the art, after understanding the basic principle of determining the target position, various modifications and changes in form and detail may be made without departing from this principle, and such modifications and changes remain within the scope of the above description. For example, the processing device 140 may determine the target position through step 220a, through step 220b, or through both steps separately, selecting the better target position by analyzing and comparing the results of the two. Such variations are within the scope of the present application.
Step 230, planning a motion path of the medical auxiliary device 110 according to the target position and the scene information. In some embodiments, step 230 may be performed by the motion path planning module 330.
In step 230, the processing device 140 may plan a motion path of the medical assistance device 110 according to the target position and the scene information. In particular, step 230 may be performed by the motion path planning module 330 in the processing device 140.
The motion path of the medical assistance device 110 may indicate the route or trajectory of the medical assistance device 110 from its starting position to the target position. The processing device 140 may take the path from the device's starting position toward the target position as its motion path. In some embodiments, the starting position of one motion path may be the target position of the previous motion path; that is, the target position of a first movement serves as the starting position of a second movement. For example, the processing device 140 determines a first workstation for brain scanning and a second workstation for leg scanning, and according to the medical task the medical assistance device 110 performs the brain scan before the leg scan; after the brain scan is completed, the device must move from the first workstation to the second, so the first workstation is the starting position of that motion path.
In some embodiments, the motion path comprises one or more motion sub-paths between target sub-positions and/or between a target sub-position and a target remote position. A motion sub-path between target sub-positions may refer to the route or trajectory along which the medical assistance device 110 moves between clinical operation workstations, e.g., from the first workstation (the first section of the medical bed 160) to the second workstation (the middle section of the medical bed 160). A motion sub-path between a target sub-position and a target remote position may be the route or trajectory along which the device moves from a clinical operation workstation to a standby position after completing a medical task or clinical operation. For example, after the medical assistance device 110 completes the brain scan at the first workstation, it is no longer needed and moves to a free area (e.g., a corner), i.e., a target remote position, to stand by; the route or trajectory from the first workstation to that corner is also a motion sub-path.
In some embodiments, the number of motion sub-paths is related to the number of target positions (target sub-positions and/or target remote positions). For example, when there is only one target position (e.g., only one target sub-position or only one target remote position), there is typically only one motion sub-path; when there are multiple target positions (e.g., two or more target sub-positions), there are typically also multiple motion sub-paths.
In some embodiments, the processing device 140 may preset the one or more motion sub-paths according to the scene information and the preset one or more target sub-positions and/or target remote positions. In this embodiment, the processing device 140 may determine a motion sub-path according to one or more preset target positions (e.g., the first, second, and third workstations): before a medical task is acquired or performed, it may directly set one or more motion sub-paths by taking the preset target positions as the end positions of particular movements of the medical assistance device 110. The target positions (including target sub-positions and target remote positions) and motion paths (including motion sub-paths) preset by the processing device 140 generally suit most clinical operations and medical tasks; they are highly general, save the step of planning the motion path, and improve the efficiency of clinical operations.
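The chaining rule for sub-paths — the target position of one movement becomes the starting position of the next — can be sketched as follows. The position labels are illustrative assumptions.

```python
# Sketch: build the sequence of motion sub-paths for a medical task.
# Each sub-path is a (start, end) pair; the end of one movement is the
# start of the next, as described in the text.

def build_sub_paths(start, targets):
    """Return (start, end) pairs chaining the device through its targets."""
    paths, current = [], start
    for t in targets:
        paths.append((current, t))
        current = t  # this target is the starting position of the next movement
    return paths

# Device leaves its standby corner, visits two workstations, then returns
# to a target remote position (the corner).
sub_paths = build_sub_paths("idle_corner",
                            ["first_station", "second_station", "idle_corner"])
```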
Step 240, obtaining indication information indicating the movement of the medical assistance device 110. In some embodiments, step 240 may be performed by indication information acquisition module 340.
In step 240, the processing device 140 may obtain indication information indicating a movement of the medical assistance device 110. Specifically, step 240 may be performed by the indication information acquisition module 340 in the processing device 140.
The indication information may be an instruction directing the medical assistance device 110 to perform a movement. In some embodiments, the indication information may be issued by medical staff, e.g., as instructions or programs they enter.
In some embodiments, the indication information comprises input information from medical staff; the input information includes at least one of voice information, text information, instruction selection information, image information, and limb operation instruction information. The voice information may be speech, command words, dialogue, and the like uttered by the medical staff; exemplary voice information may be "move to the first workstation", "return to the standby position", or "start the chest scan". The text information may be text entered into the terminal 130 (e.g., a control panel, a computer, etc.) by the medical staff; exemplary text information may be "move to the first workstation and then to the second workstation" or "perform the suture operation". The instruction selection information may be one or more instructions displayed on the terminal 130 for the medical staff to select, with different instructions corresponding to different clinical operations; for example, instruction one corresponds to moving to the first workstation, and instruction two corresponds to performing a chest scan. The image information may be a picture, image, or the like entered into the terminal 130 by the medical staff; for example, entering an image of a chest may instruct the medical assistance device 110 to perform an X-ray scan of the chest.
The limb operation instruction information may be information indicating a limb movement of the medical staff, with different limb movements corresponding to different instructions. For example, when a finger of the medical staff points to the tail section (or the third workstation) of the medical bed 160, this indicates that the medical assistance device 110 needs to move there; when the medical staff puts down the surgical tool, this indicates that the operation is finished and the device needs to move to the standby position (e.g., the target remote position). In some embodiments, the meaning of a given piece of indication information is related to the protocol information, and the same indication information may mean different things under different protocol information. For example, voice information may indicate that the medical assistance device 110 should assist a suturing operation, but one protocol may specify a leg suturing operation while another specifies a hand suturing operation, so when the medical staff's hand points to the tail section of the medical bed 160, the target position to which the device moves differs. As another example, among the limb operation instructions, putting down a surgical tool may indicate that the operation is complete and the device should return to the standby position, or it may indicate that the surgical tool needs to be replaced.
In some embodiments, the medical staff may instruct the medical assistance device 110 to move using several of the voice information, text information, instruction selection information, image information, and limb operation instruction information at the same time (e.g., voice information together with limb operation instruction information), so as to improve their clinical operation efficiency.
In some embodiments, the processing device 140 may determine whether the operation of the current type is finished according to the limb operation instruction information of the medical staff, and then determine the next target position according to the protocol information and the current scene information. After the processing device 140 acquires the medical task, the clinical operation information corresponding to each limb operation instruction (e.g., the type of surgical operation, the clinical operation workstation, etc.) may be determined according to the protocol information of the task and stored in the storage device 150 for retrieval at any time. Specifically, suppose a medical task requires a calf suture operation followed by a thigh suture operation. The processing device 140 obtains the corresponding protocol information and determines the position of the calf, with the medical subject lying on the medical bed 160, as the target position of the calf suture operation. It can then judge from the limb operation instructions of the medical staff whether the calf suture operation is finished and, once it is, determine the target position of the thigh suture operation. For example, the medical staff may point a hand at the thigh of the medical subject, put down the suturing tool, or make an "OK" gesture; this limb operation instruction information is captured by the camera device and transmitted to the processing device 140 for analysis, from which it determines that the calf suture operation is finished and the thigh suture operation is required.
The processing device 140 may then determine the position of the thigh, with the medical subject lying on the medical bed 160, as the target position of the thigh suture operation, so that the medical assistance device 110 can smoothly perform the next suturing operation. In some embodiments, the meaning of a limb operation instruction may be customized by the medical staff; for example, "putting down the suturing tool for 10 seconds" may serve as the indication that the operation of the current type has ended. In other embodiments, the meaning of a limb operation instruction may be determined by the processing device 140 after analyzing historical limb operation instruction information stored in the storage device 150.
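The step-advance logic above — a recognized end-of-operation gesture moves the task to the next protocol step — can be sketched as follows. The gesture names and protocol steps are illustrative assumptions, not values from the patent.

```python
# Sketch: advance through the protocol's clinical operations when a limb
# operation instruction signals that the current operation has ended.
# Gesture labels (output of some upstream recognizer) are assumed.

END_GESTURES = {"put_down_tool", "ok_gesture", "point_next_site"}

def next_step(gesture, protocol_steps, current_index):
    """Return the index of the next protocol step, or stay put."""
    if gesture in END_GESTURES and current_index + 1 < len(protocol_steps):
        return current_index + 1  # current operation ended: advance
    return current_index          # unrecognized gesture or last step: no change

steps = ["calf_suture", "thigh_suture"]
idx = next_step("ok_gesture", steps, 0)   # calf suture ends, move to thigh
```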
In some embodiments, the indication information may further include an instruction or information automatically issued by the processing device 140 according to the medical task. For example, after the processing device 140 acquires the protocol information of the medical task and determines the target position and the motion path, it may issue indication information to the medical assistance device 110 directing it to move to the target position corresponding to the clinical operation workstation, without the medical staff having to issue the indication.
Step 250, controlling the medical assistance device 110 to move along the corresponding motion path according to the indication information. In some embodiments, step 250 may be performed by the motion control module 350.
In step 250, the processing device 140 may control the medical assistance device 110 to move along the corresponding movement path according to the indication information. In particular, step 250 may be performed by a motion control module 350 in the processing device 140.
The medical assistance device 110 may include a driving mechanism, and the processing device 140 may transmit the indication information to it. In some embodiments, the driving mechanism may drive a movement mechanism of the medical assistance device 110 (e.g., wheels, tracks, robotic feet, etc.) based on the received indication, moving the device along the motion path to the target position. In some embodiments, the processing device 140 may determine the movement speed of the medical assistance device 110; for example, it may control the device to move along the motion path to the target position at a constant speed (e.g., 1 m/s, 2 m/s, etc.). In some embodiments, while the medical assistance device 110 moves toward the target position, the processing device 140 may determine its speed in response to the surrounding environment information of the device; for example, when there are obstacles around the medical assistance device 110, the closer an obstacle is, the slower the device moves.
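The rule "the closer the obstacle, the slower the movement" can be sketched as a simple speed schedule. The cruise speed, stop distance, and slow-down distance are illustrative assumptions.

```python
# Sketch: command a slower speed as the device nears an obstacle.
# cruise, stop_dist, and slow_dist are assumed tuning parameters.

def speed_for_distance(distance, cruise=1.0, stop_dist=0.5, slow_dist=2.0):
    """Return commanded speed (m/s) given distance (m) to the nearest obstacle."""
    if distance <= stop_dist:
        return 0.0    # too close: stop entirely
    if distance >= slow_dist:
        return cruise  # clear ahead: full cruise speed
    # Linear ramp between the stop distance and the slow-down distance.
    return cruise * (distance - stop_dist) / (slow_dist - stop_dist)
```

A proportional ramp like this avoids abrupt braking near people and equipment while letting the device cruise when the path is clear.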
The surrounding environment information may be the environment information within a space centered on the medical assistance device 110 with a first threshold (e.g., 2 m, 3 m, 5 m, etc.) as radius, and may include obstacle information and non-obstacle information (e.g., road surface information) in the direction of movement of the device toward the target position. In some embodiments, the processing device 140 may obtain the surrounding environment information of the medical assistance device 110 and determine the device's motion path based on it, the surrounding environment information including obstacle information in the direction of movement toward the target position. In some embodiments, the surrounding environment information is obtained by a recognition device provided on the medical assistance device 110; the recognition device comprises an infrared recognition device, a laser recognition device, or an image recognition device. For example, the infrared recognition device may include an infrared scanner, an infrared rangefinder, and the like; the laser recognition device may include a laser scanner and the like; and the image recognition device may include a camera device (e.g., a camera) and the like. In some embodiments, one or more cameras may be disposed on the medical assistance device 110 to obtain images of its surroundings, which the processing device 140 analyzes to identify the surrounding environment information, e.g., whether a subject is an obstacle.
In some embodiments, an obstacle may refer to a body within a second threshold (e.g., 0.5 m, 1 m, 1.5 m, etc.) of the medical assistance device 110; that is, during the device's movement toward the target position, any body closer than the second threshold may be called an obstacle. An obstacle may be a medical subject (e.g., a person), a non-medical subject (e.g., another medical assistance device 110), or the like. In some embodiments, whether a subject is an obstacle may also depend on whether it moves within a time threshold: if the subject is within the second threshold (e.g., 0.5 m, 1 m, 1.5 m, etc.) of the device and does not move within the time threshold (e.g., 5 s, 10 s, 15 s, etc.), it may be considered an obstacle. In some particular embodiments, the obstacle information may include whether an obstacle is present, the category of the obstacle (e.g., a person, a wall, the medical bed 160, a door, etc.), the height of the obstacle (e.g., 3 m, 1 m, 0.5 m, 0.2 m, 0.05 m), or any combination thereof. In some embodiments, when an obstacle is determined to lie in the direction of movement toward the target position, the processing device 140 may decide, based on the height and/or category of the obstacle, whether the motion path needs updating: if it does, the processing device 140 plans an obstacle avoidance path; if not, the device continues along the original motion path. In some embodiments, the processing device 140 may update the motion path of the medical assistance device 110 to the obstacle avoidance path.
For example, when the height of the obstacle exceeds a certain threshold (e.g., 0.05 meters), the processing device 140 determines that the medical assistance device 110 cannot continue moving over the obstacle and that the motion path must be updated, so it determines an obstacle avoidance path. As another example, when the height of the obstacle is below the threshold (e.g., 0.05 meters), the processing device 140 determines that the device can continue moving over the obstacle; it need not update the motion path and may instead increase the driving torque applied by the driving mechanism to the medical assistance device 110. As yet another example, when the obstacle is determined to be a person, the medical assistance device 110 may stop and wait, resuming movement toward the target position along the original path once the person ahead has moved away. In some embodiments, when the obstacle is determined to be a person, the medical assistance device 110 may also issue an alert to inform the medical staff that it has met an obstacle or is trapped in place; exemplary alerts may combine one or more of voice, flashing light, vibration, and the like.
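The three-way decision described above can be sketched as follows. The 0.05 m threshold comes from the example in the text; the category labels and returned action names are illustrative assumptions.

```python
# Sketch: decide how to react to an obstacle based on its category and
# height, mirroring the three examples in the text.

def obstacle_action(category, height, threshold=0.05):
    """Return the action for an obstacle of the given category and height (m)."""
    if category == "person":
        return "stop_and_wait"    # wait until the person moves away
    if height > threshold:
        return "replan_path"      # cannot cross: plan an obstacle avoidance path
    return "increase_torque"      # low obstacle: keep path, add driving torque
```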
In some embodiments, the processing device 140 may plan an obstacle avoidance path for the medical assistance device 110 based on the surrounding environment information (e.g., obstacle information). For example, when an insurmountable obstacle lies in the direction of movement while the device follows its original path toward the target position, the processing device 140 may control the device to change direction; if another insurmountable obstacle is encountered on the way, the device changes direction again, until the target position is finally reached. In some embodiments, the processing device 140 may acquire the distance between the medical assistance device 110 and the obstacle and plan a new obstacle avoidance path based on it. For example, when the obstacle is 1 meter away, the device may move 0.5 meter toward it and then proceed toward the target position by passing around the obstacle along the circumference of a circle of radius 0.5 meter centered on the obstacle. As another example, when the obstacle lies 1 meter directly ahead, the device may move 0.5 meter toward it, side-step 0.5 meter to the left, and, after bypassing the obstacle, move 0.5 meter back to the right before continuing toward the target position.
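The second (rectangular) detour in the example above can be sketched as a short waypoint list. The 0.5 m approach and side-step follow the text's example; the 1.0 m forward clearance past the obstacle is an illustrative assumption.

```python
# Sketch: rectangular detour around an obstacle directly ahead.
# 2-D waypoints with x = forward direction, y = lateral direction.
# approach/side follow the text's example; clearance is assumed.

def rectangular_detour(start, approach=0.5, side=0.5, clearance=1.0):
    """Return waypoints taking the device around an obstacle straight ahead."""
    x, y = start
    return [
        (x + approach, y),                     # move 0.5 m toward the obstacle
        (x + approach, y + side),              # side-step 0.5 m to the left
        (x + approach + clearance, y + side),  # drive past the obstacle
        (x + approach + clearance, y),         # step 0.5 m back to the right
    ]

wps = rectangular_detour((0.0, 0.0))
```

After the last waypoint the device is back on its original line of travel and can continue toward the target position.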
In some embodiments, when an obstacle (e.g., a person) appears directly ahead in the movement direction, the processing device 140 may control the medical auxiliary device 110 to emit a warning sound prompting the person to take notice of the medical auxiliary device 110, so that a collision with the medical auxiliary device 110 is avoided and the obstacle is cleared.
It should be noted that the above description of the flow 200 is provided only for illustration and explanation and does not limit the applicable scope of the present application. Various modifications and changes to the flow 200 will be apparent to those skilled in the art in light of this disclosure, and such modifications and variations remain within the scope of the present application. For example, step 240 may precede step 220 (steps 220a, 220b), i.e., the processing device 140 may first obtain the indication information indicating the movement of the medical auxiliary device 110 before determining the target position of the medical auxiliary device 110. For example, after a chest scan is completed, the healthcare worker issues a voice command to the medical auxiliary device 110: "scan end". The processing device 140 determines that the voice information indicates that the current clinical operation has ended and that, according to the protocol information, the medical auxiliary device 110 needs to perform the next clinical operation (e.g., a leg scan or assistance in a leg surgery) or stand by. Since the voice information does not include a target position corresponding to that next clinical operation or to standing by, the processing device 140 needs to acquire the current scene information and determine the target position from the scene information. Such variations are within the scope of the present application.
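The "scan end" example above amounts to a small dispatch rule: the voice command carries no position, so the next target is resolved from the protocol and the scene information. A minimal sketch, in which the function, the `"standby"` sentinel, and the position names are all hypothetical:

```python
# Illustrative sketch of resolving the next target position after a
# clinical operation ends. `scene_target_lookup` stands in for the
# scene-information step described in the text; all names are assumed.

def next_target(indication, protocol, scene_target_lookup):
    """Return the next target position once a 'scan end' indication arrives."""
    if indication != "scan end":
        raise ValueError("unrecognized indication")
    if not protocol:
        # No further clinical operation: the device stands by.
        return scene_target_lookup("standby")
    next_operation = protocol[0]  # e.g. "leg scan"
    # The voice command names no position, so consult scene information.
    return scene_target_lookup(next_operation)

positions = {"leg scan": "bedside-left", "standby": "corner-dock"}
target = next_target("scan end", ["leg scan"], positions.get)
```

With an empty protocol list the same call would resolve to the standby position instead.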
FIG. 3 is an exemplary block diagram of a motion control system of a medical assistance device according to some embodiments of the present application. As shown in FIG. 3, the motion control system 300 of the medical assistance apparatus 110 may include a scene information acquisition module 310, a target position determination module 320, a motion path planning module 330, an indication information acquisition module 340, and a motion control module 350.
The scene information acquisition module 310 may be disposed at a fixed position in the preset space and/or on the medical auxiliary device 110, and may be configured to acquire scene-related information in the preset space, where the scene-related information may include scene information and ambient environment information. In some embodiments, the scene information acquisition module 310 may also be configured to acquire protocol information of a medical task, which may reflect one or more clinical operation workstations of the medical task.
The target position determination module 320 may be configured to determine the target position of the medical auxiliary device 110 based on the scene information. In some embodiments, the target position determination module 320 may be further configured to preset one or more target sub-positions and/or target far-away positions according to the scene information. In some embodiments, the target position determination module 320 may be further configured to determine the target position corresponding to the current clinical operation workstation according to the current scene information and a preset position. In some embodiments, the target position determination module 320 may be further configured to determine the target position of the medical auxiliary device 110 according to the protocol information and the scene information.
The motion path planning module 330 may be configured to plan the motion path of the medical auxiliary device 110 according to the target position and the scene information. In some embodiments, the motion path planning module 330 may be further configured to preset the one or more motion sub-paths according to the scene information and the preset one or more target sub-positions and/or target far-away positions.
The indication information acquisition module 340 may be configured to acquire indication information indicating the movement of the medical auxiliary device 110. In some embodiments, the indication information acquisition module 340 may be further configured to determine whether the current clinical operation has ended according to the limb operation indication information of the medical staff, and to determine the next target position according to the protocol information and the current scene information.
The motion control module 350 may be configured to control the medical auxiliary device 110 to move along the corresponding motion path according to the indication information. In some embodiments, the motion control module 350 may also be configured to determine the motion path of the medical auxiliary device 110 based on the ambient environment information of the medical auxiliary device 110, where the ambient environment information includes obstacle information in the movement direction of the medical auxiliary device 110 toward the target position.
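The module decomposition of system 300 can be summarized structurally as follows. The class layout, method names, and the dictionary-based scene store are illustrative assumptions only; the patent describes module responsibilities, not an implementation.

```python
# Minimal structural sketch of the motion control system 300 of FIG. 3.
# Each method corresponds to one module from the description; the data
# representations are hypothetical.
from dataclasses import dataclass, field

@dataclass
class MotionControlSystem:
    scene_info: dict = field(default_factory=dict)

    def acquire_scene_information(self, sensed: dict) -> None:   # module 310
        self.scene_info.update(sensed)

    def determine_target_position(self) -> str:                  # module 320
        return self.scene_info.get("target", "standby")

    def plan_motion_path(self, target: str) -> list:             # module 330
        return [self.scene_info.get("current", "dock"), target]

    def acquire_indication(self, command: str) -> bool:          # module 340
        return command == "move"

    def control_motion(self, command: str) -> list:              # module 350
        target = self.determine_target_position()
        path = self.plan_motion_path(target)
        return path if self.acquire_indication(command) else []

system = MotionControlSystem()
system.acquire_scene_information({"current": "room-A", "target": "bedside"})
route = system.control_motion("move")
```

As the description notes, these five responsibilities may equally be merged into fewer modules of one device (e.g., the processing device 140); the sketch merely makes the data flow between them explicit.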
In other embodiments of the present application, a motion control apparatus for a medical auxiliary device is provided, comprising at least one processing device 140 and at least one storage device 150. The at least one storage device 150 is configured to store computer instructions, and the at least one processing device 140 is configured to execute at least some of the computer instructions to implement the motion control method of the medical auxiliary device 110 as described above.
In still further embodiments of the present application, a computer-readable storage medium for motion control of a medical auxiliary device 110 is provided, the storage medium storing computer instructions that, when executed by a processing device 140, implement the motion control method of the medical auxiliary device 110 as described above.
It should be noted that the above description of the motion control system of the medical auxiliary device 110 and its devices/modules is provided merely for convenience of description and does not limit the scope of the present application to the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of devices/modules, or connection of a constituent subsystem to other devices/modules, may be implemented without departing from such teachings. For example, in some embodiments, the scene information acquisition module 310, the target position determination module 320, and the motion path planning module 330 disclosed in FIG. 3 may be different modules in one device (e.g., the processing device 140), or a single module may implement the functions of two or more of the modules described above. For example, the target position determination module 320 and the motion path planning module 330 may be two separate modules, or a single module having both the signal receiving and information processing functions. For another example, each device may have its own storage module, or the devices may share a common storage module. Such modifications are intended to be included within the scope of the present application.
The beneficial effects that may be brought by the embodiments of the present application include, but are not limited to: (1) scene information of a preset space can be automatically acquired and a motion path reaching a target position can be planned, so that limitation caused by manual operation is avoided, and safety and efficiency of clinical operation are improved; (2) in the process that the medical auxiliary equipment moves to the target position, the obstacle avoidance path can be updated in real time according to the surrounding environment information, and the safety of clinical operation is improved. It is to be noted that different embodiments may produce different advantages, and in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, claimed subject matter may lie in less than all of the features of a single embodiment disclosed above.
Numerals describing quantities of components, attributes, and the like are used in some embodiments; it should be understood that such numerals used in the description of the embodiments are in some instances modified by the qualifier "about", "approximately", or "substantially". Unless otherwise indicated, "about", "approximately", or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
The entire contents of each patent, patent application, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, are hereby incorporated by reference into this application, except for any application history documents that are inconsistent with or in conflict with the contents of this application, and any documents that would limit the broadest scope of the claims of this application (whether currently or later appended to this application). It is noted that if there is any inconsistency or conflict between the descriptions, definitions, and/or use of terms in the materials incorporated into this application and those set forth in this application, the descriptions, definitions, and/or use of terms in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (15)

1. A method of motion control of a medical assistance apparatus, the method comprising:
acquiring scene information of the medical auxiliary equipment in a preset space;
determining a target position of the medical auxiliary device at least according to the scene information;
planning a motion path of the medical auxiliary equipment according to the target position and the scene information;
acquiring indication information indicating the movement of the medical auxiliary equipment;
and controlling the medical auxiliary equipment to move along the corresponding movement path according to the indication information.
2. The method according to claim 1, wherein the scene information is acquired by a camera device, and the camera device is arranged at a fixed position in the preset space or on the medical auxiliary equipment.
3. The method of claim 1, wherein the target location comprises one or more target sub-locations and/or target far-away locations; the motion path comprises one or more motion sub-paths between the target sub-locations and/or between the target sub-locations and the target far-away locations.
4. The method of claim 3, wherein the determining the target position of the medical auxiliary device from the scene information comprises: presetting one or more target sub-locations and/or target far-away locations according to the scene information.
5. The method of claim 4, wherein planning the motion path of the medical assistance device according to the target location and the context information comprises:
and presetting the one or more motion sub-paths according to the scene information and the preset one or more target sub-positions and/or target far-away positions.
6. The method of claim 3, wherein the determining the target position of the medical auxiliary device from the scene information comprises:
and determining a target position corresponding to the current clinical operation workstation according to the current scene information and a preset spatial threshold.
7. The method of claim 1, wherein the indication information comprises input information from a healthcare worker; the input information includes at least one of voice information, text information, instruction selection information, image information, and limb operation instruction information.
8. The method of claim 1, further comprising:
acquiring protocol information of a medical task; the protocol information can reflect one or more clinical operating workstations in the medical task;
the determining the target position of the medical auxiliary device at least according to the scene information comprises:
and determining the target position of the medical auxiliary equipment according to the protocol information and the scene information.
9. The method of claim 1 or 8, wherein the obtaining indication information indicating the movement of the medical auxiliary device comprises:
and determining whether the current clinical operation is finished according to the limb operation instruction information of the medical staff.
10. The method according to claim 1, wherein the controlling the medical auxiliary device to move along the corresponding movement path according to the indication information comprises:
determining a motion path of the medical auxiliary equipment according to the peripheral environment information of the medical auxiliary equipment; wherein the surrounding environment information includes obstacle information in a direction of movement of the medical assistance apparatus toward the target position.
11. The method according to claim 10, characterized in that the ambient environment information is acquired by an identification means provided on the medical assistance device; the recognition device comprises an infrared recognition device, a laser recognition device or an image recognition device.
12. The method of claim 1, wherein the medical accessory device comprises a C-arm device.
13. A motion control system for a medical auxiliary device, the system comprising:
the scene information acquisition module is used for acquiring scene information of the medical auxiliary equipment in a preset space;
a target position determination module for determining a target position of the medical auxiliary device at least according to the scene information;
the movement path planning module is used for planning a movement path of the medical auxiliary equipment according to the target position and the scene information;
the indication information acquisition module is used for acquiring indication information indicating the movement of the medical auxiliary equipment;
and the motion control module is used for controlling the medical auxiliary equipment to move along the corresponding motion path according to the indication information.
14. A motion control apparatus for a medical auxiliary device, the apparatus comprising at least one processor and at least one memory;
the at least one memory is for storing computer instructions;
the at least one processor is configured to execute at least some of the computer instructions to implement the operations of any of claims 1 to 12.
15. A computer-readable storage medium, characterized in that the storage medium stores computer instructions which, when executed by a processor, carry out the operations of any one of claims 1 to 12.
CN202010789591.0A 2020-05-09 2020-08-07 Motion control method and system for medical auxiliary equipment Active CN114077243B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010789591.0A CN114077243B (en) 2020-08-07 2020-08-07 Motion control method and system for medical auxiliary equipment
PCT/CN2021/092905 WO2021228057A1 (en) 2020-05-09 2021-05-10 Motion control method and system for medical device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010789591.0A CN114077243B (en) 2020-08-07 2020-08-07 Motion control method and system for medical auxiliary equipment

Publications (2)

Publication Number Publication Date
CN114077243A true CN114077243A (en) 2022-02-22
CN114077243B CN114077243B (en) 2023-12-05

Family

ID=80279652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010789591.0A Active CN114077243B (en) 2020-05-09 2020-08-07 Motion control method and system for medical auxiliary equipment

Country Status (1)

Country Link
CN (1) CN114077243B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023186066A1 (en) * 2022-03-31 2023-10-05 Shanghai United Imaging Healthcare Co., Ltd. Mobile medical devices

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4860209A (en) * 1983-11-24 1989-08-22 Kabushiki Kaisha Toyota Chuo Kenkyusho Running command system for unmanned vehicle
CN101518447A (en) * 2009-03-02 2009-09-02 北京积水潭医院 Precision improving method for spinal surgery computer navigation system
CN105117000A (en) * 2015-07-29 2015-12-02 青岛海信医疗设备股份有限公司 Method and device for processing medical three-dimensional image
US20170326730A1 (en) * 2014-12-12 2017-11-16 Korea Aerospace Research Institute Apparatus and method for controlling moving object, and computer-readable recording medium in which program for implementing method in computer is recorded
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment
CN109363872A (en) * 2018-12-17 2019-02-22 上海联影医疗科技有限公司 Medical imaging system, scanning bed control method, device and storage medium
CN109389620A (en) * 2017-08-08 2019-02-26 西门子保健有限责任公司 For tracking the method and tracking system of medical object
CN109946715A (en) * 2019-04-09 2019-06-28 云鲸智能科技(东莞)有限公司 Detection method, device, mobile robot and storage medium
US20190329074A1 (en) * 2018-04-26 2019-10-31 Rodney Wiersma Systems and Methods for Determining 6DoF Correction Trajectories in Compensating Movement of a Treatment Target in a Patient
CN111053564A (en) * 2019-12-26 2020-04-24 上海联影医疗科技有限公司 Medical equipment movement control method and medical equipment
CN111053565A (en) * 2019-12-28 2020-04-24 上海联影医疗科技有限公司 Positioning method, positioning device, C-arm system and medium
CN111298439A (en) * 2020-01-21 2020-06-19 腾讯科技(深圳)有限公司 Data processing method, device, medium and electronic equipment



Also Published As

Publication number Publication date
CN114077243B (en) 2023-12-05

Similar Documents

Publication Publication Date Title
US11272985B2 (en) Patient-specific preoperative planning simulation techniques
CN109567954B (en) Workflow assistance system and method for image guided program
EP3212109B1 (en) Determining a configuration of a medical robotic arm
RU2640566C2 (en) Personal and automatic correction of x-ray system based on optical detection and interpretation of three-dimensional scene
US11925423B2 (en) Guidance for positioning a patient and surgical robot
US9642584B2 (en) Method for controlling the movement of an x-ray apparatus and x-ray system
CN108175503A (en) System for arranging objects in an operating room in preparation for a surgical procedure
CN112022201A (en) Machine guided imaging techniques
CN110507337B (en) Medical equipment control system, medical equipment control method and device
CN111870268A (en) Method and system for determining target position information of beam limiting device
JP2015019777A (en) Body position determination support device and medical image diagnostic apparatus
CN112450956A (en) Automatic positioning method, device, readable storage medium, electronic equipment and system
US11141859B2 (en) Determining a configuration of a medical robotic arm
CN114077243B (en) Motion control method and system for medical auxiliary equipment
JP2022508276A (en) Positioning of medical X-ray imaging equipment
JP2015202222A (en) X-ray image diagnostic apparatus
WO2023078249A1 (en) Obstacle avoidance method, system and apparatus for surgical robot, and storage medium
CN112053346A (en) Method and system for determining operation guide information
CN209826968U (en) Surgical robot system
WO2021228057A1 (en) Motion control method and system for medical device
US10748034B2 (en) Method and system for learning to obtain medical scans of patients
CN112716509B (en) Motion control method and system for medical equipment
CN113081013B (en) Spacer scanning method, device and system
CN114224428A (en) Osteotomy plane positioning method, osteotomy plane positioning system and osteotomy plane positioning device
EP3939511A2 (en) System and method for planning pedicle screw fixation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant