CN113842209B - Ultrasonic device control method, ultrasonic device and computer readable storage medium - Google Patents

Ultrasonic device control method, ultrasonic device and computer readable storage medium

Info

Publication number
CN113842209B
Authority
CN
China
Prior art keywords
gesture
preset
locking
hand
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110975732.2A
Other languages
Chinese (zh)
Other versions
CN113842209A (en)
Inventor
丁旻昊
任冠清
熊飞
王筱毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Delica Medical Equipment Co ltd
Original Assignee
Shenzhen Delica Medical Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Delica Medical Equipment Co ltd filed Critical Shenzhen Delica Medical Equipment Co ltd
Priority to CN202110975732.2A
Publication of CN113842209A
Application granted
Publication of CN113842209B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70: Manipulators specially adapted for use in surgery
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

The invention discloses an ultrasonic equipment control method, which comprises the following steps: acquiring a real-time image, if an unlocking gesture exists in the real-time image, converting a gesture locking state of the ultrasonic equipment into a gesture control state corresponding to the unlocking gesture, and acquiring a preset control gesture in the real-time image; if the preset control gesture is a preset locking tracking gesture, locking and tracking a target locking person corresponding to the locking tracking gesture; if a preset following gesture sent by a target hand of the target locking person is acquired, converting the gesture control state into a preset hand following state, and judging whether the real-time image accords with a preset safety rule or not; and if the real-time image accords with the preset safety rule, controlling the mechanical arm to move along with the target hand. The invention also discloses ultrasonic equipment and a computer readable storage medium. The ultrasonic equipment control method has the advantages of low cost, high degree of freedom and high safety.

Description

Ultrasonic device control method, ultrasonic device and computer readable storage medium
Technical Field
The present invention relates to the field of medical apparatuses, and in particular, to an ultrasonic apparatus control method, an ultrasonic apparatus, and a computer-readable storage medium.
Background
The mechanical arm plays a role in many fields; it can assist people in completing complicated, high-load and high-precision work, and greatly facilitates production and labor. Through the mechanical arm, some work can be completed automatically, or related work can be completed by controlling the mechanical arm from a console without even going to the work site. In the field of medical appliances, traditional ultrasonic equipment can be combined with a mechanical arm to form mechanical arm ultrasonic equipment that is more convenient for a doctor to control. During the execution of an operation, the doctor can operate the mechanical arm ultrasonic equipment through traditional means such as manually operating the equipment body or issuing menu instructions from an upper computer, and can also control it remotely through a remote controller. However, at present the control modes of mechanical arm ultrasonic equipment are single, cooperative control by multiple operators is not supported, and the equipment depends on complex and expensive matched control devices, so the ways of controlling the mechanical arm ultrasonic equipment are greatly limited.
Disclosure of Invention
The invention provides an ultrasonic equipment control method, ultrasonic equipment and a computer readable storage medium, and aims to solve the technical problem that the mode of controlling the ultrasonic equipment is limited greatly.
In order to achieve the above object, the present invention provides an ultrasonic device control method applied to an ultrasonic device including a mechanical arm; the ultrasonic equipment control method comprises the following steps:
acquiring a real-time image, if an unlocking gesture exists in the real-time image, converting a gesture locking state of the ultrasonic equipment into a gesture control state corresponding to the unlocking gesture, and acquiring a preset control gesture in the real-time image;
if the preset control gesture is a preset locking tracking gesture, locking and tracking a target locking person corresponding to the locking tracking gesture;
if a preset following gesture sent by a target hand of the target locking person is acquired, converting the gesture control state into a preset hand following state, and judging whether the real-time image accords with a preset safety rule or not;
and if the real-time image accords with the preset safety rule, controlling the mechanical arm to move along with the target hand.
Optionally, after the step of determining whether the real-time image meets the preset security rule, the method further includes:
if the real-time image does not accord with the preset safety rule, converting the hand following state into a preset safety protection state, and controlling the mechanical arm to recover to the starting initial position;
and if a preset reset instruction is received, converting the safety protection state into the gesture control state.
Optionally, after the step of collecting the preset control gesture in the real-time image, the method further includes:
determining a mechanical arm to-be-operated track corresponding to the preset control gesture, and detecting whether the mechanical arm to-be-operated track exceeds a preset range;
and if the track to be operated of the mechanical arm exceeds a preset range, stopping executing a preset control instruction corresponding to the preset control gesture.
Optionally, the ultrasonic equipment further comprises an ultrasonic probe and a pressure sensing device, wherein the ultrasonic probe is clamped by the mechanical arm; the step of controlling the mechanical arm to move along with the target hand further comprises the following steps:
the mechanical arm is controlled to move along with the target hand, and the probe pressure of the ultrasonic probe is monitored in real time through the pressure sensing device;
Judging whether the pressure of the probe is greater than or equal to a preset pressure threshold value;
and if the probe pressure is greater than or equal to a preset pressure threshold, controlling the mechanical arm to adjust to a state when the probe pressure is lower than the preset pressure threshold.
Optionally, the step of if the unlocking gesture exists in the real-time image includes:
determining human body contour models of all people in the real-time image;
collecting all hand key points of each human body contour model in real time;
judging whether the position change of each hand key point is matched with the hand key point change information corresponding to the unlocking gesture according to the position change of each hand key point;
and if the hand key point change information corresponding to the unlocking gesture is matched, judging that the unlocking gesture exists in the real-time image.
Optionally, if the preset control gesture is a preset lock tracking gesture, the step of locking and tracking the target lock person corresponding to the lock tracking gesture includes:
judging whether the position change of the hand key point corresponding to the preset control gesture is matched with the hand key point change information corresponding to the preset lock tracking gesture;
if the hand key point change information corresponding to the preset lock tracking gesture is matched, judging that the preset control gesture is the preset lock tracking gesture, and executing a preset lock tracking instruction corresponding to the lock tracking gesture;
And locking and tracking a human body contour model of the target locking person corresponding to the locking and tracking gesture according to the executed locking and tracking instruction, wherein the human body contour model of the target locking person comprises all human body key points.
Optionally, if the preset following gesture sent by the target hand of the target locking person is collected, the step of converting the gesture control state into the preset hand following state includes:
if the hand key point position change in the target hand of the target locking person is acquired, and the hand key point position change is matched with the hand key point change information corresponding to the preset following gesture, determining that the target hand of the target locking person sends the preset following gesture, and converting the gesture control state into the preset hand following state.
Optionally, after the step of converting the gesture locking state of the ultrasonic device into the gesture control state corresponding to the unlocking gesture, the method further includes:
and if a preset gesture locking instruction is received, converting the gesture control state or the hand following state into the gesture locking state.
In addition, to achieve the above object, the present invention also provides an ultrasonic apparatus including a memory, a processor, and an ultrasonic apparatus control program stored on the memory and executable on the processor, wherein: the ultrasound device control program when executed by the processor implements the steps of the ultrasound device control method as described above.
In addition, in order to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon an ultrasonic device control program which, when executed by a processor, implements the steps of the ultrasonic device control method as described above.
According to the method for controlling the ultrasonic equipment, the real-time image is obtained, if the unlocking gesture exists in the real-time image, the gesture locking state of the ultrasonic equipment is converted into the gesture control state corresponding to the unlocking gesture, and the preset control gesture in the real-time image is collected, so that the ultrasonic equipment can be conveniently switched from other non-gesture control modes to gesture control modes through the unlocking gesture obtained by the real-time image, and a user can use various gestures to perform various operations on the ultrasonic equipment; if the preset control gesture is a preset locking tracking gesture, locking and tracking a target locking person corresponding to the locking tracking gesture can eliminate gesture interference of other persons in the real-time image, and the load for identifying the next gesture is reduced; and if the preset following gesture sent by the target hand of the target locking person is acquired, converting the gesture control state into a preset hand following state, judging whether the real-time image accords with a preset safety rule, and finally if the real-time image accords with the preset safety rule, controlling the mechanical arm to move along with the target hand, so that the mechanical arm can move correspondingly due to the movement of the target hand, and the safety of a patient is ensured. The ultrasonic equipment control method provided by the invention has low cost and small limitation, can realize multiple gesture control and multi-person cooperative operation, and ensures the safety of using gestures to control the ultrasonic equipment.
Drawings
Fig. 1 is a schematic diagram of a terminal structure of a hardware operating environment of an ultrasonic device according to an embodiment of the present invention;
FIG. 2 is a flow chart of a first embodiment of the ultrasound apparatus control method of the present invention;
FIG. 3 is a flow chart of a fourth embodiment of the ultrasound device control method of the present invention;
FIG. 4 is a schematic flow chart of a fifth embodiment of the ultrasonic device control method of the present invention;
FIG. 5 is a diagram of human body key point identification related to the control method of the ultrasonic device of the present invention;
FIG. 6 is a partial state switching flow chart of the ultrasound device control method of the present invention;
fig. 7 is a sequence diagram of a lock-up tracking process involved in the ultrasound device control method of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic diagram of a terminal structure of a hardware operating environment of an ultrasonic device according to an embodiment of the present invention.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, a communication bus 1002. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display (Display), an input unit such as a control panel, and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WLAN interface). The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above. An ultrasound device control program may be included in the memory 1005 as a computer storage medium.
Optionally, the terminal may also include a microphone, a speaker, RF (Radio Frequency) circuitry, sensors, audio circuitry, wireless modules, etc. Sensors such as pressure sensors and visual sensors are not described in detail herein.
It will be appreciated by those skilled in the art that the terminal structure shown in fig. 1 is not limiting of the terminal and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the control method of the ultrasonic device of the present invention, in this embodiment, the method includes:
step S10, acquiring a real-time image, if an unlocking gesture exists in the real-time image, converting a gesture locking state of the ultrasonic equipment into a gesture control state corresponding to the unlocking gesture, and acquiring a preset control gesture in the real-time image;
the ultrasonic equipment comprises a control end and an execution end, wherein the execution end comprises a computer system, the ultrasonic probe, an ultrasonic image display, a mechanical arm monitoring camera, a gesture capturing camera, a mechanical arm and other equipment. The robot arm monitoring camera is used for monitoring the robot arm, the gesture capturing camera is used for acquiring real-time images at the execution end, and the remote control end only needs one camera and one display, so that the number of cameras and displays can be increased according to actual conditions if special needs exist for medical staff, and in addition, the specifications of the cameras and the displays are not limited.
The gesture capturing camera of the remote control end and/or the gesture capturing camera of the execution end obtains gesture actions of medical staff by shooting real-time image information in a shooting range, and the display is used for displaying ultrasonic images and the gesture of the mechanical arm obtained by the mechanical arm monitoring camera. The control end and the execution end are connected by Ethernet, local area network or any other communication mode, doctor images acquired by the control end are transmitted to the computer system of the execution end, gestures of a doctor are identified by a human body contour acquisition and gesture identification algorithm, and ultrasonic equipment is controlled to execute corresponding operation, so that ultrasonic image acquisition is completed.
After the real-time image is acquired, the image information of all medical staff in the image is identified and acquired, the image information of all staff can be converted into a dynamic human body contour model in a computer system, and a plurality of human body key points are created on the human body contour model, and particularly, reference can be made to fig. 5, and fig. 5 is a human body key point identification diagram related to the ultrasonic equipment control method. The human body key points comprise trunk key points and hand key points and also can comprise face key points, and the trunk key points and the face key points can be used for identifying the characteristics of different medical staff. The change in posture of all persons causes the change in the spatial positions of the human keypoints and the relative positions of the individual keypoints in the human body contour. And acquiring and detecting dynamic changes of the human body contour model in real time according to the related changes of the trunk key points. According to the relevant changes of the hand key points, gesture changes sent by medical staff are collected in real time. The number of the trunk key points, the hand key points, the face key points and the key points created in the human body parts can be adjusted according to practical situations, and fig. 5 is only used for understanding and reference, and is not a limitation of the invention. Because the body type and the facial features of each person are different, the image information of different medical staff can be clearly distinguished by utilizing the human body contour model and the human body key points. After the human body outline model and the human body key points are formed, any medical staff in the real-time image sends out the gesture image acquired by the camera, and the gesture image is automatically converted into the change of the hand key points of the human body outline model of the medical staff in the ultrasonic equipment system, so that gesture acquisition is completed, and the gesture is recognized and corresponding instructions are executed according to the gesture control system of the ultrasonic equipment.
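As a purely illustrative aid, and not part of the disclosed embodiment, the following Python sketch shows one possible way such a human body contour model with trunk, hand and face key points might be represented; the class names Keypoint and BodyContourModel and their fields are assumptions made for illustration only.

    from dataclasses import dataclass, field
    from typing import Dict, Tuple

    @dataclass
    class Keypoint:
        """One human body key point tracked in the real-time image."""
        name: str                      # e.g. "left_wrist", "right_elbow"
        group: str                     # "torso", "hand" or "face"
        position: Tuple[float, float]  # pixel coordinates in the current frame

    @dataclass
    class BodyContourModel:
        """Dynamic contour model of one person, updated every frame."""
        person_id: int
        keypoints: Dict[str, Keypoint] = field(default_factory=dict)

        def update(self, detections: Dict[str, Tuple[float, float]]) -> None:
            # Overwrite each key point with its newly detected position.
            for name, pos in detections.items():
                if name in self.keypoints:
                    self.keypoints[name].position = pos

        def hand_keypoints(self) -> Dict[str, Keypoint]:
            # Hand key points are the ones used for gesture recognition.
            return {n: k for n, k in self.keypoints.items() if k.group == "hand"}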
When the acquired change in the position information of the hand key points matches the hand key point position change information corresponding to the unlocking gesture stored in the gesture control system of the ultrasonic equipment, the gesture is recognized as the unlocking gesture, that is, the unlocking gesture exists in the real-time image. The gesture unlocking instruction is then executed, so that the state is converted from the initial gesture locking state into the gesture control state. After that, each preset control gesture sent by each medical staff member in the real-time image can be collected, so that the ultrasonic equipment executes the instructions corresponding to different control gestures to complete different operations. The unlocking gesture can be freely set by the medical staff; whatever gesture is selected, the corresponding hand key point change information is stored in the gesture control system after the setting is completed, and the built-in unlocking gesture of the gesture control system can also be learned and used for control directly. The gesture locking state is a state in which any gesture other than the unlocking gesture is ignored, that is, a state in which any gesture other than the unlocking gesture is disabled. The system can be switched from the gesture locking state to the gesture control state by the unlocking gesture, and other instructions, such as a voice unlocking instruction or an unlocking instruction in the upper computer menu, can also switch the system from the gesture locking state to the gesture control state. In addition, the gesture control state can be switched back to the gesture locking state by the locking gesture or by other commands with higher priority, such as commands in the upper computer menu.
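The state switching described above can be pictured as a small state machine. The following Python sketch is an illustrative assumption only; the state and event names are invented for the example and are not mandated by the embodiment.

    from enum import Enum, auto

    class ControlState(Enum):
        GESTURE_LOCKED = auto()     # only the unlocking gesture is honoured
        GESTURE_CONTROL = auto()    # preset control gestures are collected
        HAND_FOLLOWING = auto()     # mechanical arm follows the target hand
        SAFETY_PROTECTION = auto()  # waiting for an on-site manual reset

    def next_state(state: ControlState, event: str) -> ControlState:
        """Minimal transition table for the states described above."""
        transitions = {
            (ControlState.GESTURE_LOCKED, "unlock_gesture"): ControlState.GESTURE_CONTROL,
            (ControlState.GESTURE_CONTROL, "follow_gesture"): ControlState.HAND_FOLLOWING,
            (ControlState.GESTURE_CONTROL, "lock_command"): ControlState.GESTURE_LOCKED,
            (ControlState.HAND_FOLLOWING, "lock_command"): ControlState.GESTURE_LOCKED,
            (ControlState.HAND_FOLLOWING, "cancel_follow_gesture"): ControlState.GESTURE_CONTROL,
            (ControlState.HAND_FOLLOWING, "safety_rule_violated"): ControlState.SAFETY_PROTECTION,
            (ControlState.SAFETY_PROTECTION, "manual_reset"): ControlState.GESTURE_CONTROL,
        }
        return transitions.get((state, event), state)  # unknown events are ignored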
Step S20, if the preset control gesture is a preset locking tracking gesture, locking and tracking a target locking person corresponding to the locking tracking gesture;
Referring to fig. 7, images of all persons are acquired from the real-time image and converted into dynamic human body contour models, and human body key points are created on each contour model according to the preset number of key points and the preset human body recognition positions. After position changes of the hand key points of any medical staff member are acquired, they are fuzzily compared with the hand key point change information corresponding to the locking tracking gesture pre-stored in the gesture recognition system. If the comparison finds that the two match within a reasonable error range, the preset control gesture sent by that medical staff member is the locking tracking gesture; the locking tracking instruction is then executed, the person who sent the gesture is identified as the target locking person, the human body contour of the target locking person is locked, and instructions caused by any gesture of the target locking person are associated with that contour, so that all gesture instructions of the target locking person are executed by acquiring and recognizing his or her hand key points. For persons who are not locked and tracked, no gesture other than the locking tracking gesture is recognized, which reduces the load of gesture recognition. When the human body contour of the target locking person can no longer be identified, for example because the target locking person has moved out of the shooting range, the locking tracking of that person ends; if the person sends the locking tracking gesture again, the locking tracking is restarted. The human body key points comprise trunk key points, hand key points and face key points; the movements of the trunk key points and face key points can be used to identify different medical staff and to convert a change in a person's posture into a change of the human body contour model in time.
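One possible way to implement the "fuzzy comparison within a reasonable error range" of hand key point changes is sketched below in Python; the tolerance value and the trajectory representation are illustrative assumptions, not requirements of the embodiment.

    import numpy as np

    def matches_template(observed: np.ndarray, template: np.ndarray,
                         tolerance: float = 0.15) -> bool:
        """Fuzzy comparison of a hand key point trajectory with a stored template.

        observed / template: arrays of shape (frames, keypoints, 2) holding hand
        key point positions relative to the wrist, so the comparison does not
        depend on where the person stands in the image.
        """
        if observed.shape != template.shape:
            return False
        # Mean per-keypoint deviation over the whole motion, relative to the
        # overall scale of the template ("reasonable error range").
        error = np.linalg.norm(observed - template, axis=-1).mean()
        scale = np.linalg.norm(template, axis=-1).mean() + 1e-6
        return (error / scale) < tolerance

    def find_target_lock_person(trajectories, lock_template):
        """Return the person whose hand motion matches the lock-tracking gesture."""
        for person_id, trajectory in trajectories.items():
            if matches_template(trajectory, lock_template):
                return person_id   # this person becomes the target locking person
        return None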
Step S30, if a preset following gesture sent by a target hand of the target locking person is acquired, converting the gesture control state into a preset hand following state, and judging whether the real-time image accords with a preset safety rule;
After a position change of the hand key points of either hand of the target medical staff member is acquired, it is fuzzily compared with the hand key point change information corresponding to the following gesture pre-stored in the gesture recognition system. If the comparison finds that the two match within a reasonable error range, the preset control gesture sent by that hand is recognized as the following gesture; the following instruction is executed, the hand is locked as the target hand, and the gesture control state is converted into the hand following state, after which the mechanical arm can move along with the target hand. At the same time, in the hand following state, the gesture actions of the target medical staff member in the real-time image are judged in real time to check whether they comply with the safety rules pre-stored in the gesture control system. For example, the safety rules can be:
1. the mechanical arm does not move beyond the limited area;
2. the target hand does not move too fast, i.e. does not exceed the safe movement speed of the mechanical arm;
3. the target hand is not lost, i.e. the followed hand does not move out of the camera picture;
4. the target locking person is not lost, i.e. the trunk of the medical staff member to whom the followed hand belongs does not move out of the camera picture and remains identifiable.
The acquired real-time image is compared with all safety rules in the system one by one, so as to judge whether the real-time image complies with the preset safety rules. In addition, in the hand following state, after a cancel-following gesture is received from the other hand or from another locked and tracked medical staff member, the following is stopped, the mechanical arm keeps its current posture, and the gesture control system is converted into the gesture control state.
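A minimal sketch of how the four example safety rules might be checked is given below; the function signature, the representation of the limited area and the threshold names are assumptions made for illustration.

    def check_safety_rules(arm_position, hand_speed, target_hand_visible,
                           target_person_visible, allowed_area, max_hand_speed):
        """Evaluate the four example safety rules; returns (ok, violated_rules).

        allowed_area is assumed to be (xmin, xmax, ymin, ymax, zmin, zmax) and
        max_hand_speed a configured safe movement speed for the mechanical arm.
        """
        violations = []
        x, y, z = arm_position
        xmin, xmax, ymin, ymax, zmin, zmax = allowed_area
        if not (xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax):
            violations.append("arm outside limited area")            # rule 1
        if hand_speed > max_hand_speed:
            violations.append("target hand faster than safe speed")  # rule 2
        if not target_hand_visible:
            violations.append("target hand lost from the image")     # rule 3
        if not target_person_visible:
            violations.append("target locking person lost")          # rule 4
        return len(violations) == 0, violations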
In addition, besides the locking tracking gesture and the following gesture, the preset control gestures can also comprise an unlock-tracking gesture. After the unlock-tracking gesture is acquired and recognized, the system no longer responds to any gesture from the target medical staff member who sent it, except for a new locking tracking gesture.
The preset control gestures can also comprise a "step along the scan path" gesture. After this gesture is acquired and recognized, for example, the point A on the scan path closest to the current probe position is found, the probe is moved a preset distance along the scan path to obtain point B, and the mechanical arm is controlled to move the probe to point B. During the movement, the posture of the mechanical arm is adjusted in real time according to the signal of the pressure sensing device, so that the pressure of the probe on the examined position is kept within a certain range.
The preset control gestures can also comprise a "step against the scan path" gesture. After this gesture is acquired and recognized, for example, the point A on the scan path closest to the current probe position is found, the probe is moved a preset distance against the scan path to obtain point B, and the mechanical arm is controlled to move the probe to point B. During the movement, the posture of the mechanical arm is adjusted in real time according to the signal of the pressure sensing device, so that the pressure of the probe on the examined position is kept within a certain range.
The preset control gestures may also include an "automatically move forward along the scan path" gesture. After this gesture is acquired and recognized, for example, the point A on the scan path closest to the current probe position is found, the mechanical arm is controlled to move the probe to point A and then to move along the scan path with point A as the starting point, stopping automatically at the end point of the scan path. During the movement, the posture of the mechanical arm is adjusted in real time according to the signal of the pressure sensing device, so that the pressure of the probe on the examined position is kept within a certain range.
The preset control gestures may also include an "automatically move backward along the scan path" gesture. After this gesture is acquired and recognized, for example, the point A on the scan path closest to the current probe position is found, the mechanical arm is controlled to move the probe to point A and then to move against the scan path with point A as the starting point, stopping automatically at the starting point of the scan path. During the movement, the posture of the mechanical arm is adjusted in real time according to the signal of the pressure sensing device, so that the pressure of the probe on the examined position is kept within a certain range.
The preset control gesture may also include a "stop move" gesture by which movement is suspended after the acquisition identifies the preset "stop move" gesture, for example, during automatic movement along the scan path. In addition, the ultrasonic device may stop moving due to other instructions or other reasons, for example, if an external force is sensed to interfere with the movement of the mechanical arm, the ultrasonic device is considered to be a medical staff manually operating the device body, and at this time, the gesture control instruction is stopped to execute, so as to respond to the manual operation of the device body. Similarly, if the instruction of the upper computer menu is received during the execution of the instruction, the execution of the gesture control instruction is stopped at this time, and the instruction of the upper computer menu is responded. In this embodiment, the gesture control command has a lower priority in the computer system.
The preset control gestures can also comprise a locking gesture. After the locking gesture is acquired and recognized, the system enters the gesture locking state, responds only to the unlocking gesture and ignores all other gestures, preventing gestures made during the operation from mistakenly operating the mechanical arm.
The preset control gestures can also comprise an unlocking gesture. After the unlocking gesture is acquired and recognized, the system exits the locking state, enters the gesture control state, and starts to read the gesture instructions of the locked and tracked person.
The preset control gestures can also comprise an exit-following gesture. After the exit-following gesture is acquired and recognized, the system exits the hand following state and the mechanical arm keeps its current posture.
The preset control gesture may also include an "ultrasound image freeze" gesture, which is followed by a freeze of the current ultrasound image after the "ultrasound image freeze" gesture is acquired and identified.
The preset control gesture may also include an "ultrasound image defrost" gesture, which is followed by the acquisition and recognition of the "ultrasound image defrost" gesture to defrost the current ultrasound image.
The preset control gesture can also comprise an ultrasonic image parameter adjustment gesture, and after the ultrasonic image parameter adjustment gesture is acquired and identified, the corresponding ultrasonic image parameter is adjusted.
In addition to the above-mentioned preset gestures, different gesture commands can be added according to actual requirements, so that preset control commands corresponding to different control gestures are executed after different control gestures are acquired.
Step S40, if the real-time image accords with the preset safety rule, the mechanical arm is controlled to move along with the target hand;
If the real-time image is judged to comply with all safety rules in the system, the mechanical arm starts to move correspondingly with the movement of the target hand, taking the current position of the ultrasonic probe as the starting point. The target hand may be either the left or the right hand. For example, if the target hand of the target medical staff member is lifted by a certain amount, the mechanical arm is lifted by a corresponding amount; if the target hand moves to the right by a certain amount, the mechanical arm also moves by a corresponding amount. The movement amplitude of the target hand and the corresponding movement amplitude of the mechanical arm can be equal, or can follow a preset ratio, for example 2:1, meaning that if the target hand moves 10 cm the mechanical arm correspondingly moves 5 cm. This ratio determines the sensitivity of the following movement, which can be adjusted and set according to the habits of the medical staff so as to improve the accuracy of the following movement.
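A minimal sketch of the hand-to-arm movement mapping with an adjustable follow ratio, as described above, might look like the following; the 2:1 ratio shown is only the example given in the text, and the function name is an assumption.

    import numpy as np

    def arm_displacement(hand_displacement: np.ndarray,
                         follow_ratio: float = 2.0) -> np.ndarray:
        """Map a target-hand displacement to a mechanical-arm displacement.

        follow_ratio is the hand-to-arm movement ratio described above:
        2.0 means the hand moving 10 cm makes the arm move 5 cm.
        """
        return hand_displacement / follow_ratio

    # Example: hand moves 10 cm to the right and 4 cm up.
    hand_move_cm = np.array([10.0, 0.0, 4.0])
    print(arm_displacement(hand_move_cm))   # -> [5. 0. 2.]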
In addition, if the real-time image is judged to violate any of the safety rules, for example the mechanical arm has moved beyond the limited area, the system automatically enters a preset safety protection state, in which gestures sent by any person are no longer collected and the system waits for on-site operation by medical staff.
In addition, referring to fig. 6, fig. 6 is a partial state switching flowchart related to the ultrasonic device control method of the present invention. The gesture control state is entered from the gesture locking state through the unlocking gesture, and the gesture locking state is a state in which instructions caused by any other gesture are ignored, that is, a state equivalent to disabling any gesture other than the unlocking gesture. The user can switch from the gesture locking state to the gesture control state by using the unlocking gesture, and the gesture control state can be switched to the gesture locking state by using the locking gesture or other commands with higher priority, such as an upper computer menu command. After the locking tracking gesture of a medical staff member is received and collected, the person who sent the gesture is locked; when that person then sends the following gesture, the computer gesture control system enters the hand following state. In the hand following state, it is judged whether the real-time image acquired by the camera complies with the safety rules pre-stored in the gesture control system; if not, the system enters the safety protection state. In the safety protection state, in order to ensure the safety of the patient, a manual reset by medical staff is required to switch the computer gesture control system back to the gesture control state.
According to the method for controlling the ultrasonic equipment, the real-time image is obtained, if the unlocking gesture exists in the real-time image, the gesture locking state of the ultrasonic equipment is converted into the gesture control state corresponding to the unlocking gesture, and the preset control gesture in the real-time image is collected, so that the ultrasonic equipment can be conveniently switched from other non-gesture control modes to gesture control modes through the unlocking gesture obtained by the real-time image, and a user can use various gestures to perform various operations on the ultrasonic equipment; if the preset control gesture is a preset locking tracking gesture, locking and tracking a target locking person corresponding to the locking tracking gesture can eliminate gesture interference of other persons in the real-time image, and the load for identifying the next gesture is reduced; and if the preset following gesture sent by the target hand of the target locking person is acquired, converting the gesture control state into a preset hand following state, judging whether the real-time image accords with a preset safety rule, and finally if the real-time image accords with the preset safety rule, controlling the mechanical arm to move along with the target hand, so that the mechanical arm can move correspondingly due to the movement of the target hand, and the safety of a patient is ensured. The ultrasonic equipment control method provided by the invention has low cost and small limitation, can realize multiple gesture control and multi-person cooperative operation, and ensures the safety of using gestures to control the ultrasonic equipment.
Further, a second embodiment of the ultrasonic device control method of the present invention is proposed based on the first embodiment of the ultrasonic device control method of the present invention, and in this embodiment, after step S30, the method includes:
step a, if the real-time image does not accord with a preset safety rule, converting the hand following state into a preset safety protection state, and controlling the mechanical arm to recover to a starting initial position;
If the real-time image does not comply with any of the safety rules in the system, for example the mechanical arm has moved beyond the limited area, the system automatically enters the preset safety protection state. In the safety protection state no response is made to gestures sent by any person; at the same time, the mechanical arm is controlled to return to the position it occupied when the ultrasonic equipment system was started, this position having been recorded in the ultrasonic equipment system after start-up as the start-up initial position.
And b, if a preset reset instruction is received, converting the safety protection state into the gesture control state.
The preset reset instruction requires a medical staff member at the execution end to manually reset the ultrasonic equipment on site. For example, when the medical staff member manually moves the mechanical arm and adjusts the position of the ultrasonic probe to the reset state, the system automatically generates and executes the reset instruction, so that the ultrasonic equipment system re-enters the gesture control state and starts collecting gestures sent by all medical staff.
Further, a third embodiment of the ultrasonic device control method of the present invention is proposed based on the first embodiment of the ultrasonic device control method of the present invention, and in this embodiment, after step S10, the method includes:
step c, determining a mechanical arm to-be-operated track corresponding to the preset control gesture, and detecting whether the mechanical arm to-be-operated track exceeds a preset range;
After any control gesture sent by medical staff is recognized and before the control instruction corresponding to that control gesture is executed, the running track and running speed that the mechanical arm would have after executing the control instruction are predicted, and it is detected whether the running track exceeds the preset range and whether the running speed exceeds the preset speed.
And d, if the track to be operated of the mechanical arm exceeds a preset range, stopping executing a preset control instruction corresponding to the preset control gesture.
If the running track exceeds the preset range or the running speed exceeds the preset speed, or any situation that the patient may be injured by executing the control instruction is judged, for example, if the probe pressure exceeds the safety pressure threshold value, the execution of the control instruction corresponding to the control gesture is stopped.
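The pre-execution check of the predicted trajectory and speed could, for illustration, be sketched as follows; the workspace representation and parameter names are assumptions and not part of the disclosed embodiment.

    import numpy as np

    def trajectory_is_safe(predicted_trajectory: np.ndarray, predicted_speed: float,
                           workspace: tuple, max_speed: float) -> bool:
        """Check a predicted arm trajectory before executing a gesture instruction.

        predicted_trajectory: (N, 3) array of arm positions the instruction would
        produce; workspace: (xmin, xmax, ymin, ymax, zmin, zmax) preset range.
        """
        xmin, xmax, ymin, ymax, zmin, zmax = workspace
        inside = (
            (predicted_trajectory[:, 0] >= xmin) & (predicted_trajectory[:, 0] <= xmax) &
            (predicted_trajectory[:, 1] >= ymin) & (predicted_trajectory[:, 1] <= ymax) &
            (predicted_trajectory[:, 2] >= zmin) & (predicted_trajectory[:, 2] <= zmax)
        )
        if not bool(inside.all()):
            return False   # track to be operated exceeds the preset range
        return predicted_speed <= max_speed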
As shown in fig. 3, further, a fourth embodiment of the ultrasonic device control method of the present invention is proposed based on the first embodiment of the ultrasonic device control method of the present invention. In this embodiment, the ultrasonic equipment further includes an ultrasonic probe and a pressure sensing device, the ultrasonic probe being held by the mechanical arm; step S40 further includes:
step S41, controlling a mechanical arm to move along with the target hand, and monitoring the probe pressure of the ultrasonic probe in real time through a pressure sensing device;
In order to ensure the safety of the patient while the mechanical arm holds the ultrasonic probe and follows the target hand to perform the scanning movement, and to prevent the pressure of the ultrasonic probe from exceeding a safety threshold and injuring the patient because of misrecognized gestures or misoperation of other instructions, a pressure sensing device is arranged on the mechanical arm and is used to detect the probe pressure of the ultrasonic probe in real time while the mechanical arm holds the ultrasonic probe during operation.
Step S42, judging whether the probe pressure is greater than or equal to a preset pressure threshold;
Because medical ultrasonic probes come in many specifications and different specifications have different suitable pressure ranges, the preset pressure threshold can be set according to the ultrasonic probe actually used, so that the patient does not feel an obvious pressing sensation during ultrasonic scanning and no other unsafe factor is created for the patient. The actual probe pressure value is compared in real time with the ultrasonic probe pressure threshold pre-stored in the ultrasonic equipment system, and it is judged whether the actual probe pressure is greater than or equal to that pre-stored threshold.
If the probe pressure is greater than or equal to the preset pressure threshold, step S43 is executed to control the mechanical arm to adjust to a state when the probe pressure is lower than the preset pressure threshold.
If the probe pressure is greater than or equal to the preset pressure threshold, the mechanical arm is immediately controlled to lift slowly, and the lifting stops once the detected pressure of the ultrasonic probe falls below the preset pressure threshold. The lifting process can be carried out in parallel with the previously executed gesture instruction, so that ultrasonic scanning efficiency is ensured.
If the pressure of the probe is smaller than the preset pressure threshold, the operation of the mechanical arm lifting adjustment is not executed, and related operations are normally executed according to the gesture instruction executed previously.
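A sketch of the real-time probe pressure monitoring loop described in this embodiment is shown below; read_pressure and lift_arm_slowly are hypothetical callbacks standing in for the pressure sensing device and the arm controller, and the monitoring period is an assumed value.

    import time
    import threading

    def monitor_probe_pressure(read_pressure, lift_arm_slowly, threshold_newton: float,
                               stop_event: threading.Event, period_s: float = 0.05) -> None:
        """Lift the arm in small steps while the probe pressure is at or above
        the preset threshold; stop lifting once the pressure drops below it."""
        while not stop_event.is_set():
            if read_pressure() >= threshold_newton:
                lift_arm_slowly()        # small upward adjustment step
            time.sleep(period_s)         # real-time monitoring period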
As shown in fig. 4, and referring to fig. 5, further, a fifth embodiment of the method for controlling an ultrasonic device according to the present invention is provided based on the first embodiment of the method for controlling an ultrasonic device according to the present invention, in this embodiment, the steps when an unlocking gesture exists in the real-time image include:
step S100, determining human body contour models of all people in the real-time images;
After the real-time image is obtained through the camera, the images of all medical staff in the real-time image are determined, automatic focusing is applied to all medical staff, and the background in the real-time image is weakened, so that the clarity of the person images is improved and interference from non-human images is eliminated. The images of all medical staff are determined by the gesture control system of the ultrasonic equipment and then converted into human body contour models of all medical staff. A number of human body key points are created in each human body contour model at predetermined body parts, for example at the shoulder joints, hip joints, elbow joints and knee joints of the torso and at the hand joints. Besides the trunk and the hands, a certain number of key points can also be created on the faces of the human body contours of all medical staff for facial recognition, so as to improve the degree of distinction between the features of each medical staff member.
Step S110, collecting all hand key points of each human body contour model in real time;
all medical staff make various gestures in real time through the changes and the movements of the hands, and real-time images containing the various gestures are obtained by a gesture control system and are converted into dynamic changes of hand key points of various human body contour models.
Step S120, judging whether the position change of each hand key point is matched with the hand key point change information corresponding to the unlocking gesture according to the position change of each hand key point;
because any gesture is generated by one hand action, the position change of each hand key point on each human body outline is reflected, and whether the position change of the hand key point is matched with hand key point change information corresponding to an unlocking gesture pre-stored in a gesture control system is judged.
Step S130, if the hand key point change information corresponding to the unlocking gesture is matched, judging that the unlocking gesture exists in the real-time image.
If the position change of the hand key points matches the hand key point change information corresponding to the unlocking gesture pre-stored in the gesture control system, it can be determined that a person has made the unlocking gesture in the real-time image. For example, suppose the hand key point change information corresponding to the unlocking gesture pre-stored in the gesture control system is that the relative positions of the hand key points corresponding to a fist shape change into the relative positions corresponding to an open palm shape; if a medical staff member makes a fist and then opens the hand, the action can be recognized by the gesture control system as the unlocking gesture.
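For the fist-to-open-palm example above, a simplified recognition criterion, assumed here only for illustration and not the disclosed matching algorithm, is that the mean fingertip-to-wrist distance grows markedly between the start and end of the motion; the following sketch illustrates that idea.

    import numpy as np

    def hand_openness(fingertips: np.ndarray, wrist: np.ndarray) -> float:
        """Mean fingertip-to-wrist distance; small for a fist, large for an open palm."""
        return float(np.linalg.norm(fingertips - wrist, axis=1).mean())

    def is_unlock_gesture(frames, open_factor: float = 1.8) -> bool:
        """Detect a fist that opens into a palm across a short sequence of frames.

        frames: list of (fingertips, wrist) tuples of hand key point positions.
        The gesture is recognised when the hand openness grows by open_factor.
        """
        openness = [hand_openness(f, w) for f, w in frames]
        return len(openness) >= 2 and openness[-1] > open_factor * openness[0]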
Further, a sixth embodiment of the ultrasonic device control method of the present invention is proposed based on the first embodiment of the ultrasonic device control method of the present invention, in which step S20 includes:
step e, judging whether the position change of the hand key point corresponding to the preset control gesture is matched with the hand key point change information corresponding to the preset lock tracking gesture;
f, if the hand key point change information corresponding to the preset lock tracking gesture is matched, judging that the preset control gesture is the preset lock tracking gesture, and executing a preset lock tracking instruction corresponding to the lock tracking gesture;
and g, locking and tracking a human body contour model of the target locking person corresponding to the locking and tracking gesture according to the executed locking and tracking instruction, wherein the human body contour model of the target locking person comprises all human body key points.
Judging whether the change of the hand key point corresponding to the gesture made by any medical staff is matched with the hand key point change information corresponding to the locking tracking gesture pre-stored in the gesture control system, if the change of the hand key point of the gesture is matched with the hand key point change information corresponding to the locking tracking gesture pre-stored in the gesture control system, identifying the hand key point as the locking tracking gesture by the gesture control system, further executing a locking tracking instruction, locking and tracking the target medical staff making the gesture, and specifically, locking and tracking a human body contour model of the target medical staff reflected in the gesture identification system according to the trunk key point of the target medical staff or the trunk key point and the face key point of the target medical staff.
In an embodiment, the step of converting the gesture control state into the preset hand following state if the preset following gesture sent by the target hand of the target locking person is acquired includes:
and h, if the hand key point position change in the target hand of the target locking person is acquired and the hand key point position change is matched with the hand key point change information corresponding to the preset following gesture, determining that the target hand of the target locking person sends the preset following gesture, and converting the gesture control state into the preset hand following state.
If the position change of the hand key points corresponding to a gesture made by either hand of any target medical staff member matches the hand key point change information corresponding to the following gesture pre-stored in the gesture control system, the gesture is recognized as the following gesture. While entering the hand following state, the hand that sent the following gesture is locked; in the hand following state only that hand is followed, that is, however the hand moves, the mechanical arm moves correspondingly. For example, when the right hand makes the preset following gesture and the change of the right-hand key points is recognized as matching the hand key point change information corresponding to the following gesture preset in the system, the right hand is taken as the target hand. When the right hand moves under conditions that satisfy the safety rules, the mechanical arm moves correspondingly: for example, if the right hand moves forward, the mechanical arm moves forward, but the moving distances need not be equal and can follow a preset ratio, for example 3:1, so that if the right hand moves upward by 9 cm the mechanical arm moves upward by 3 cm.
In another embodiment, after the step of converting the gesture locking state of the ultrasonic device into the gesture control state corresponding to the unlocking gesture, the method further includes:
and i, if a preset gesture locking instruction is received, converting the gesture control state or the hand following state into the gesture locking state.
When the gesture control system is in the gesture control state or the hand following state, no matter what gesture-induced instruction is being executed, as soon as a gesture locking instruction is received the execution is stopped immediately and the current state is converted into the gesture locking state. The gesture locking instruction can come from the locking gesture, or from other instructions with higher priority, such as an instruction sent from the upper computer menu, a locking voice instruction, a locking foot-switch instruction or another state-switching instruction with higher priority; after any such higher-priority instruction is received, it can serve as the gesture locking instruction and switch the gesture control system into the gesture locking state. In the gesture locking state, only the instruction corresponding to the unlocking gesture or other non-gesture instructions are responded to in order to switch back to the gesture control state, and all other gestures are ignored, so that gestures made during the operation are not responded to by the gesture control system and do not mistakenly operate the mechanical arm.
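The priority ordering described above, with gesture instructions having the lowest priority, could be illustrated with a simple lookup, sketched below; the priority values and source names are assumptions made for illustration only.

    # Assumed priority values: larger numbers preempt smaller ones.
    INSTRUCTION_PRIORITY = {
        "host_menu_command": 3,   # upper computer menu instruction
        "manual_operation": 3,    # external force on the equipment body
        "voice_lock_command": 2,
        "foot_switch_lock": 2,
        "gesture_command": 1,     # gesture instructions have the lowest priority
    }

    def should_preempt(current_source: str, incoming_source: str) -> bool:
        """A running gesture instruction is stopped when a higher-priority
        instruction (e.g. a gesture locking instruction) arrives."""
        return (INSTRUCTION_PRIORITY.get(incoming_source, 0)
                > INSTRUCTION_PRIORITY.get(current_source, 0))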
In addition, the invention also provides an ultrasonic device, which comprises a memory, a processor and an ultrasonic device control program stored on the memory and capable of running on the processor, wherein the processor realizes the steps of the ultrasonic device control method according to the embodiment when executing the ultrasonic device control program.
The specific implementation manner of the ultrasonic equipment is basically the same as that of each embodiment of the control method of the ultrasonic equipment, and is not repeated here.
Furthermore, the present invention also proposes a computer-readable storage medium, characterized in that the computer-readable storage medium comprises an ultrasound device control program which, when executed by a processor, implements the steps of the ultrasound device control method as described in the above embodiments.
The specific implementation manner of the computer readable storage medium of the present invention is basically the same as that of each embodiment of the control method of the ultrasonic device, and will not be repeated here.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a terminal device (which may be a television, a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
In the present invention, the terms "first", "second", "third", "fourth", "fifth" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance, and the specific meaning of the above terms in the present invention will be understood by those of ordinary skill in the art depending on the specific circumstances.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, the scope of the present invention is not limited thereto, and it should be understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications and substitutions of the above embodiments may be made by those skilled in the art within the scope of the present invention, and are intended to be included in the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (8)

1. An ultrasonic device control method, characterized in that the method is applied to an ultrasonic device, the ultrasonic device comprising a mechanical arm; the ultrasonic device control method comprises the following steps:
acquiring a real-time image, and if an unlocking gesture exists in the real-time image, converting a gesture locking state of the ultrasonic device into a gesture control state corresponding to the unlocking gesture, and acquiring a preset control gesture in the real-time image;
if the preset control gesture is a preset locking tracking gesture, locking and tracking a target locking person corresponding to the locking tracking gesture;
if a preset following gesture made by a target hand of the target locking person is acquired, converting the gesture control state into a preset hand following state, and judging whether the real-time image meets a preset safety rule, wherein the preset safety rule comprises a safe movement speed of the mechanical arm;
if the real-time image meets the preset safety rule, controlling the mechanical arm to move along with the target hand;
after the step of acquiring the preset control gesture in the real-time image, the method further comprises:
determining a to-be-operated trajectory of the mechanical arm corresponding to the preset control gesture, and detecting whether the to-be-operated trajectory of the mechanical arm exceeds a preset range;
if the to-be-operated trajectory of the mechanical arm exceeds the preset range, stopping execution of a preset control instruction corresponding to the preset control gesture;
the step of, if the preset control gesture is the preset locking tracking gesture, locking and tracking the target locking person corresponding to the locking tracking gesture comprises:
judging whether a position change of hand key points corresponding to the preset control gesture matches hand key point change information corresponding to the preset locking tracking gesture;
if the position change matches the hand key point change information corresponding to the preset locking tracking gesture, determining that the preset control gesture is the preset locking tracking gesture, and executing a preset locking tracking instruction corresponding to the locking tracking gesture;
according to the executed locking tracking instruction, locking and tracking a human body contour model of the target locking person corresponding to the locking tracking gesture, wherein the human body contour model of the target locking person comprises all human body key points;
after the step of locking and tracking the target locking person corresponding to the locking tracking gesture, the method further comprises:
if the human body contour of the target locking person cannot be identified, ending the locking tracking of the target locking person;
after the step of controlling the mechanical arm to move along with the target hand, the method further comprises:
if a following cancellation gesture made by a hand of the target locking person other than the target hand, or by another locking person, is received, stopping the following.
2. The ultrasonic device control method according to claim 1, wherein after the step of judging whether the real-time image meets the preset safety rule, the method further comprises:
if the real-time image does not meet the preset safety rule, converting the hand following state into a preset safety protection state, and controlling the mechanical arm to return to its initial starting position;
and if a preset reset instruction is received, converting the safety protection state into the gesture control state.
3. The ultrasonic device control method according to claim 1, wherein the ultrasonic device further comprises an ultrasonic probe and a pressure sensing device, the ultrasonic probe being held by the mechanical arm; the step of controlling the mechanical arm to move along with the target hand further comprises:
controlling the mechanical arm to move along with the target hand, and monitoring a probe pressure of the ultrasonic probe in real time through the pressure sensing device;
judging whether the probe pressure is greater than or equal to a preset pressure threshold;
and if the probe pressure is greater than or equal to the preset pressure threshold, controlling the mechanical arm to adjust to a state in which the probe pressure is lower than the preset pressure threshold.
4. The ultrasonic device control method according to claim 1, wherein the step of determining whether the unlocking gesture exists in the real-time image comprises:
determining human body contour models of all persons in the real-time image;
collecting all hand key points of each human body contour model in real time;
judging, according to the position change of each hand key point, whether the position change matches hand key point change information corresponding to the unlocking gesture;
and if the position change matches the hand key point change information corresponding to the unlocking gesture, determining that the unlocking gesture exists in the real-time image.
5. The ultrasonic device control method according to claim 1, wherein the step of, if the preset following gesture made by the target hand of the target locking person is acquired, converting the gesture control state into the preset hand following state comprises:
if a hand key point position change of the target hand of the target locking person is acquired, and the hand key point position change matches hand key point change information corresponding to the preset following gesture, determining that the target hand of the target locking person has made the preset following gesture, and converting the gesture control state into the preset hand following state.
6. The ultrasonic device control method according to claim 1, wherein after the step of converting the gesture locking state of the ultrasonic device into the gesture control state corresponding to the unlocking gesture, the method further comprises:
and if a preset gesture locking instruction is received, converting the gesture control state or the hand following state into the gesture locking state.
7. An ultrasonic device, comprising a memory, a processor, and an ultrasonic device control program stored in the memory and executable on the processor, wherein the ultrasonic device control program, when executed by the processor, implements the steps of the ultrasonic device control method according to any one of claims 1 to 6.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores an ultrasonic device control program which, when executed by a processor, implements the steps of the ultrasonic device control method according to any one of claims 1 to 6.
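
As an illustrative aid only (not part of the claims or of the disclosed implementation), the gesture-driven control flow recited in claims 1, 2 and 6 can be pictured as a small state machine. In the minimal Python sketch below, the state names, gesture labels, the arm interface (follow/home) and the safe-speed value are assumptions introduced purely for illustration.

# Illustrative state-machine sketch of the control flow in claims 1, 2 and 6.
# All identifiers (states, gesture labels, arm driver methods) are hypothetical.
from enum import Enum, auto

class State(Enum):
    GESTURE_LOCKED = auto()      # gestures are ignored until an unlocking gesture appears
    GESTURE_CONTROL = auto()     # preset control gestures are accepted
    HAND_FOLLOWING = auto()      # the mechanical arm moves along with the target hand
    SAFETY_PROTECTION = auto()   # arm returned to its starting position, awaiting reset

class GestureController:
    def __init__(self, arm, max_safe_speed_mm_s=50.0):
        self.arm = arm                             # assumed driver exposing follow() and home()
        self.max_safe_speed = max_safe_speed_mm_s  # "safe movement speed" rule (assumed value)
        self.state = State.GESTURE_LOCKED
        self.target_person = None

    def on_frame(self, gesture, person, arm_speed_mm_s):
        """Handle one real-time image; `gesture` is the label recognized in that image."""
        if self.state is State.GESTURE_LOCKED:
            if gesture == "unlock":
                self.state = State.GESTURE_CONTROL
        elif self.state is State.GESTURE_CONTROL:
            if gesture == "lock_tracking":
                self.target_person = person        # lock and track this person's contour model
            elif gesture == "follow" and person is self.target_person:
                self.state = State.HAND_FOLLOWING
            elif gesture == "lock":                # preset gesture locking instruction (claim 6)
                self.state = State.GESTURE_LOCKED
        elif self.state is State.HAND_FOLLOWING:
            if arm_speed_mm_s > self.max_safe_speed:
                self.arm.home()                    # claim 2: return to the starting position
                self.state = State.SAFETY_PROTECTION
            elif gesture == "cancel_follow":       # made by the other hand or another person
                self.state = State.GESTURE_CONTROL
            elif gesture == "lock":
                self.state = State.GESTURE_LOCKED
            else:
                self.arm.follow(person)            # move along with the target hand
        elif self.state is State.SAFETY_PROTECTION:
            if gesture == "reset":                 # preset reset instruction (claim 2)
                self.state = State.GESTURE_CONTROL
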
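The gesture recognition steps in claims 1, 4 and 5 all reduce to comparing an observed change in hand key point positions with stored "change information" for a preset gesture. The sketch below shows one plausible way to perform that comparison; the displacement representation, the normalization and the tolerance are assumptions rather than values taken from the patent.

# Hypothetical hand-key-point matching used to recognize preset gestures (claims 1, 4, 5).
import numpy as np

def keypoint_change(prev_pts, curr_pts):
    # Per-key-point displacement between two frames; inputs are (N, 2) pixel coordinates.
    return np.asarray(curr_pts, dtype=float) - np.asarray(prev_pts, dtype=float)

def matches_gesture(observed_change, template_change, tol=0.25):
    # True when the observed key point change matches the stored change information
    # for a preset gesture, up to a relative tolerance.
    observed = np.asarray(observed_change, dtype=float)
    template = np.asarray(template_change, dtype=float)
    scale = np.linalg.norm(template) + 1e-6   # normalize out hand size and camera distance
    return np.linalg.norm(observed - template) / scale < tol

# Example with a hypothetical detector API: decide whether the unlocking gesture exists.
# prev = detector.hand_keypoints(previous_frame)
# curr = detector.hand_keypoints(current_frame)
# unlocked = matches_gesture(keypoint_change(prev, curr), templates["unlock"])
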
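Claim 3 adds a probe-pressure safeguard during hand following: the pressure sensing device is read in real time and, once the probe pressure reaches the preset threshold, the arm is adjusted until the pressure falls below it again. The loop below is a sketch under assumed interfaces; the 5 N threshold, the retraction step and the sensor/arm method names are illustrative only.

# Hypothetical pressure-guarded following loop corresponding to claim 3.
import time

PRESSURE_THRESHOLD_N = 5.0   # assumed preset pressure threshold
RETRACT_STEP_MM = 1.0        # assumed retraction step along the probe axis

def follow_with_pressure_guard(arm, pressure_sensor, target_hand, stop_event):
    while not stop_event.is_set():
        arm.follow(target_hand)                    # move along with the target hand
        pressure = pressure_sensor.read_newtons()  # real-time probe pressure
        while pressure >= PRESSURE_THRESHOLD_N:
            arm.retract(RETRACT_STEP_MM)           # back off until below the threshold
            pressure = pressure_sensor.read_newtons()
        time.sleep(0.01)                           # ~100 Hz monitoring loop
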
CN202110975732.2A 2021-08-24 2021-08-24 Ultrasonic device control method, ultrasonic device and computer readable storage medium Active CN113842209B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110975732.2A CN113842209B (en) 2021-08-24 2021-08-24 Ultrasonic device control method, ultrasonic device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110975732.2A CN113842209B (en) 2021-08-24 2021-08-24 Ultrasonic device control method, ultrasonic device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113842209A (en) 2021-12-28
CN113842209B (en) 2024-02-09

Family

ID=78976128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110975732.2A Active CN113842209B (en) 2021-08-24 2021-08-24 Ultrasonic device control method, ultrasonic device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113842209B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115421590B (en) * 2022-08-15 2023-05-12 珠海视熙科技有限公司 Gesture control method, storage medium and image pickup device

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104809387A (en) * 2015-03-12 2015-07-29 山东大学 Video image gesture recognition based non-contact unlocking method and device
CN105700683A (en) * 2016-01-12 2016-06-22 厦门施米德智能科技有限公司 Intelligent window and control method thereof
US9563955B1 (en) * 2013-05-15 2017-02-07 Amazon Technologies, Inc. Object tracking techniques
CN107848116A (en) * 2015-08-25 2018-03-27 川崎重工业株式会社 Tele-manipulator system
CN108135658A (en) * 2015-10-09 2018-06-08 索尼公司 operation control device, operation control method and program
CN108268181A (en) * 2017-01-04 2018-07-10 奥克斯空调股份有限公司 A kind of control method and device of non-contact gesture identification
CN108288010A (en) * 2017-01-07 2018-07-17 湖南移商动力网络技术有限公司 Visual gesture identifying system based on Android
CN109325408A (en) * 2018-08-14 2019-02-12 莆田学院 A kind of gesture judging method and storage medium
WO2020092170A1 (en) * 2018-11-02 2020-05-07 Verb Surgical Inc. Surgical robotic system
CN111556350A (en) * 2020-04-21 2020-08-18 海信集团有限公司 Intelligent terminal and man-machine interaction method
CN111904597A (en) * 2020-08-07 2020-11-10 山东威瑞外科医用制品有限公司 Lightweight surgical robot
CN112668506A (en) * 2020-12-31 2021-04-16 咪咕动漫有限公司 Gesture tracking method and device and computer readable storage medium
CN112914601A (en) * 2021-01-19 2021-06-08 深圳市德力凯医疗设备股份有限公司 Obstacle avoidance method and device for mechanical arm, storage medium and ultrasonic equipment
CN112998863A (en) * 2021-03-12 2021-06-22 杭州柳叶刀机器人有限公司 Robot safety boundary interaction method and device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8996173B2 (en) * 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
WO2011085815A1 (en) * 2010-01-14 2011-07-21 Brainlab Ag Controlling a surgical navigation system
JP6787623B2 (en) * 2015-02-24 2020-11-18 エスアールアイ インターナショナルSRI International Very dexterous system user interface
EP3658005A4 (en) * 2017-07-27 2021-06-23 Intuitive Surgical Operations, Inc. Light displays in a medical device
US11504193B2 (en) * 2019-05-21 2022-11-22 Verb Surgical Inc. Proximity sensors for surgical robotic arm manipulation

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9563955B1 (en) * 2013-05-15 2017-02-07 Amazon Technologies, Inc. Object tracking techniques
CN104809387A (en) * 2015-03-12 2015-07-29 山东大学 Video image gesture recognition based non-contact unlocking method and device
CN107848116A (en) * 2015-08-25 2018-03-27 川崎重工业株式会社 Tele-manipulator system
CN108135658A (en) * 2015-10-09 2018-06-08 索尼公司 operation control device, operation control method and program
CN105700683A (en) * 2016-01-12 2016-06-22 厦门施米德智能科技有限公司 Intelligent window and control method thereof
CN108268181A (en) * 2017-01-04 2018-07-10 奥克斯空调股份有限公司 A kind of control method and device of non-contact gesture identification
CN108288010A (en) * 2017-01-07 2018-07-17 湖南移商动力网络技术有限公司 Visual gesture identifying system based on Android
CN109325408A (en) * 2018-08-14 2019-02-12 莆田学院 A kind of gesture judging method and storage medium
WO2020092170A1 (en) * 2018-11-02 2020-05-07 Verb Surgical Inc. Surgical robotic system
CN111556350A (en) * 2020-04-21 2020-08-18 海信集团有限公司 Intelligent terminal and man-machine interaction method
CN111904597A (en) * 2020-08-07 2020-11-10 山东威瑞外科医用制品有限公司 Lightweight surgical robot
CN112668506A (en) * 2020-12-31 2021-04-16 咪咕动漫有限公司 Gesture tracking method and device and computer readable storage medium
CN112914601A (en) * 2021-01-19 2021-06-08 深圳市德力凯医疗设备股份有限公司 Obstacle avoidance method and device for mechanical arm, storage medium and ultrasonic equipment
CN112998863A (en) * 2021-03-12 2021-06-22 杭州柳叶刀机器人有限公司 Robot safety boundary interaction method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FPGA Innovation and Entrepreneurship Competition. Gesture-recognition-based multifunctional robotic arm. Bilibili. 2021. *

Also Published As

Publication number Publication date
CN113842209A (en) 2021-12-28

Similar Documents

Publication Publication Date Title
US10653472B2 (en) Touch free operation of ablator workstation by use of depth sensors
Matsumoto et al. Development of intelligent wheelchair system with face and gaze based interface
JP6221224B2 (en) Robot system, program, production system and robot
US9625993B2 (en) Touch free operation of devices by use of depth sensors
CN109571513B (en) Immersive mobile grabbing service robot system
JP4386367B2 (en) Communication robot improvement system
US20060168523A1 (en) Interface system
CN113842209B (en) Ultrasonic device control method, ultrasonic device and computer readable storage medium
CN109605363A (en) Robot voice control system and method
CN105844746B (en) A kind of access control device, system and method that identity is identified by gait information
JP2019041261A (en) Image processing system and setting method of image processing system
JP2017073670A (en) Image processing apparatus, image processing method and image processing system
CN111709277A (en) Human body tumbling detection method and device, computer equipment and storage medium
JP2007143886A (en) Electrical wheelchair system
CN110781714B (en) Image processing apparatus, image processing method, and computer readable medium
CN110363811A (en) Control method and device for grabbing equipment, storage medium and electronic equipment
WO2023124732A1 (en) Device control method and system for image-guided interventional punctures
JP2019202354A (en) Robot control device, robot control method, and robot control program
JP4878462B2 (en) Communication robot
CN113827270B (en) Instruction conflict resolution method, ultrasonic device and computer readable storage medium
JP6358637B1 (en) Dispatch system
CN109571423A (en) The user equipment of robot control system, robot control method and robot control system
EP4144489A1 (en) Control device, control method, and computer program
CN116189308B (en) Unmanned aerial vehicle flight hand detection method, unmanned aerial vehicle flight hand detection system and storage medium
Zhang et al. Computerized Environment for People with Serious Disability

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant