CN112472133A — Posture monitoring method and device for ultrasonic probe
- Publication number
- CN112472133A CN112472133A CN202011528675.5A CN202011528675A CN112472133A CN 112472133 A CN112472133 A CN 112472133A CN 202011528675 A CN202011528675 A CN 202011528675A CN 112472133 A CN112472133 A CN 112472133A
- Authority
- CN
- China
- Prior art keywords
- ultrasonic probe
- scanned
- posture
- image
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0858—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving measuring tissue layers, e.g. skin, interfaces
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2063—Acoustic tracking systems, e.g. using ultrasound
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Robotics (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
The invention discloses a method and a device for monitoring the posture of an ultrasonic probe. The monitoring method comprises: acquiring position information of the ultrasonic probe relative to a part to be scanned; determining the distance between the ultrasonic probe and the part to be scanned from the position information; and determining the attaching state between the ultrasonic probe and the part to be scanned from the distance, thereby obtaining the posture of the ultrasonic probe and adjusting its motion state. A structured light camera, infrared distance measuring sensors and pressure sensors are arranged on the ultrasonic probe, so that the sensors cooperate to obtain the position information, the distance and the attaching state between the probe and the part to be scanned; logic judgment is then performed on the acquired data and the probe is controlled to adjust its motion state accordingly. As a result, the probe is neither attached too loosely to the part to be scanned nor pressed with excessive force, the ultrasonic scanning imaging quality is improved, and adverse reactions in patients are avoided.
Description
Technical Field
The invention relates to the technical field of ultrasonic probes, in particular to a method and a device for monitoring the posture of an ultrasonic probe.
Background
In an ultrasound surgical navigation system, while the doctor performs the operation manually, a mechanical arm controls the probe to track the surgical instrument or holds the probe at the operative site for real-time scanning and imaging. Ultrasonic scanning requires the probe to apply appropriate pressure to the skin, so the attaching state between probe and skin must be monitored in real time and the probe adjusted to avoid loose attachment or excessive pressure.
In the prior art, strain gauges are attached to the surface of the probe and the attaching state between probe and skin is monitored through them. However, the gauges interfere with the operation of the probe where they are attached: too many gauges impair imaging, while too few cannot monitor the attachment effectively. The strain gauge approach therefore performs poorly.
Thus, there is a need for improvements and enhancements in the art.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide a method and a device for monitoring the posture of an ultrasonic probe, aiming at the problems of the prior art, in which a strain gauge is used to detect the attaching state with the scanned part: the resulting adjustment of the ultrasonic probe is poor, so the probe either attaches to the scanned part too loosely or applies excessive pressure, degrading ultrasonic imaging quality and causing patient discomfort.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
in a first aspect, the present invention provides a method for monitoring the posture of an ultrasound probe, wherein the method comprises:
acquiring position information of an ultrasonic probe relative to a part to be scanned;
determining the distance between the ultrasonic probe and the part to be scanned according to the position information;
and determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance to obtain the posture of the ultrasonic probe, and adjusting the motion state of the ultrasonic probe.
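The three steps above can be sketched as a single monitoring iteration. This is a minimal illustration only, not the patented algorithm; the function names and the zero-distance contact test are hypothetical placeholders:

```python
# Sketch of the three-step monitoring loop from the first aspect.
# All sensor-reading callables are hypothetical placeholders.

def monitor_step(get_position, get_distance, pressure_ok):
    """One iteration: position -> distance -> attaching state -> adjustment."""
    position = get_position()                 # step 1: probe position vs. scan site
    distance = get_distance(position)         # step 2: IR-ranged distance
    fitted = distance <= 0.0 and pressure_ok()  # step 3: attached and pressure in range
    # Return an adjustment command: approach if not attached, hold otherwise.
    return "hold" if fitted else "approach"
```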
In one implementation, the acquiring position information of the ultrasound probe relative to the part to be scanned includes:
acquiring an image of a region to be scanned through a preset structured light camera, wherein the structured light camera is fixed to the ultrasonic probe;
and determining the position information of the ultrasonic probe relative to the part to be scanned according to the image of the region to be scanned.
In one implementation, the determining, according to the image of the region to be scanned, the position information of the ultrasound probe relative to the part to be scanned includes:
inputting an image of a region to be scanned into a preset target detection network, and determining the position of a part to be scanned in the region to be scanned;
and determining the position information of the ultrasonic probe relative to the part to be scanned according to the position of the part to be scanned.
In one implementation, the target detection network is constructed in a manner including:
the method comprises the steps of obtaining an image of a region to be scanned by emitting and receiving infrared rays through a structured light camera in advance, and registering the image of the region to be scanned with an RGB (red, green and blue) image of the camera to obtain color information of each pixel on a depth map; expanding the original three-channel image into a six-channel image of RGB-XYZ;
determining a pixel area of a part to be scanned in the area to be scanned according to the color information, and printing a label to construct a training set, wherein the training set comprises image frames which are cut from images of the area to be scanned in different scanning stages according to the same proportion in advance;
and using the MobileNet as a basic network, and using the training set to train the MobileNet to obtain the target detection network.
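The RGB-XYZ expansion described above can be sketched as a pinhole back-projection of the registered depth map, stacked with the color channels. This assumes depth already registered to the RGB frame and hypothetical camera intrinsics `fx`, `fy`, `cx`, `cy`; none of these names come from the patent:

```python
import numpy as np

def rgbxyz(rgb, depth, fx, fy, cx, cy):
    """Back-project a registered depth map into camera-frame XYZ and
    stack it with RGB into a six-channel RGB-XYZ image.
    rgb: (H, W, 3) uint8; depth: (H, W) metres; fx/fy/cx/cy: intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx          # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * depth / fy
    xyz = np.stack([x, y, depth], axis=-1)
    return np.concatenate([rgb.astype(np.float32), xyz], axis=-1)  # (H, W, 6)
```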
In one implementation, the determining the distance between the ultrasound probe and the part to be scanned according to the position information includes:
according to the position information, starting up four infrared distance measuring sensors preset on the ultrasonic probe, wherein the four infrared distance measuring sensors are symmetrically arranged;
and measuring the distance between the ultrasonic probe and the part to be scanned through the infrared distance measuring sensor.
In one implementation, determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance to obtain the posture of the ultrasonic probe, and adjusting the motion state of the ultrasonic probe, includes:
determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance;
acquiring pressure data of the ultrasonic probe according to the attaching state, wherein the pressure data is obtained based on a pressure sensor arranged on the ultrasonic probe;
and obtaining the posture of the ultrasonic probe according to the pressure data, and adjusting the posture of the ultrasonic probe.
In a second aspect, the present invention provides a posture monitoring apparatus for an ultrasonic probe, wherein the apparatus includes: an ultrasonic probe; a structured light camera arranged on the ultrasonic probe; and an infrared distance measuring sensor arranged on the ultrasonic probe. The structured light camera is fixed to the ultrasonic probe and acquires an image of the region to be scanned; the infrared distance measuring sensor measures the distance between the ultrasonic probe and the part to be scanned.
In one implementation mode, the number of the infrared distance measuring sensors is four, and the infrared distance measuring sensors are symmetrically arranged on the ultrasonic probe.
In one implementation, a pressure sensor is further disposed on the ultrasonic probe, and the pressure sensor is used for acquiring pressure data between the ultrasonic probe and the part to be scanned.
In one implementation, the ultrasonic probe is further connected to a mechanical arm, and the mechanical arm is connected to the pressure sensor and the infrared distance measuring sensor, so that the mechanical arm is controlled to adjust the posture of the ultrasonic probe through feedback from the pressure sensor and the infrared distance measuring sensor.
Advantageous effects: the invention provides a method and a device for monitoring the posture of an ultrasonic probe. A structured light camera, infrared distance measuring sensors and pressure sensors are arranged on the ultrasonic probe, so that the sensors cooperate to obtain the position information, the distance and the attaching state between the probe and the part to be scanned; logic judgment is performed on the acquired data and the probe is controlled to adjust its motion state accordingly. The probe is therefore neither attached too loosely to the part to be scanned nor pressed with excessive force, the ultrasonic scanning imaging quality is improved, and adverse reactions in patients are avoided.
Drawings
Fig. 1 is a side view of an attitude monitoring apparatus for an ultrasound probe according to an embodiment of the present invention.
Fig. 2 is a front view of an attitude monitoring apparatus for an ultrasound probe according to an embodiment of the present invention.
Fig. 3 is an overall flowchart of a method for monitoring an attitude of an ultrasonic probe according to an embodiment of the present invention.
Fig. 4 is a flowchart of determining position information of an ultrasound probe relative to a to-be-scanned portion in a method for monitoring an attitude of an ultrasound probe according to an embodiment of the present invention.
Fig. 5 is a flowchart for determining a distance between an ultrasound probe and a portion to be scanned in a method for monitoring an attitude of an ultrasound probe according to an embodiment of the present invention.
Fig. 6 is a flowchart illustrating an adjustment of an attitude of an ultrasound probe in a method for monitoring an attitude of an ultrasound probe according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and effects of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
To solve the problems in the prior art, the present embodiment provides a method and an apparatus for monitoring the posture of an ultrasonic probe. In application, the ultrasonic probe scans the part to be scanned while the structured light camera on the probe feeds images of that part into a preset target detection network for processing. The target detection network is obtained by training: before formal ultrasonic scanning, sampled images of the part to be scanned are processed and used to train the network according to the algorithm. Once an image has been processed by the detection network, the position information of the part to be scanned is obtained. Meanwhile, the infrared distance measuring sensors on the probe measure the distance to the part to be scanned, and the pressure sensor on the probe acquires the pressure between probe and part. All sensors cooperate to detect data in real time and transmit it to the system, which performs real-time logic judgment and controls the mechanical arm. The mechanical arm drives the ultrasonic probe along the planned trajectory and adjusts its posture, so that the probe stays attached to the part to be scanned in the optimal posture throughout the scan, improving the ultrasonic imaging quality without causing adverse reactions in patients.
In a specific implementation, as shown in fig. 1 and fig. 2, 10 is the structured light camera, 20 is an infrared distance measuring sensor, 30 is a pressure sensor, and 40 is the ultrasonic emitting part of the ultrasonic probe 50; the emitting part 40 is flush with the structured light camera 10 and the infrared distance measuring sensors 20. The ultrasonic probe of the invention carries several sensors that jointly detect, in real time, the attaching state between the probe and the part to be scanned, combining visual guidance with pressure guidance: the probe can be adjusted through real-time visual and pressure guidance according to the detected data so that the attaching state is optimal.
Before formal ultrasonic scanning of a part of the human body, the part must be imaged and sampled several times to construct a target detection network, which is used to obtain the relative position of the ultrasonic probe and the part to be scanned. That is, the structured light camera fixed on the probe captures images of the part; the scanned images are input into the network, image registration and labeling are performed according to the algorithm, and the network is trained with MobileNet as the base network. After this preparation, when ultrasonic scanning formally begins, the probe is moved onto the part to be scanned while the structured light camera keeps scanning images in real time and feeding them to the target detection network. The network outputs the position information of the probe relative to the part to be scanned; the infrared distance measuring sensors on the probe are switched on according to this position information and measure the distance between probe and part. The system performs logic judgment on the distance data and the pressure data from the probe's pressure sensor according to the algorithm, and, based on the result, controls the probe in real time with a PID algorithm, adjusting its posture so that the probe scans in the optimal posture throughout the ultrasonic scan.
For example, the ultrasonic scan and the control of the probe posture run simultaneously. In the first stage, the mechanical arm moves the ultrasonic probe along the planned path at a high speed of 20 cm/s toward a position 10 cm above the skin. After the movement ends, the structured light camera acquires a frame and the infrared distance measuring sensors are started to check whether the probe is within 10-15 cm of the skin to be scanned; if not, the probe continues at 10 cm/s, and if so, it decelerates to 2 cm/s and enters the next stage. In the second stage, the infrared distance measuring sensors monitor the probe-skin distance in real time and feed it back continuously; when the distance reaches 1 cm, the speed is reduced to 0.5 cm/s and the pressure sensor data is monitored. When the contact pressures at the two ends of the pressure sensor are 5-8 N with a difference of less than 1 N, or the infrared distance measuring sensors indicate that the probe has pressed 1 cm into the skin, the third, scanning, stage begins.
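A rough sketch of the staged speed schedule, using only the speeds and distance bands quoted above; the function name and the simplified stage tests are assumptions, not the patent's control law:

```python
def approach_speed(distance_cm, contact_force_n=None):
    """Speed schedule sketched from the staged approach described above.
    distance_cm: estimated distance from the skin; returns speed in cm/s."""
    if contact_force_n is not None and contact_force_n >= 5.0:
        return 0.5            # stage 3: scan at 0.5 cm/s once contact is made
    if distance_cm > 15.0:
        return 10.0           # stage 1: not yet inside the 10-15 cm band
    if distance_cm > 1.0:
        return 2.0            # stage 2: decelerate inside the band
    return 0.5                # final approach below 1 cm
```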
In the third stage, the scanning task is completed at 0.5 cm/s under joint monitoring by the pressure sensors and the structured light camera. If the pressure becomes unbalanced during scanning, the ultrasonic probe is rotated to rebalance it; if the pressure at both ends of the probe is very small, or the infrared distance measuring sensors show that the two pressure sensor ends are not in contact while the pressing depth is sufficient, the structured light camera guides the scan to continue. Throughout this process the probe axis is kept collinear with the normal of the skin pixels, ensuring that the probe always presses perpendicularly on the skin. After scanning finishes, the probe returns to its initial position at 20 cm/s, completing the ultrasonic scanning task.
Also in the third stage, the mechanical arm is controlled by a PID algorithm to adjust the posture of the ultrasonic probe. A rectangular coordinate system is defined on the front face of the probe by the right-hand rule, and adding the emission direction of the ultrasonic generating part yields a three-dimensional coordinate system. If the pressure sensor readings satisfy F1, F2 < m = 0.05 N, the probe is considered not in contact with the skin. From the distances measured by the four infrared distance measuring sensors, the distance between the probe and the skin is calculated and the probe posture is adjusted in the three-dimensional coordinates, as follows: compute the mean of the four infrared readings and the difference between each reading and the mean; deflect in the corresponding directions in descending order of the absolute differences, using the real-time difference as the input to a PID controller that adjusts the rotation speed until every absolute difference is below dL = 1 cm. The distance between the probe and the surface of the tissue to be scanned is then taken as the mean of the four readings, and the PID controller drives the probe downward until it touches the skin and the pressure sensors begin transmitting a signal.
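The four-sensor levelling step can be illustrated with a proportional-only controller (the patent specifies PID; a P term alone is used here for brevity). The function name and the gain `kp` are hypothetical:

```python
def tilt_corrections(d, threshold_cm=1.0, kp=0.5):
    """Compare each IR reading with the mean of the four and command a
    deflection (here simply kp * error) for every sensor whose deviation
    exceeds dL = 1 cm, largest deviation first. d: four distances in cm."""
    mean = sum(d) / len(d)
    errors = [(i, di - mean) for i, di in enumerate(d)]
    errors.sort(key=lambda e: abs(e[1]), reverse=True)  # largest deviation first
    return [(i, -kp * e) for i, e in errors if abs(e) > threshold_cm]
```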
If the pressure sensor readings satisfy F1, F2 > u = 0.5 N, contact has begun. If F1, F2 < 2 N or F1, F2 > M = 4 N, the probe is pressed against the skin too loosely or too tightly, respectively, and the PID control strategy moves the probe toward or away from the skin. If 2 N < F1, F2 < 4 N, the contact force is considered appropriate; the difference dF = F1 - F2 is computed, and if dF > 0.3 N the angle about the Y axis is adjusted with PID control so that the X axis of the probe is parallel to the skin surface. The difference dL of the infrared distance measurements along the Y direction is also computed, and if dL exceeds 1 cm the angle about the X axis is adjusted with the PID controller. Finally, the ultrasonic probe applies a force of suitable magnitude on the skin while standing exactly perpendicular to the scanned tissue surface, and image acquisition is performed.
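The pressure thresholds above (m = 0.05 N, u = 0.5 N, the 2-4 N band, dF > 0.3 N) can be collected into a small classifier; the state labels and function name are invented for illustration:

```python
def contact_state(f1, f2):
    """Classify probe-skin contact from the two end pressures (newtons),
    following the thresholds quoted in the description."""
    if f1 < 0.05 and f2 < 0.05:
        return "no_contact"   # below m = 0.05 N: not touching the skin
    if f1 < 2.0 or f2 < 2.0:
        return "too_loose"    # below the 2 N floor: press closer
    if f1 > 4.0 or f2 > 4.0:
        return "too_tight"    # above M = 4 N: back away
    if abs(f1 - f2) > 0.3:
        return "rotate_y"     # dF > 0.3 N: level the probe about Y
    return "ok"
```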
Exemplary method
The embodiment provides a method for monitoring the posture of an ultrasonic probe, which is specifically shown in fig. 3, and the method includes the following steps:
and S100, acquiring the position information of the ultrasonic probe relative to the part to be scanned.
In applying the invention, the ultrasonic detection and the posture adjustment of the probe can proceed simultaneously, but preparation is required beforehand: a structured light camera on the ultrasonic probe first collects image samples of the region to be scanned, and the samples are used to train a target detection network according to the algorithm. During formal ultrasonic scanning, the structured light camera scans the part to be scanned together with the ultrasonic probe and feeds its images into the target detection network, so that the position information of the part can be obtained from the network's output.
In one implementation, as shown in fig. 4, the step S100 specifically includes the following steps:
s101, acquiring an image of a region to be scanned through a preset structured light camera, wherein the structured light camera and the ultrasonic probe are fixed together;
s102, according to the image of the area to be scanned, determining the position information of the ultrasonic probe relative to the part to be scanned.
In a specific implementation, as shown in figs. 1 and 2, the structured light camera and the ultrasonic probe are fixed together. Before formal scanning, the structured light camera is used to sample images and train the target detection network: an image of the region to be scanned is acquired in advance by transmitting and receiving infrared light with the structured light camera, the depth image is registered with the camera's RGB image to obtain the color value of each depth pixel, and the image is expanded to a six-channel RGB-XYZ map. The position of the part to be scanned within the region is determined from the color information and labeled to build a training set, comprising image frames cut in the same proportion from images of the region at different scanning stages. MobileNet is used as the base network and trained on this set to obtain the target detection network.
When the ultrasonic scanning is formally performed, the ultrasonic probe scans a part to be scanned, meanwhile, an image of a region to be scanned is acquired through a preset structured light camera, and according to the image of the region to be scanned, the position information of the ultrasonic probe relative to the part to be scanned can be determined.
For example, the target detection network adopts the currently most advanced one-stage target detection method, SSD (Single Shot MultiBox Detector), with MobileNet as the base network, which uses fewer computing resources while preserving accuracy. An input picture first passes through the MobileNet backbone to extract a first large-scale feature map, then through successive feature extraction steps to obtain feature maps of different sizes, and every scale participates in detection. Large-scale feature maps are sensitive to small objects and small-scale feature maps to large objects, so this feature pyramid network structure can detect objects of different sizes. In the target detection task the training samples are prior boxes, and positive and negative samples are highly unbalanced, so the SSD network subsamples the negatives: they are sorted in descending order of confidence error (the smaller the predicted background confidence, the larger the error) and the top-k with the largest errors are selected as training negatives, keeping the ratio of positive to negative samples close to 1:3. The loss function is defined as a weighted sum of the localization error (loc) and the confidence error (conf):

L(x, c, l, g) = (1/N) · (L_conf(x, c) + α · L_loc(x, l, g))

where N is the number of matched (positive) prior boxes; x^p_ij ∈ {1, 0} is an indicator parameter such that x^p_ij = 1 means the i-th prior box matches the j-th ground truth, whose category is p; c is the category confidence prediction; l is the predicted location of the bounding box corresponding to the prior box; and g is the location parameter of the ground truth.
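Hard negative mining as described (descending confidence error, roughly 1:3 positives to negatives) might be sketched as follows; the array and function names are assumptions for illustration:

```python
import numpy as np

def hard_negative_mine(conf_bg, positive_mask, neg_pos_ratio=3):
    """Rank negative prior boxes by descending confidence error (a low
    predicted background confidence means a large error) and keep the
    top-k so that negatives:positives is about neg_pos_ratio:1.
    conf_bg: predicted background confidence per prior box."""
    pos_idx = np.flatnonzero(positive_mask)
    neg_idx = np.flatnonzero(~positive_mask)
    k = min(len(neg_idx), neg_pos_ratio * len(pos_idx))
    # Smallest background confidence first == largest confidence error.
    order = np.argsort(conf_bg[neg_idx])
    return pos_idx, neg_idx[order[:k]]
```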
In practice the target network must be trained on the training set. To acquire the training data, the mechanical arm moves the structured light camera several times along the planned trajectory; the camera performs several fixed-plan scans, the image of the region to be scanned is registered with the camera's RGB image to obtain the depth of each pixel, and the camera video is recorded. The video is reviewed, the position of the skin is judged manually from the depth information, the ground truth is drawn by hand, the features of the probe and the human body surface are marked, and labels are produced. To keep the training set evenly distributed, video frames are sampled and cut in the same proportion at the different scanning stages, and the cut frames serve as the data set. The data set is then amplified by horizontal flipping, random cropping, color distortion and random block sampling to obtain 3000 pictures, which are stored on a workstation to complete training.
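The listed augmentations (horizontal flip, random crop, color distortion) might look like this in outline; the 90% crop size and the gain range are arbitrary illustrative choices, not from the patent:

```python
import numpy as np

def augment(img, rng):
    """Apply a random horizontal flip, a random 90% crop, and a random
    brightness gain to one (H, W, 3) uint8 image."""
    out = img
    if rng.random() < 0.5:
        out = out[:, ::-1, :]                          # horizontal flip
    h, w = out.shape[:2]
    ch, cw = int(h * 0.9), int(w * 0.9)
    y = rng.integers(0, h - ch + 1)
    x = rng.integers(0, w - cw + 1)
    out = out[y:y + ch, x:x + cw, :]                   # random crop
    gain = rng.uniform(0.8, 1.2)                       # simple color distortion
    return np.clip(out * gain, 0, 255).astype(np.uint8)
```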
When formal scanning is carried out, an image scanned by the structured light camera is input to a target detection network, and position information of the ultrasonic probe relative to a part to be scanned is obtained.
And S200, determining the distance between the ultrasonic probe and the part to be scanned according to the position information.
After the system acquires the position information of the ultrasonic probe relative to the part to be scanned, it starts the infrared distance measuring sensors arranged on the probe; the system controls the sensors and determines the distance between the probe and the part to be scanned.
In one implementation, as shown in fig. 5, the step S200 specifically includes the following steps:
s201, starting four infrared distance measuring sensors preset on the ultrasonic probe according to the position information, wherein the four infrared distance measuring sensors are symmetrically arranged;
s202, measuring the distance between the ultrasonic probe and the part to be scanned through the infrared distance measuring sensor.
In specific implementation, as shown in fig. 1 and 2, the ultrasonic probe is provided with four infrared distance measuring sensors, which are symmetrically arranged around the ultrasonic probe.
According to the obtained position information, the system starts the infrared distance measuring sensors preset on the ultrasonic probe. Each sensor emits infrared rays and receives the reflected analog signal; after the analog signal is converted into a digital signal, the system computes the distance between the ultrasonic probe and the part to be scanned from the physical characteristics of the light and the received data.
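The analog-to-digital conversion and distance computation can be sketched for a typical reflective IR ranger, whose output voltage falls off with distance and is commonly fitted with a power law. The coefficients, reference voltage, and ADC resolution below are illustrative assumptions, not values from the patent:

```python
def ir_distance(adc_value, v_ref=3.3, adc_max=1023, a=27.86, b=-1.15):
    """Convert an IR ranger's digitized analog reading into a distance.

    Uses the common empirical fit d = a * v**b for reflective IR sensors;
    a, b, v_ref and adc_max are assumed calibration values.
    """
    v = adc_value * v_ref / adc_max   # digital reading back to volts
    if v <= 0:
        raise ValueError("no reflected signal received")
    return a * v ** b                 # distance in the fit's units (e.g. cm)
```

With b negative, a stronger reflection (higher voltage) maps to a shorter distance, matching the sensor's physical behavior.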
And S300, determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance to obtain the posture of the ultrasonic probe, and adjusting the motion state of the ultrasonic probe.
Finally, the system performs a logic judgment on the obtained distance data to determine the fitting state between the ultrasonic probe and the part to be scanned. At the same time, a pressure sensor mounted on the ultrasonic probe detects the pressure between the probe and the part to be scanned in real time. From the detected pressure and the fitting state, the system derives the posture of the ultrasonic probe and, according to the result, adjusts the motion state of the probe and thereby its posture.
In one implementation, as shown in fig. 6, the step S300 specifically includes the following steps:
s301, determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance;
s302, acquiring pressure data of the ultrasonic probe according to the attaching state, wherein the pressure data is obtained based on a pressure sensor arranged on the ultrasonic probe;
s303, obtaining the posture of the ultrasonic probe according to the pressure data, and adjusting the posture of the ultrasonic probe.
In a specific implementation, after the system acquires the distance data from the previous stage, it performs a logic judgment on that data and determines the fitting state between the ultrasonic probe and the part to be scanned from the result. According to the fitting state, pressure data of the ultrasonic probe are additionally acquired from the pressure sensors mounted on the probe; as shown in figs. 1 and 2, two pressure sensors are provided. The posture of the ultrasonic probe is then obtained from the pressure data. The system continuously evaluates this posture and controls the mechanical arm, which moves along the computed trajectory and thereby drives the ultrasonic probe to adjust its posture, so that the probe is fitted to the part to be scanned neither too loosely nor too tightly. Even during the whole dynamic scanning movement, the ultrasonic probe can adaptively adjust itself to fit the part to be scanned in the optimal posture, which improves the quality of ultrasonic scanning imaging and avoids adverse reactions in patients.
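The combined logic judgment on the four distances and the pressure reading can be sketched as below. The thresholds, the return codes, and the tilt computation are assumptions for illustration; the patent leaves the concrete decision rules unspecified:

```python
import numpy as np

def assess_posture(distances, pressure, d_target=5.0, d_tol=1.0,
                   p_min=1.0, p_max=4.0):
    """Judge the probe's fit from four symmetric IR distances and the pressure.

    Returns (state, tilt), where tilt holds the distance imbalance between the
    two opposite sensor pairs -- a signal a robot-arm controller could use to
    level the probe. All thresholds are assumed example values.
    """
    d = np.asarray(distances, dtype=float)
    # Opposite-pair imbalance: nonzero means the probe face is tilted.
    tilt = np.array([d[0] - d[2], d[1] - d[3]])
    if d.mean() > d_target + d_tol:
        return "too_loose", tilt    # probe hovering too far above the skin
    if pressure > p_max:
        return "too_tight", tilt    # excessive pressure, risk of discomfort
    if pressure < p_min:
        return "press_more", tilt   # in range but not firmly coupled
    return "ok", tilt

state, tilt = assess_posture([5.0, 5.2, 5.1, 4.9], pressure=2.0)
```

The returned state and tilt vector would feed the arm controller's trajectory adjustment described above.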
In summary, the structured light camera, the infrared distance measuring sensors and the pressure sensors are mounted on the ultrasonic probe. Together these sensors obtain the position information, the distance and the fitting state between the ultrasonic probe and the part to be scanned; the system performs a logic judgment on the obtained data and controls the ultrasonic probe to adjust its motion state accordingly, so that the probe is neither fitted too loosely to the part to be scanned nor pressed against it too hard, which improves the quality of ultrasonic scanning imaging and avoids adverse reactions in patients.
Exemplary device
As shown in figs. 1 and 2, an embodiment of the present invention provides an apparatus for monitoring the posture of an ultrasound probe. The apparatus may use any of the methods for monitoring the posture of an ultrasound probe described above, and any of those methods may likewise be applied to the apparatus.
The device comprises: an ultrasonic probe 50; a structured light camera 10 disposed on the ultrasonic probe 50; and an infrared distance measuring sensor 20 provided on the ultrasonic probe 50. The structured light camera 10 is fixed to the ultrasonic probe 50 and acquires an image of the region to be scanned; the infrared distance measuring sensor 20 measures the distance between the ultrasonic probe 50 and the part to be scanned.
In one implementation, four infrared distance measuring sensors 20 are provided and symmetrically arranged on the ultrasonic probe 50.
In one implementation, the ultrasonic probe is further provided with two pressure sensors 30, which acquire pressure data between the ultrasonic probe 50 and the part to be scanned.
In one implementation, the ultrasonic probe 50 is further connected to a mechanical arm, and the mechanical arm is connected to the pressure sensor 30 and the infrared distance measuring sensor 20, so as to control the mechanical arm to adjust the posture of the ultrasonic probe through the feedback of the pressure sensor 30 and the infrared distance measuring sensor 20.
Other implementations are set forth in the exemplary method and will not be described here.
In summary, the invention discloses a method and a device for monitoring the posture of an ultrasonic probe. The monitoring method comprises: acquiring position information of the ultrasonic probe relative to a part to be scanned; determining the distance between the ultrasonic probe and the part to be scanned according to the position information; and determining the fitting state between the ultrasonic probe and the part to be scanned according to the distance, obtaining the posture of the ultrasonic probe, and adjusting the motion state of the ultrasonic probe. In the invention, the ultrasonic probe scans the part to be scanned while the structured light camera on the probe feeds an image of the part into a preset target detection network for image processing. The target detection network is obtained by training: before formal ultrasonic scanning, sampled images of the part to be scanned are processed and used to train the network according to the algorithm. The detection network processes the image to obtain the position information of the part to be scanned; at the same time, the infrared distance measuring sensors on the ultrasonic probe measure the distance to the part, and the pressure sensors on the probe obtain pressure data by detecting the pressure between the probe and the part. All the sensors together collect detection data in real time and transmit it to the system, which performs real-time logic judgment and controls the mechanical arm accordingly. The mechanical arm drives the ultrasonic probe along the computed trajectory and adjusts its posture, so that the probe stays fitted to the part to be scanned in the optimal posture throughout the scan, which improves the quality of ultrasonic scanning imaging and avoids adverse reactions in patients.
Finally, it should be noted that the above examples are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the respective technical solutions of the present invention.
Claims (10)
1. A method of attitude monitoring of an ultrasound probe, the method comprising:
acquiring position information of an ultrasonic probe relative to a part to be scanned;
determining the distance between the ultrasonic probe and the part to be scanned according to the position information;
and determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance to obtain the posture of the ultrasonic probe, and adjusting the motion state of the ultrasonic probe.
2. The method for monitoring the posture of the ultrasonic probe according to claim 1, wherein the acquiring the position information of the ultrasonic probe relative to the part to be scanned comprises:
acquiring an image of a region to be scanned by a preset structured light camera, wherein the structured light camera is fixed with the ultrasonic probe;
and determining the position information of the ultrasonic probe relative to the part to be scanned according to the image of the region to be scanned.
3. The method for monitoring the posture of the ultrasonic probe according to claim 2, wherein the determining the position information of the ultrasonic probe relative to the part to be scanned according to the image of the region to be scanned comprises:
inputting an image of a region to be scanned into a preset target detection network, and determining the position of a part to be scanned in the region to be scanned;
and determining the position information of the ultrasonic probe relative to the part to be scanned according to the position of the part to be scanned.
4. The method of claim 3, wherein the target detection network is constructed in a manner comprising:
obtaining an image of the region to be scanned in advance by emitting and receiving infrared rays with a structured light camera, and registering the image of the region to be scanned with the camera's RGB (red, green and blue) image to obtain the color information of each pixel on the depth map; expanding the original three-channel image into a six-channel RGB-XYZ image;
determining a pixel area of the part to be scanned in the region to be scanned according to the color information, and applying labels to construct a training set, wherein the training set comprises image frames cut in advance, in the same proportion, from images of the region to be scanned at different scanning stages;
and using the MobileNet as a basic network, and using the training set to train the MobileNet to obtain the target detection network.
5. The method for monitoring the posture of the ultrasonic probe according to claim 1, wherein the determining the distance between the ultrasonic probe and the part to be scanned according to the position information comprises:
according to the position information, starting up four infrared distance measuring sensors preset on the ultrasonic probe, wherein the four infrared distance measuring sensors are symmetrically arranged;
and measuring the distance between the ultrasonic probe and the part to be scanned through the infrared distance measuring sensor.
6. The method for monitoring the posture of the ultrasonic probe according to claim 1, wherein the step of determining the fitting state between the ultrasonic probe and the part to be scanned according to the distance to obtain the posture of the ultrasonic probe and adjusting the motion state of the ultrasonic probe comprises the steps of:
determining the attaching state between the ultrasonic probe and the part to be scanned according to the distance;
acquiring pressure data of the ultrasonic probe according to the attaching state, wherein the pressure data is obtained based on a pressure sensor arranged on the ultrasonic probe;
and obtaining the posture of the ultrasonic probe according to the pressure data, and adjusting the posture of the ultrasonic probe.
7. An attitude monitoring apparatus of an ultrasonic probe, characterized in that the apparatus comprises: an ultrasonic probe; a structured light camera disposed on the ultrasonic probe; and an infrared distance measuring sensor arranged on the ultrasonic probe; the structured light camera is fixed with the ultrasonic probe and is used for acquiring an image of a region to be scanned; the infrared distance measuring sensor is used for measuring the distance between the ultrasonic probe and the part to be scanned.
8. The apparatus according to claim 7, wherein four infrared distance measuring sensors are provided and symmetrically disposed on the ultrasonic probe.
9. The apparatus for monitoring the posture of the ultrasonic probe according to claim 8, wherein a pressure sensor is further provided on the ultrasonic probe, and the pressure sensor is used for acquiring pressure data between the ultrasonic probe and the part to be scanned.
10. The apparatus according to claim 9, wherein the ultrasonic probe is further connected to a robot arm, and the robot arm is connected to the pressure sensor and the infrared distance sensor, so that the robot arm is controlled to adjust the posture of the ultrasonic probe by the feedback of the pressure sensor and the infrared distance sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011528675.5A CN112472133B (en) | 2020-12-22 | 2020-12-22 | Posture monitoring method and device for ultrasonic probe |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112472133A true CN112472133A (en) | 2021-03-12 |
CN112472133B CN112472133B (en) | 2024-07-09 |
Family
ID=74915325
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011528675.5A Active CN112472133B (en) | 2020-12-22 | 2020-12-22 | Posture monitoring method and device for ultrasonic probe |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112472133B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112998759A (en) * | 2021-04-06 | 2021-06-22 | 无锡海斯凯尔医学技术有限公司 | Tissue elasticity detection method, device and system |
CN113093193A (en) * | 2021-04-06 | 2021-07-09 | 无锡海斯凯尔医学技术有限公司 | Ultrasonic signal triggering method, device and system |
CN113616945A (en) * | 2021-08-13 | 2021-11-09 | 湖北美睦恩医疗设备有限公司 | Detection method based on focused ultrasound image identification and beauty and body care device |
CN113759001A (en) * | 2021-09-24 | 2021-12-07 | 成都汇声科技有限公司 | Method for obtaining and processing ultrasound data |
CN113951932A (en) * | 2021-11-30 | 2022-01-21 | 上海深至信息科技有限公司 | Scanning method and device for ultrasonic equipment |
CN114041828A (en) * | 2022-01-13 | 2022-02-15 | 深圳瀚维智能医疗科技有限公司 | Ultrasonic scanning control method, robot and storage medium |
CN114748101A (en) * | 2022-06-15 | 2022-07-15 | 深圳瀚维智能医疗科技有限公司 | Ultrasonic scanning control method, system and computer readable storage medium |
CN115177210A (en) * | 2022-07-05 | 2022-10-14 | 重庆医科大学 | Photoacoustic tomography system and method |
WO2024090190A1 (en) * | 2022-10-26 | 2024-05-02 | ソニーグループ株式会社 | Ultrasonic inspection device, inspection method, and program |
CN118299039A (en) * | 2024-04-12 | 2024-07-05 | 东莞索诺星科技有限公司 | Three-dimensional imaging and interaction method of ultrasonic detector |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040254460A1 (en) * | 2001-09-11 | 2004-12-16 | Burcher Michael Richard | Method and apparatus for ultrasound examination |
US20120101388A1 (en) * | 2010-10-20 | 2012-04-26 | Gaurav Tripathi | System for Locating Anatomical Objects in Ultrasound Imaging |
CN102551804A (en) * | 2011-12-31 | 2012-07-11 | 重庆海扶(Hifu)技术有限公司 | Ultrasonic treatment apparatus monitoring system capable of reducing image artifacts and image acquisition method |
US20130296707A1 (en) * | 2010-12-18 | 2013-11-07 | Massachusetts Institute Of Technology | User interface for ultrasound scanning system |
US20150297177A1 (en) * | 2014-04-17 | 2015-10-22 | The Johns Hopkins University | Robot assisted ultrasound system |
CN108095761A (en) * | 2012-03-07 | 2018-06-01 | 齐特奥股份有限公司 | Spacial alignment equipment, spacial alignment system and the method for instructing medical procedure |
CN109288541A (en) * | 2018-11-15 | 2019-02-01 | 深圳市比邻星精密技术有限公司 | Robot system and its checking method based on ultrasonic scan |
CN109480906A (en) * | 2018-12-28 | 2019-03-19 | 无锡祥生医疗科技股份有限公司 | Ultrasonic transducer navigation system and supersonic imaging apparatus |
CN109567864A (en) * | 2019-01-23 | 2019-04-05 | 上海浅葱网络技术有限公司 | A kind of orientable ultrasonic probe |
CN110363803A (en) * | 2019-07-18 | 2019-10-22 | 深圳市思锐视科技有限公司 | A kind of object detection method and system of combination depth map slice and neural network |
CN110488745A (en) * | 2019-07-23 | 2019-11-22 | 上海交通大学 | A kind of human body automatic ultrasonic scanning machine people, controller and control method |
CN110755110A (en) * | 2019-11-20 | 2020-02-07 | 浙江伽奈维医疗科技有限公司 | Three-dimensional ultrasonic scanning device and method based on mechanical arm unit |
CN111084638A (en) * | 2020-02-21 | 2020-05-01 | 常州市第二人民医院 | Ultrasonic probe surface pressure detection device |
US20200194117A1 (en) * | 2018-12-13 | 2020-06-18 | University Of Maryland, College Park | Systems, methods, and media for remote trauma assessment |
CN111488857A (en) * | 2020-04-29 | 2020-08-04 | 北京华捷艾米科技有限公司 | Three-dimensional face recognition model training method and device |
CN111652085A (en) * | 2020-05-14 | 2020-09-11 | 东莞理工学院 | Object identification method based on combination of 2D and 3D features |
CN111820917A (en) * | 2020-06-05 | 2020-10-27 | 哈工大机器人(中山)无人装备与人工智能研究院 | Binocular vision blood sampling device and blood sampling robot with same |
CN111950543A (en) * | 2019-05-14 | 2020-11-17 | 北京京东尚科信息技术有限公司 | Target detection method and device |
CN112022346A (en) * | 2020-08-31 | 2020-12-04 | 同济大学 | Control method of full-automatic venipuncture recognition integrated robot |
CN215128942U (en) * | 2020-12-22 | 2021-12-14 | 深圳市德力凯医疗设备股份有限公司 | Posture monitoring device of ultrasonic probe |
Also Published As
Publication number | Publication date |
---|---|
CN112472133B (en) | 2024-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112472133A (en) | Posture monitoring method and device for ultrasonic probe | |
CN112288742B (en) | Navigation method and device for ultrasonic probe, storage medium and electronic equipment | |
US12078479B2 (en) | Dual-resolution 3D scanner and method of using | |
KR100871595B1 (en) | A system for measuring flying information of globe-shaped object using the high speed camera | |
US20230042756A1 (en) | Autonomous mobile grabbing method for mechanical arm based on visual-haptic fusion under complex illumination condition | |
US6445814B2 (en) | Three-dimensional information processing apparatus and method | |
EP2185077B1 (en) | Ultrasonic diagnostic imaging system and control method thereof | |
EP3653989B1 (en) | Imaging device and monitoring device | |
US20080301072A1 (en) | Robot simulation apparatus | |
JPH09187038A (en) | Three-dimensional shape extract device | |
CN101949689B (en) | Optical coherence tomography system correction method | |
EP3513738B1 (en) | Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device | |
WO2023272372A1 (en) | Method for recognizing posture of human body parts to be detected based on photogrammetry | |
CN101842053A (en) | Apparatus and method for medical scanning | |
CN106125066B (en) | The control system and control method of laser radar | |
CN114533111A (en) | Three-dimensional ultrasonic reconstruction system based on inertial navigation system | |
CN107680065A (en) | Radiation image bearing calibration and means for correcting and correction system | |
EP4209312A1 (en) | Error detection method and robot system based on association identification | |
CN215128942U (en) | Posture monitoring device of ultrasonic probe | |
CN112656442A (en) | Ultrasonic probe pressure detection device and pressure detection method | |
JP7428814B2 (en) | Ultrasonic diagnostic device and method of controlling the ultrasonic diagnostic device | |
CN207601853U (en) | Radiation image corrects system | |
JPH0973543A (en) | Moving object recognition method/device | |
KR102615722B1 (en) | Ultrasound scanner and method of guiding aim | |
CN113837385B (en) | Data processing method, device, equipment, medium and product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |