CN110786887B - Mammary gland ultrasonic screening method, device and system - Google Patents
- Publication number
- CN110786887B (application CN201911007859.4A)
- Authority
- CN
- China
- Prior art keywords
- scanning
- breast
- area
- point cloud
- ultrasonic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0825—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the breast, e.g. mammography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Physiology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
The invention discloses a breast ultrasound screening method comprising the following steps: acquiring a depth image of the user's chest region; performing model reconstruction from the depth image to obtain a three-dimensional structure model of the region to be scanned, and generating a scanning trajectory for the ultrasound probe from that model; generating motion control code from the scanning trajectory and feeding it into a scanning mechanism, which drives the ultrasound probe to perform an ultrasound scan of the user's breast region; and analyzing the acquired ultrasound images to generate a diagnosis. By means of automation and artificial-intelligence techniques, the breast ultrasound screening method of the invention makes low-cost, large-scale population screening for breast cancer possible, can greatly increase the proportion of screening-age women in China who take part in breast cancer screening, and contributes to the prevention and control of the disease.
Description
Technical Field
The invention relates to the technical field of ultrasound diagnosis, and in particular to a breast ultrasound screening method, device and system.
Background
Breast cancer is a growing threat to women's health worldwide. According to the 2018 global cancer statistics report, breast cancer has overtaken lung cancer, long the most common human cancer, and is now the most common cancer among women. In its early stage breast cancer progresses slowly, so the window for screening is ample and can last up to ten years; a woman who is screened once a year can essentially keep the disease at bay. Early breast cancer is carcinoma in situ: it requires neither radiotherapy nor chemotherapy, direct intervention succeeds at a very high rate, and the five-year survival rate of patients can exceed 95 percent.
In 2009 China began promoting early breast cancer screening nationwide. To date, however, the annual volume of early breast cancer screening in China remains very limited, and coverage is unevenly distributed across regions. Why has population screening for breast cancer not become widespread in China? Chiefly because primary-care physicians and equipment are in short supply. Ultrasound is widely acknowledged to be well suited to breast cancer screening, and the Chinese breast cancer screening guidelines list ultrasound examination among the principal screening means. Under the traditional screening model, therefore, the shortage of physicians, the high cost of ultrasound equipment, and similar factors make the present bottleneck in mass breast cancer screening difficult to relieve.
Disclosure of Invention
The main aim of the invention is to provide a breast ultrasound screening method that addresses the technical problem that existing breast ultrasound screening depends heavily on professional physicians.
In order to achieve the above object, the present invention provides a breast ultrasound screening method, comprising:
acquiring a depth image of a chest region of a user;
carrying out model reconstruction according to the depth image to obtain a three-dimensional structure model of a region to be scanned, and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model;
generating a motion control code according to the scanning track, and inputting the motion control code into a scanning mechanism so as to control the scanning mechanism to drive an ultrasonic probe to perform ultrasonic scanning on the breast area of the user;
and analyzing and processing the acquired ultrasonic image to generate a diagnosis result.
Preferably, before the step of acquiring a depth image of the user's chest region, the method further comprises:
inputting personal information of a user;
and generating a screening serial number according to the personal information of the user and adding the screening serial number into a screening waiting queue.
Preferably, before the step of analyzing and processing the acquired ultrasound images to generate the diagnosis result, the method further comprises:
and performing validity analysis on the acquired ultrasound image, and adjusting the scanning posture of the ultrasound probe according to the result of the validity analysis.
Preferably, the analyzing the validity of the acquired ultrasound image and adjusting the scanning posture of the ultrasound probe according to the result of the validity analysis includes:
dividing the obtained ultrasonic image into a plurality of sub-regions, and calculating the number ratio of black pixel points of each sub-region;
judging whether the corresponding sub-area is an invalid imaging area or not according to the number ratio of the black pixel points;
dividing a left region and a right region by taking a central line of the ultrasonic image as a reference, counting the number of the invalid imaging regions in the left region and the right region respectively, and calculating the area ratio of all the invalid imaging regions in the left region and the right region in the ultrasonic image;
and calculating the pose compensation amount of the ultrasonic probe according to the area ratio so as to adjust the scanning posture of the ultrasonic probe.
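The validity-analysis steps above can be sketched in code. This is a minimal illustration only, not the patent's implementation: the sub-region grid size, the black-pixel threshold, and the compensation gain are all assumed values.

```python
# Hypothetical sketch of the validity analysis; grid size, threshold
# and gain are assumptions, not values taken from the patent.
def black_ratio(region):
    """Fraction of pixels in a sub-region that are black (value 0)."""
    pixels = [p for row in region for p in row]
    return sum(1 for p in pixels if p == 0) / len(pixels)

def analyze_validity(image, rows=2, cols=4, threshold=0.6):
    """Split `image` (a list of pixel rows) into rows x cols sub-regions,
    flag each whose black-pixel ratio exceeds `threshold` as an invalid
    imaging area, and return the invalid-area ratios of the left and
    right halves relative to the image's center line."""
    h, w = len(image), len(image[0])
    rh, cw = h // rows, w // cols
    invalid_left = invalid_right = 0
    for r in range(rows):
        for c in range(cols):
            sub = [row[c * cw:(c + 1) * cw]
                   for row in image[r * rh:(r + 1) * rh]]
            if black_ratio(sub) > threshold:
                # Assign the sub-region to a half by its column center.
                if (c + 0.5) * cw < w / 2:
                    invalid_left += 1
                else:
                    invalid_right += 1
    half = rows * (cols // 2)  # sub-regions per half
    return invalid_left / half, invalid_right / half

def pose_compensation(left_ratio, right_ratio, gain=5.0):
    """Hypothetical pose correction (degrees): tilt the probe toward the
    side with more invalid area so it regains skin contact there."""
    return gain * (right_ratio - left_ratio)
```

For example, an image whose left half is entirely black yields ratios (1.0, 0.0) and hence a negative (leftward) tilt command.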
Preferably, the analyzing the acquired ultrasound image to generate a diagnosis result includes:
inputting the acquired ultrasonic image into an AI diagnostic algorithm model for analysis processing to obtain diagnostic data;
and grading the diagnostic data according to the BI-RADS grading to generate a diagnostic result.
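The grading step can be pictured as mapping a model output onto a BI-RADS category. The patent does not disclose the AI diagnostic model or its thresholds, so the malignancy score and every cut-off below are hypothetical illustrations.

```python
# Illustrative only: the score-to-category cut-offs are hypothetical,
# not taken from the patent or from the BI-RADS standard itself.
def birads_category(malignancy_score):
    """Map a hypothetical lesion malignancy score in [0, 1] to a
    simplified BI-RADS assessment category."""
    if malignancy_score < 0.02:
        return "BI-RADS 2 (benign)"
    if malignancy_score < 0.10:
        return "BI-RADS 3 (probably benign)"
    if malignancy_score < 0.50:
        return "BI-RADS 4 (suspicious)"
    return "BI-RADS 5 (highly suggestive of malignancy)"
```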
Preferably, the analyzing the acquired ultrasound image to generate a diagnosis result further comprises:
and sending the acquired ultrasonic image to a remote diagnosis terminal for analysis processing.
Preferably, the performing model reconstruction according to the depth image to obtain a three-dimensional structure model of the region to be scanned, and generating the scanning track of the ultrasound probe according to the three-dimensional structure model includes:
carrying out coordinate transformation on the point cloud data of the plurality of depth images under different viewing angles to obtain three-dimensional point clouds of the chest region under the same base coordinate system;
segmenting the three-dimensional point cloud of the chest area according to a preset point cloud segmentation algorithm to obtain a point cloud of a breast scanning area;
performing skeleton model reconstruction on the breast area structure according to the breast scanning area point cloud to obtain a curve skeleton;
dividing each curve in the curve skeleton according to a preset curve dividing condition, and taking all dividing points on each curve;
selecting a plurality of groups of segmentation points from the segmentation point set according to a preset ultrasonic scanning direction, and connecting each group of segmentation points into a scanning track curve;
and extracting a plurality of track points from the scanning track curve, and calculating the attitude angle of each track point.
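The final trajectory steps can be illustrated with a toy sketch. The patent does not give the attitude-angle formula, so deriving yaw and pitch from the local tangent of the track curve is an assumption made here for illustration.

```python
import math

def track_points_with_attitude(curve, stride=2):
    """From a scanning-track curve (a list of (x, y, z) points), take
    every `stride`-th point and attach yaw/pitch attitude angles derived
    from the local tangent direction. A simplified sketch: the patent
    does not specify how the attitude angle of each track point is
    computed."""
    out = []
    for i in range(0, len(curve) - 1, stride):
        x0, y0, z0 = curve[i]
        x1, y1, z1 = curve[i + 1]
        dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
        yaw = math.degrees(math.atan2(dy, dx))            # heading in XY
        pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # climb
        out.append(((x0, y0, z0), yaw, pitch))
    return out
```

A flat track along X yields zero yaw and pitch; a track rising in Z produces a positive pitch, which a real system would convert into probe orientation commands.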
To achieve the above object, the present invention further provides a breast ultrasound screening device, comprising:
the image acquisition module is used for acquiring a depth image of the chest area of the user;
the track generation module is used for carrying out model reconstruction according to the depth image so as to obtain a three-dimensional structure model of a region to be scanned and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model;
the scanning control module is used for generating a motion control code according to the scanning track and inputting the motion control code into the scanning mechanism so as to control the scanning mechanism to drive the ultrasonic probe to carry out ultrasonic scanning on the breast area of the user;
and the diagnosis module is used for analyzing and processing the acquired ultrasonic image to generate a diagnosis result.
In order to achieve the above object, the present invention further provides a breast ultrasound screening system, which includes a host, a shooting device, a scanning mechanism and an ultrasound probe, wherein:
the shooting equipment is used for acquiring a depth image of the chest area of the user;
the host is used for carrying out model reconstruction on the depth image to obtain a three-dimensional structure model of a region to be scanned, and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model; the host is also used for generating a motion control code according to the scanning track;
the scanning mechanism is used for receiving the motion control code output by the host and driving the ultrasonic probe to carry out ultrasonic scanning on the breast area of the user according to the motion control code;
the host is further used for analyzing and processing the acquired ultrasonic image to generate a diagnosis result.
Preferably, the breast ultrasound screening system further comprises a user information input device, the user information input device comprises an information input module and a number calling module, wherein:
the information input module is used for inputting personal information of a user;
and the number calling module is used for generating a screening serial number according to the personal information of the user and adding the screening serial number into a screening waiting queue.
Compared with the prior art, the invention sets out a complete scheme for population-level ultrasound screening of the breast, reducing the dependence on professional physicians, lowering the screening cost, and widening the range of application. The method builds full-surface three-dimensional spatial information of the breast region from each user's own breast characteristics, generates a scanning trajectory, and controls the scanning mechanism to drive the ultrasound probe according to motion control code converted from that trajectory. The entire ultrasound scan of the user's breast region is performed by fully automatic mechanical scanning, so the probe can adjust its scanning posture to the shape of the contact area; each acquired ultrasound image frame therefore carries comprehensive and accurate information, allowing the physiological condition of the breast and of its surrounding organs and tissues to be judged comprehensively and accurately. Streamlined operation in this way makes large-scale, mass breast cancer screening possible.
Drawings
FIG. 1 is a schematic diagram of an exemplary environment in which various disclosed embodiments of the invention may be implemented;
FIG. 2 is a block diagram of another exemplary environment in which various disclosed embodiments of the invention may be implemented;
FIG. 3 is a flow chart illustrating the operation of breast ultrasound screening in various implementations of the present disclosure;
FIG. 4 is a schematic diagram of an offline calibration during acquisition of a point cloud of a thoracic region in accordance with various embodiments of the present disclosure;
FIG. 5 is a point cloud at a first viewing angle in various disclosed embodiments;
FIG. 6 is a point cloud at a second viewing angle in various disclosed embodiments;
FIG. 7 is a thoracic region point cloud obtained by coordinate transformation in various embodiments of the present disclosure;
FIG. 8 is a schematic diagram of a point cloud obtained after preprocessing a thoracic region image in various embodiments disclosed herein;
FIG. 9 is a schematic view of a point cloud of a breast scan area obtained by cropping a point cloud of a breast area in various embodiments disclosed herein;
FIG. 10 is a schematic representation of a curved skeleton obtained from reconstruction of a skeleton model in various embodiments disclosed herein;
FIG. 11 is a schematic flow chart diagram of one embodiment of a breast ultrasound screening method of the present invention;
FIG. 12 is a schematic flow chart of another embodiment of the breast ultrasound screening method of the present invention;
FIG. 13 is a schematic flow chart diagram of a breast ultrasound screening method of yet another embodiment of the present invention;
FIG. 14 is a schematic flow chart diagram of a breast ultrasound screening method of yet another embodiment of the present invention;
FIG. 15 is a schematic flow chart diagram of a breast ultrasound screening method of yet another embodiment of the present invention;
FIG. 16 is a functional block diagram of an embodiment of the breast ultrasound screening apparatus of the present invention;
FIG. 17 is a block diagram of a computing device in which various embodiments of the present disclosure can be implemented.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
To solve the above technical problem, the present invention provides a breast ultrasound screening system. As shown in fig. 1, it mainly comprises a host (not shown), a shooting device 30, a scanning mechanism 10 and an ultrasound probe 13. In this embodiment the system further includes a horizontally placed screening platform 20, so that ultrasound screening is performed with the user lying flat; in other embodiments screening may be performed in an upright posture, in which case the screening platform 20 can be omitted. The host may be an industrial personal computer or other suitable computing equipment. In the hardware configuration of this embodiment the host acts as the upper computer and the scanning mechanism 10, communicatively connected to it, acts as the lower computer; the two may be connected over a TCP/IP communication protocol. The screening platform 20 may be a fixed support structure or a movable structure that allows position adjustment, for example via a lifting mechanism that adjusts the height of its support surface or a horizontal moving mechanism that adjusts its horizontal position. The lifting and horizontal moving mechanisms may be hydraulic devices, or motor-driven screw or rack-and-pinion drives, so that the user's initial position can be adjusted without the user having to move her body.
The shooting device 30 is arranged above the screening platform 20 so as to acquire depth images (including three-dimensional point cloud data) more comprehensively; in this embodiment a depth image comprises an RGB image plus point cloud data. Two sets of shooting devices 30 may be configured as guided by the structure shown in fig. 1. In this example the shooting devices 30 are arranged with the transverse direction of the user's body as the reference direction; in other embodiments arranging them along the longitudinal direction of the body is equally workable. The shooting device 30 of this embodiment may be a structured-light sensor, or of course a laser radar. Alternatively, a shooting device 30 may be mounted on a moving mechanism that changes its shooting angle, reducing the number of devices required; in the minimal case a single shooting device 30 suffices, moving along a set circular path to capture point cloud images from several angles. Figs. 5 and 6 show point cloud images of the chest region acquired from two different angles in this way.
The scanning mechanism 10 mainly comprises a control device 11 and a mechanical arm 12 communicatively connected to it, with the ultrasound probe 13 mounted at the arm's execution end. In this embodiment the control device 11 carries the hardware needed for communication, data processing and motion control, and the mechanical arm 12 is configured as a multi-axis structure providing three linear degrees of freedom and at least two rotational degrees of freedom, ensuring that the ultrasound probe 13 can adapt its posture to the surface shape of the region to be scanned. In practice the mechanical arm 12 may be a five-axis or six-axis arm.
In another embodiment, as shown in fig. 2, the scanning mechanism 10 'acquires ultrasound images of the left and right breasts of a user by using ultrasound probes 13' mounted on two mechanical arms 12 ', wherein each of the two mechanical arms 12' has at least three degrees of freedom in three directions perpendicular to each other. The two robot arms 12' are driven by the linear motion mechanism to move in the up-down (i.e., Z-axis), front-back (i.e., Y-axis) and left-right (i.e., X-axis) directions.
The two mechanical arms 12 'are both arranged on a support frame (not shown) through a linear motion mechanism, and the two mechanical arms 12' are arranged in a hoisting state, so that the mechanical arms 12 'can drive the ultrasonic probe 13' to move conveniently. In particular, the invention realizes that the two mechanical arms 12 'do not interfere with each other in the respective movement process during the movement process of the two mechanical arms 12'.
The linear motion mechanism comprises two first linear guide rails 121' arranged along the X axis, two second linear guide rails 122' along the Y axis, and two third linear guide rails 123' along the Z axis. The two first rails 121' are mounted horizontally on the support frame at an interval; the two second rails 122' are mounted on the first rails 121' through sliders that slide along them; the two third rails 123' are respectively mounted on the two second rails 122' through sliders that slide along them; and the two mechanical arms 12' are respectively connected to the sliders on the two third rails 123'. Because this embodiment uses two mechanical arms 12' that can drive the two ultrasound probes 13' to scan simultaneously, the time for one breast ultrasound screening pass can be greatly shortened.
Specifically, the robot arm 12' of this embodiment comprises a first rotating assembly 124', a second rotating assembly 125' and a clamp. The first rotating assembly 124' is connected to the output end of the linear motion mechanism (i.e., the slider on the third linear guide 123') and drives the second rotating assembly 125' to rotate about the X axis; the second rotating assembly 125' drives the clamp to rotate about the Y axis; the clamp holds the ultrasound probe 13'; and the two rotating assemblies are arranged one above the other. The first and second rotating assemblies 124', 125' may share one structure or use different ones, for example a synchronous-wheel assembly, a rack and pinion, or a direct motor drive.
Further, the breast ultrasound screening system includes a user information input device (not shown) comprising an information input module and a number-calling module. The information input module records the user's personal information; for example, it may be an identity-card reader that reads the card via its RFID chip, so that reading the user's information either creates a new screening account in the database or matches an existing one. In other embodiments, besides such contactless reading, the personal information may be entered manually, for example through a touch display screen presenting an interactive entry interface. The number-calling module generates a screening serial number from the user's personal information and adds it to a screening waiting queue; screening users in turn in this way keeps the work orderly.
In addition, a user terminal can access the breast ultrasound screening system over a wireless network (WI-FI, 4G, 5G, or another widely used wireless channel). For example, the terminal may establish a communication connection with the system's data processing center by following a "breast screening" official account in WeChat (the account name is merely an example); or the terminal may run an APP provided by the breast ultrasound screening service provider and connect to the data processing center when the APP starts. Through the official account or APP the user can then receive information from the system, including account information, number-calling notices, ultrasound images and diagnosis results.
As shown in fig. 3, the main flow by which the breast ultrasound screening system of the invention screens a population of users for breast cancer is: collect user information; build the scanning model; scan with ultrasound; analyze the images and diagnose. User information is collected by the user information input device. The scanning model is built by acquiring a depth image at a specific position and processing it with a prescribed algorithm model. Ultrasound scanning feeds the planned scanning trajectory into the scanning mechanism, which drives the ultrasound probe to obtain ultrasound images. Image analysis and diagnosis uses a deep-learning-based algorithm model to process the input ultrasound images and output a diagnosis. Full-surface three-dimensional information of the breast region is built from each user's own breast characteristics, a scanning trajectory is generated, and the scanning mechanism drives the probe according to motion control code converted from that trajectory; the whole scan is thus fully automatic mechanical scanning, the probe adjusts its posture to the shape of the contact area, and every acquired image frame carries comprehensive and accurate information. Streamlined operation in this way makes large-scale, mass breast cancer screening possible.
So far, the application environment of the various embodiments of the present invention and the hardware structure and functions of the related devices have been described in detail, and the structure of the breast ultrasound screening system described above constitutes only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Various embodiments of breast ultrasound screening methods will be described in detail below based on the above application environment and associated equipment.
As shown in fig. 11, the present invention provides a breast ultrasound screening method, comprising:
In step S10, a depth image of the user's chest region is acquired.
The chest region of the (female) user changes shape easily under the influence of posture and external force. To satisfy the technical requirements of the scanning mechanism, the shape of the chest region must therefore be stabilized before the full scanning operation, for example by wearing a mildly elastic compression vest. For this reason, the three-dimensional point cloud data (depth image) generally needs to be re-acquired for every ultrasound scanning procedure so that an accurate three-dimensional structure of the breast surface is obtained. In practice the user lies on the screening platform and adjusts her position until the requirements for point cloud acquisition and ultrasound scanning are met, after which the shooting equipment captures a depth image of the chest region. Several shooting devices may be arranged around the platform to capture breast-region images from different viewing angles simultaneously, or a single device may move around the platform and capture them at different times; either scheme can be chosen to suit the specific structure of the breast ultrasound screening system. To guarantee that the full three-dimensional structure of the breast region is captured, a sufficient number of viewpoints should be used (at least two different views) with adequate overlap between them. The breast-region image in this embodiment may be an RGB-D image.
As shown in fig. 4, after the user lies flat on the screening platform, her position can be adjusted with a laser positioning device (not shown) fitted to the shooting equipment. The device projects a laser cross (orthogonal transverse line C and longitudinal line L); aligning the user's posture with this cross is what guarantees that the point cloud processing algorithm outputs accurate results. In operation, the longitudinal centerline of the user's body is brought into sufficient coincidence with the longitudinal laser line L, while the scanning start line on the upper chest coincides with the transverse laser line C. The scanning start line lies roughly at the clavicle, or a certain distance below it, and can be chosen sensibly according to the subject being scanned.
Because the raw point cloud data covers a wide area, boundary filtering is applied to it to simplify later processing. Collecting three-dimensional point cloud data of the chest region describes its three-dimensional structure accurately, so that the scanning trajectory planning algorithm can later generate a probe motion trajectory that matches the actual scanning contact surface.
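Boundary filtering of the raw cloud can be as simple as an axis-aligned crop. A minimal sketch follows; the actual filter bounds would come from the calibrated scan area, which the patent does not spell out.

```python
def crop_box(points, lo, hi):
    """Boundary filtering sketch: keep only points inside the
    axis-aligned box [lo, hi], discarding the parts of the raw cloud
    outside the chest region of interest."""
    return [p for p in points
            if all(l <= c <= h for c, l, h in zip(p, lo, hi))]
```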
Further, in a preferred embodiment, the breast ultrasound screening method further includes:
and preprocessing each depth image, wherein the preprocessing comprises point cloud downsampling, point cloud filtering, point cloud smoothing and the like.
This step is executed after the depth image is acquired. Preprocessing the three-dimensional point cloud data yields point clouds better suited to the ultrasonic scanning scenario while reducing data complexity and improving the processing efficiency of the equipment. Specifically, the input point cloud is dense and processing all of it is time-consuming, so it is first down-sampled to reduce its density and speed up processing. Intuitively, down-sampling takes one point per spatial interval from the original cloud to represent the other points in its neighborhood, producing a sparser cloud; the down-sampling parameters can be chosen according to the acquisition specification of the shooting equipment and the precision required by later processing, without limitation here. In addition, the point cloud of the chest region should in theory form a smooth, continuous curved surface, but abnormal points (such as isolated discrete points) arise for various reasons; these can be removed by point cloud filtering, which outputs a higher-quality cloud for the subsequent steps. The filtered cloud may still be unsmooth (for example, rippled like water waves) due to sensor measurement error, so it can be further smoothed to make its curved surface more regular.
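The down-sampling step described above can be sketched as a voxel-grid filter: one representative point (here, the centroid) is kept per occupied spatial cell. This is an illustrative NumPy sketch, not the patent's implementation; the function name `voxel_downsample` and the voxel size are assumptions.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel: float) -> np.ndarray:
    """Keep the centroid of each occupied voxel of edge length `voxel`.

    points: (N, 3) array. Returns an (M, 3) array with M <= N.
    """
    # Integer voxel key for every point.
    keys = np.floor(points / voxel).astype(np.int64)
    # Group points by voxel key and average each group.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]
```

The voxel edge length trades point cloud density against processing speed, corresponding to the "certain space distance" mentioned in the text.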
In order to improve the automation degree of breast ultrasound screening, before the step S10, the breast ultrasound screening method further includes:
inputting personal information of a user; and generating a screening serial number according to the personal information of the user and adding the screening serial number into a screening waiting queue.
For example, the user's personal information can be entered through an identity card reader, which reads the card mainly via its RFID chip; from the personal information read, a new screening account can be created in the database, or an existing account can be matched. In other embodiments, besides such contactless reading, the personal information may be entered manually, for example on a touch display screen presenting an interactive entry interface. Queuing users in this way ensures that screening work proceeds in an orderly fashion. In addition, a user terminal can access the breast ultrasound screening system over a wireless network (Wi-Fi, 4G, 5G, etc.): for example, the terminal establishes a communication connection with the data processing center of the system by following a "breast screening" official account in WeChat (the account name is merely an example); or the terminal runs an APP provided by the breast ultrasound screening service provider and connects to the data processing center by starting the APP. Information from the system, including account information, queue-calling information, ultrasound images, and diagnosis results, can then be received through the official account or the APP.
And step S20, performing model reconstruction according to the depth image to obtain a three-dimensional structure model of the region to be scanned, and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model.
This step is the image processing stage that turns the depth images into a scanning trajectory. It mainly comprises model reconstruction, region segmentation, and trajectory planning: model reconstruction converts depth images from several different viewing angles into a unified coordinate system, and region segmentation extracts the point cloud of the region to be scanned from the raw point cloud data for use in subsequent trajectory planning.
As shown in fig. 15, in a preferred embodiment, step S20 specifically includes:
and step S21, performing coordinate transformation on the multiple depth images under different viewing angles to obtain a chest region three-dimensional point cloud under the same base coordinate system.
In this embodiment, calibration parameters of coordinate transformation are calculated in an offline calibration manner, and then the acquired point cloud of the chest area is reconstructed online according to the calibration parameters, so that the point clouds of a plurality of views acquired online are transformed into the same base coordinate system. Specifically, for example, in one case, the 2D image and the 3D point cloud acquired in the "offline calibration" step are from a specific calibration object, such as a calibration plate or other object with abundant texture features, while the 2D image and the 3D point cloud acquired in the "online reconstruction" step are from the chest region of the user to be ultrasonically scanned.
Carrying out feature extraction and feature matching on the depth image to obtain a plurality of matching point pairs:
taking the image data of a calibration object as an example in an off-line calibration link, for example, the calibration object is a calibration plate, surf features are respectively extracted from 2D images of each calibration plate, and surf features of every two 2D images are respectively matched, so that a plurality of 2D matching point pairs are obtained. Here, the 2D image is an RGB image included in the depth image.
In other embodiments, the SURF features described above may be replaced with SIFT or ORB features.
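Feature matching of the calibration images can be illustrated with ORB-style binary descriptors, matched brute-force by Hamming distance with a ratio test. A minimal NumPy sketch, assuming the descriptors are already extracted as `(N, 32)` uint8 arrays (in practice a library such as OpenCV supplies both extraction and matching); the function name is hypothetical.

```python
import numpy as np

def match_binary_descriptors(d1, d2, ratio=0.8):
    """Brute-force Hamming matching with a ratio test.

    d1, d2: (N, 32) and (M, 32) uint8 binary descriptors.
    Returns a list of (i, j) index pairs, one per accepted match.
    """
    # Hamming distance = popcount of XOR, summed over the 32 bytes.
    bits = np.unpackbits(d1[:, None, :] ^ d2[None, :, :], axis=2)
    dist = bits.sum(axis=2)  # (N, M) distance matrix (brute force)
    pairs = []
    for i in range(len(d1)):
        order = np.argsort(dist[i])
        best = order[0]
        second = order[1] if len(order) > 1 else order[0]
        # Accept only if the best match is clearly better than the runner-up.
        if dist[i, best] < ratio * dist[i, second]:
            pairs.append((i, int(best)))
    return pairs
```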
Obtaining a 3D matching point pair according to the 2D matching point pair, and calculating the coordinate transformation of the 3D matching point pair to obtain a transformation matrix of the two 3D point clouds with the overlapped area:
in this embodiment, in order to obtain the corresponding 3D coordinates of each feature point in the three-dimensional point cloud, first, the three-dimensional coordinates X of the feature point on the focal plane of the photographing apparatus are calculated from the pixel coordinates X of the feature point, and the origin of the photographing apparatus is marked as O ═ 001]TAnd then the intersection point of the ray OX and the point cloud is the 3D point corresponding to the feature point. In particular toIn a preferred embodiment, to find the intersection point, all three-dimensional point clouds in the point cloud, which have an included angle with the ray OX smaller than a certain value, are captured and are fitted into a spatial plane, and then the intersection point of the ray OX and the spatial plane is calculated as a 3D point corresponding to the feature point.
After the 3D points corresponding to the feature points are obtained, the 2D matching point pairs can be converted into 3D matching point pairs. Finally, the 3D matching point pairs are input to an ICP (Iterative Closest Point) algorithm to compute the transformation between the two views that share an overlapping area, yielding their transformation matrix; the calibration parameters {H_ij} denote the transformation relationships between the different views, where i and j are positive integers.
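The transformation between two views can be computed from the 3D matching point pairs in closed form; the SVD-based (Kabsch) solution below is the alignment step used inside ICP, shown here as a sketch with an assumed function name.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) with Q ≈ R @ P_i + t,
    computed from matched 3D point pairs via SVD (the closed-form
    alignment step used inside ICP).

    P, Q: (N, 3) arrays of corresponding points.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t
```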
Calculating a full-view transformation matrix according to the transformation matrix:
if only two views are reconstructed, the full-view transformation matrix is the transformation matrix of the two views; if more than two views are reconstructed, the full-view transform matrix may be one of the set of transform matrices, or one of the set of transform matrices, and may be parameter-modified. The full-view transformation matrix is associated with all views used for reconstruction, so that the calibration parameters under the full-coverage base coordinate system can be obtained.
In a specific embodiment, the step of "calculating a full-view transformation matrix according to the transformation matrix" includes:
determining two associated shooting devices according to the transformation matrix to establish a topological connection diagram of the shooting devices:
In this step, a topological connection graph of the shooting devices is established to represent the relationships between interconnected nodes. Specifically, two shooting devices are deemed associated when a valid transformation matrix exists between them, in which case an edge is created between their nodes; the length of each edge is defined as the spatial distance between the shooting devices at its two endpoints (this distance definition is merely a preferred scheme). The resulting collection of interconnected nodes is the topological connection graph of the shooting devices.
Selecting a reference node from the topological connection graph, and calculating the shortest paths from the other nodes to the reference node respectively:
In this step, the reference node may be selected according to the number of views captured by each device, i.e., the node corresponding to the view that occurs most frequently during the pairwise computation of the calibration parameters {H_ij}; alternatively, a node may be designated manually as the reference node in the reconstruction stage. Once the reference node is determined, all paths from each remaining node to the reference node can be computed and the shortest one selected.
Computing a full-view transformation matrix of the end-located node to the reference node along the shortest path:
The transformation matrices accumulated along the shortest path from each end node to the reference node represent the transformation parameters from every view to the base coordinate system; the full-view transformation matrix is thereby obtained.
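The shortest-path computation over the camera topology graph can be sketched with Dijkstra's algorithm; composing the pairwise matrices H_ij along the returned node sequence then yields the full-view transformation. Function names and the edge representation are illustrative.

```python
import heapq

def shortest_paths(edges, source):
    """Dijkstra over the camera topology graph.

    edges: dict mapping (i, j) -> spatial distance between cameras i and j.
    Returns (dist, prev): shortest distances and predecessor map.
    """
    graph = {}
    for (i, j), w in edges.items():
        graph.setdefault(i, []).append((j, w))
        graph.setdefault(j, []).append((i, w))
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def path_to(prev, source, node):
    """Node sequence from `source` to `node`; composing the pairwise
    matrices H_ij along this chain gives the full-view matrix."""
    path = [node]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1]
```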
Transforming the 3D point clouds in all the view angles of the chest area into the same base coordinate system according to the full-view transformation matrix to generate a complete three-dimensional point cloud of the chest area:
by calculating a transformation matrix between every two views and adopting a shortest path algorithm to calibrate parameters (H)ijDetermining full-coverage calibration parameters, so that 3D point clouds in all chest regions can be transformed into the same base coordinate system, and generating three-dimensional point clouds suitable for point cloud segmentation and trajectory planning through a post-processing link, wherein the processing result can refer to an image shown in fig. 7. It should be noted that after calibration parameters capable of covering all views are obtained by performing matrix transformation on a plurality of views, based on a shooting view angle set in "offline calibration", 2D images and 3D point clouds of a user chest region under a plurality of view angles are collected, and the 3D point clouds are transformed into the same base coordinate system.
Further, to reduce accuracy errors introduced by a shooting device at unfavorable shooting angles, this embodiment selects, within views that share an overlapping area, the image region formed by the shooting device with the higher shooting precision. Specifically, in the stage of generating the complete three-dimensional point cloud of the chest region, the overlapping regions where point clouds coincide are determined, and according to the shooting parameters relating the overlapped points to each shooting device, the best points are selected from the multiple devices' clouds for the three-dimensional reconstruction of the overlap. For example, the shooting parameter may be the deflection angle between the optical axis of the shooting device 30 and the target point: by the imaging characteristics, the smaller this deflection angle, the more accurate the spatial information represented by the pixel. The angle between each overlapped point's ray to each shooting device's origin and that device's optical axis is computed, and the best points are selected from the coincident points at the same position according to this angle, to be merged into the point cloud region used for three-dimensional reconstruction. In this way, when redundant points are eliminated, the points carrying accurate position information are retained, and the reconstructed breast surface is more accurate.
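The deflection-angle screening can be sketched as follows: for each overlapped point, compute the angle between each device's optical axis and the ray from that device to the point, and keep the device with the smallest angle. Function names and the camera representation are assumptions.

```python
import numpy as np

def deflection_angle(point, cam_origin, cam_axis):
    """Angle (radians) between the camera's optical axis and the
    ray from the camera origin to the point."""
    ray = point - cam_origin
    cosang = ray @ cam_axis / (np.linalg.norm(ray) * np.linalg.norm(cam_axis))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def pick_best_camera(point, cameras):
    """cameras: list of (origin, optical_axis) pairs. The smaller the
    deflection angle, the more accurate the pixel's spatial information,
    so keep the camera whose axis points most directly at the point."""
    angles = [deflection_angle(point, o, a) for o, a in cameras]
    return int(np.argmin(angles))
```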
Further, after the step of screening the point clouds in the overlapping regions, in the link of "generating a complete three-dimensional point cloud of the breast region", the breast ultrasound screening method further includes:
performing region segmentation on the point cloud in the base coordinate system to obtain a plurality of continuous curved surfaces;
and screening effective point cloud sheets from the curved surface according to preset filtering conditions.
This embodiment mainly filters out the noise point clouds remaining in space. Noise clouds generally occupy small areas, so the remaining points can be divided into several point cloud regions using surface continuity as the dividing condition. The region where the breast lies has the largest area; by computing and comparing the areas of the regions, the region with the largest curved-surface area is kept, i.e., the effective point cloud patches are screened from the several curved surfaces with area as the filtering condition.
After the effective point cloud patches are obtained, and to overcome the problem that the clouds of several views cannot overlap exactly because of calibration errors, structured-light measurement errors, and other factors, this embodiment extracts all transition areas between the patches and applies a point cloud smoothing operation to them, so that the main regions are stitched into one continuous piece. Here, a transition area is a discontinuity between patches; if the data loss there is severe, later data processing is greatly affected.
And step S22, segmenting the chest area three-dimensional point cloud according to a preset point cloud segmentation algorithm to obtain a breast scanning area point cloud.
Through the model reconstruction operation, the depth images under different shooting visual angles are unified into the same coordinate system, so that a foundation is provided for point cloud segmentation in the step. Specifically, the point cloud segmentation algorithm adopted in the step mainly comprises:
deleting the point cloud corresponding to the bed plane area from the three-dimensional point cloud data according to preset conditions to obtain the chest area point cloud; determining the chest upper side segmentation boundary and the center segmentation boundary of the chest area point cloud through offline calibration; taking the bed plane as reference, constructing horizontal tangent planes upward by a preset height increment and, when the points on the current horizontal tangent plane satisfy a preset boundary segmentation condition, fitting them into the axillary side segmentation boundary; constructing a first vertical tangent plane from the chest upper side segmentation boundary, offsetting it by a preset distance in the head-to-foot direction of the human body to obtain a second vertical tangent plane, and fitting the points on the second vertical tangent plane into the chest lower side segmentation boundary; and extracting, for the left and right breasts respectively, the points enclosed by the corresponding chest upper side, center, axillary side and chest lower side segmentation boundaries as the breast scanning area point cloud.
To further improve point cloud processing efficiency and reduce the influence of redundant data, this embodiment may add a 3D region-of-interest cropping stage to the point cloud segmentation algorithm. Since the point cloud acquiring means 30 is fixed and the 3D space in which the person lies on the bed is confined to a limited area, only point cloud data within a certain spatial range need be considered. The 3D region of interest is defined as a 3D cuboid bounding box; specifically, the maximum and minimum coordinate values of the bounding box along the X, Y and Z directions are determined by offline calibration, on the principle that the box contains the bed surface and the human chest region within the range of the screening platform. After the bounding box is calibrated offline, all points inside it are cropped out directly for the subsequent algorithm steps. Fig. 8 shows the result of 3D region-of-interest cropping of the point cloud shown in fig. 7, where region A in fig. 7 is the critical chest region; the cropped result mainly contains the chest region P1 and the bed plane region P2, and the point cloud data are greatly simplified. Note that the point clouds shown in figs. 7 and 8 correspond to the chest region on one side of the body only.
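The region-of-interest cropping described above reduces to an axis-aligned bounding-box test against offline-calibrated extrema; a minimal sketch with an assumed function name:

```python
import numpy as np

def crop_roi(points, lo, hi):
    """Keep only points inside the axis-aligned 3D bounding box [lo, hi],
    calibrated offline to contain the bed surface and chest region.

    points: (N, 3) array; lo, hi: (3,) min/max corner coordinates.
    """
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    return points[mask]
```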
As shown in fig. 8, taking the breast position on one side as an example, a point cloud set including a chest region P1 and a bed plane region P2 is obtained by performing region-of-interest 3D clipping on the point cloud. In this embodiment, the point cloud of the bed plane area P2 needs to be deleted, and the bed plane and the human body surface have significant distinguishing features, that is, the bed plane is a planar area with a large area in the point cloud acquisition space, and the human body surface is a curved area with a large area in the point cloud acquisition space.
Specifically, the preset condition mainly includes two points: the area of the planar region, and whether the planar region lies at the lower part of the whole point cloud. The planar region is separated from the whole cloud and judged against these conditions. In a preferred embodiment, a relevant algorithm in PCL (Point Cloud Library) may be used to identify the points belonging to the planar region (for example, using each point's feature vectors as relevant parameters) and to compute the area of the planar region.
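Where PCL is not available, the bed-plane identification can be sketched as a plain RANSAC plane fit in NumPy: the dominant plane's inliers (the bed) are found and can then be deleted from the cloud. Parameter values and the function name are illustrative.

```python
import numpy as np

def ransac_plane(points, threshold=0.01, iters=200, seed=0):
    """Fit the dominant plane (e.g., the bed surface) by RANSAC.

    Returns a boolean inlier mask over `points`; deleting the inliers
    removes the bed plane from the cloud.
    """
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        # Sample three points and form the candidate plane's normal.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample
        n /= norm
        d = np.abs((points - p0) @ n)  # point-to-plane distances
        mask = d < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask
```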
And after the point cloud corresponding to the bed plane area is deleted, the remaining point cloud comprises a chest area point cloud and a noise point cloud. Then, the remaining point cloud is segmented into several continuous curved surfaces according to continuity.
Further, the noise point clouds remaining in space are filtered out. Noise clouds generally occupy small areas, so the remaining points can be divided into several point cloud regions using surface continuity as the dividing condition. The region where the breast lies has the largest area, so by computing and comparing the areas of the regions, the cloud with the largest curved-surface area is taken as the most significant region, and the points it contains are taken as the chest area point cloud. The chest area point cloud can then be screened further to remove points unusable for later scanning trajectory planning, for example by keeping only points whose vertical distance from the highest point (such as the nipple position) is smaller than a certain value (such as 10 cm), forming the optimized chest area point cloud.
As shown in fig. 9, the tangent planes for the chest upper side segmentation boundary and the center segmentation boundary are fixed and can be calibrated offline: when the point cloud data are collected, the longitudinal centerline of the user's body coincides with the longitudinal laser line L, and the scanning start line on the upper chest coincides with the transverse laser line C. The transverse vertical tangent plane of the chest upper side segmentation boundary and the longitudinal vertical tangent plane of the center segmentation boundary can therefore be determined directly from the offline calibration data.
The axillary side segmentation boundary can be the axillary midline or a position close to it; the exact position can be determined from the movement stroke of the scanning mechanism, and the selected position may vary. In this embodiment, the position of the axillary side segmentation boundary is determined by equidistant slicing. Specifically, with the bed plane as reference (in fig. 9 the coordinate plane of the XY axes coincides with the bed plane), horizontal cutting planes are constructed upward along the Z axis with a certain step length (for example, 0.5 cm). For each constructed horizontal cutting plane, it is judged whether the points on it satisfy a preset boundary segmentation condition; when they do, the upward slicing stops and the points on the current horizontal cutting plane are fitted into the axillary side segmentation boundary. It can be understood that, since the chest area point cloud represents a curved surface, a horizontal tangent plane intersects it in a line, i.e., the points on the horizontal tangent plane are the points on that intersection line.
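The equidistant slicing loop can be sketched as follows, with the normal-angle test abstracted into a caller-supplied `condition` callback (a hypothetical interface, not the patent's):

```python
import numpy as np

def find_axillary_slice(points, condition, step=0.005, band=0.0025):
    """Walk horizontal cutting planes upward from the lowest point in
    `step` increments (metres); return the first slice whose points
    satisfy the boundary condition.

    `condition` stands in for the normal-angle test described in the text.
    `band` is the half-thickness used to collect points near each plane.
    """
    z = points[:, 2]
    height = z.min()
    while height <= z.max():
        sl = points[np.abs(z - height) < band]
        if len(sl) and condition(sl):
            return height, sl
        height += step
    return None, None
```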
In order to reduce the calculation amount of data, a horizontal cutting plane can be constructed from a preset height of a bed plane, the preset height can be specifically selected according to the stature of each user and input into data processing equipment, for example, the preset height is 5-8 cm, and the number of slices is greatly reduced by resetting the initial position for constructing the horizontal cutting plane.
The surface normal of the point cloud on the horizontal tangent plane represents the surface trend of the armpit side surface, so that whether the position of the armpit side surface meets the stroke requirement of a scanning mechanism or not can be evaluated by calculating the surface normal of the point cloud and calculating the included angle between the surface normal and the horizontal tangent plane.
Because there are enough points on the horizontal tangent plane, comparing the mean of their included angles with the preset angle value gives higher accuracy.
The lower segmentation boundary of the breast is determined on the principle that at least the lower boundary of the breast is exceeded, so that the range of the ultrasonic scanning can cover the whole area where the breast is located. Therefore, a first vertical tangent plane is constructed according to the determined chest upper side segmentation boundary, and the first vertical tangent plane is used as a reference to deviate a preset distance in the direction from the head to the feet of the human body, so that a second vertical tangent plane can be obtained. As an implementation manner, the offset distance of the first vertical tangential plane may be set as a plurality of sets of constants, and in practical application, one of the constants may be selected from the database as the offset distance according to information of the user, such as age, height, and weight, for example, the constant may be any selected value in a range of 20-30 cm. After the second vertical tangent plane is obtained, the point cloud intersected with the second vertical tangent plane can be screened from the point cloud of the chest area, and the lower side segmentation boundary of the chest is fitted according to the part of the point cloud.
For the breast on each side, after the corresponding chest upper side segmentation boundary, center segmentation boundary, axillary side segmentation boundary and chest lower side segmentation boundary are obtained, the point cloud of the chest scanning area can be screened by utilizing tangent planes of the four segmentation boundaries, and an accurate point cloud basis is provided for a subsequent scanning track planning algorithm.
In another embodiment of the present invention, a lying pose calibration stage is added to ensure the accuracy of data processing; specifically, the point cloud segmentation algorithm further includes:
and calculating linear equations of the left side and the right side of the chest according to the point cloud of the chest area, determining an angular bisector according to the linear equations of the left side and the right side, and calculating the body width according to the linear equations of the left side and the right side if an included angle formed between the angular bisector and a preset reference line is smaller than a preset value.
The ideal lying posture of the testee is that the body center line is parallel to the bed center line, and when the body center line inclines relative to the bed center line beyond a certain angle, incomplete scanning or accidents occur. Therefore, in order to ensure the scanning safety and obtain a comprehensive and accurate ultrasonic image, whether the pose of the subject meets the requirement needs to be detected, and if the pose does not meet the requirement of sufficient parallelism, the program returns and prompts to adjust the pose. By solving the angular bisector, the actual situation of the lying pose can be evaluated.
For a unilateral chest, the chest area point cloud is first sliced transversely at equal intervals (the spacing is adjustable, e.g., 0.5 cm), giving a series of transverse slices. Then the body-edge extreme point, i.e., the lowest and most lateral point of each slice, is selected: for the case shown in fig. 9 (the left chest) this is the point with the largest Y coordinate and the smallest Z coordinate, while for the right chest it is the point with the smallest Y coordinate and the smallest Z coordinate. Finally, all the extracted points are projected onto the plane of the XY axes and fitted to a straight line, giving the line equation; in this embodiment, RANSAC or least squares may be used for the fitting. Taking the coordinate system of fig. 9 as an example, the preset reference line is parallel to the X axis: if the angle between the bisector and the X axis is small enough, the parallelism check passes (for example, the reference included angle is preset to 0-5°); otherwise, failure is returned.
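The edge-line fitting and parallelism check can be sketched with a least-squares (principal-axis) line fit and the bisector of the two fitted directions; function names and the orientation convention are assumptions.

```python
import numpy as np

def fit_line_direction(points_2d):
    """Least-squares direction of a 2D point set (principal axis via SVD)."""
    c = points_2d - points_2d.mean(axis=0)
    _, _, Vt = np.linalg.svd(c, full_matrices=False)
    d = Vt[0]
    return d if d[0] >= 0 else -d  # orient toward +X

def bisector_angle_to_x(left_pts, right_pts):
    """Angle (radians) between the bisector of the two fitted body-edge
    lines and the X axis; the parallelism check passes when it is small."""
    dl = fit_line_direction(left_pts)
    dr = fit_line_direction(right_pts)
    bis = dl / np.linalg.norm(dl) + dr / np.linalg.norm(dr)
    return float(np.arctan2(abs(bis[1]), abs(bis[0])))
```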
In addition, after the linear equations of the left side and the right side of the chest are obtained, the offset distance of the vertical tangent plane can be calculated according to the two linear equations. Specifically, the size of the preset distance is calculated according to the following formula:
d = max(W_bd · r, d_min)

where W_bd is the body width, r is a proportionality coefficient, and d_min is the minimum scan length.
The body width can be determined from the two line equations, for example by taking the midpoints of the two edge lines and computing the distance between them. The proportionality coefficient may be set according to individual differences between users, or given a common value such as r = 0.7. The minimum scan length is set to prevent an underestimated body width from failing to cover the whole area to be scanned, e.g. d_min = 20 cm, or some suitable value greater than 20 cm. Determining the offset of the first vertical tangent plane by quantitative calculation in this way gives higher accuracy.
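The offset-distance formula is a one-liner; the sketch below uses metres with the example constants from the text (r = 0.7, d_min = 20 cm), and the function name is illustrative.

```python
def offset_distance(body_width, r=0.7, d_min=0.20):
    """d = max(W_bd * r, d_min): offset of the first vertical tangent
    plane, in metres here (d_min = 20 cm)."""
    return max(body_width * r, d_min)
```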
And step S23, performing skeleton model reconstruction on the breast area structure according to the point cloud of the breast scanning area to obtain a curve skeleton.
Through the point cloud segmentation operation, a point cloud area with a smaller range can be obtained, and a track is planned on the basis, so that a more accurate result can be obtained.
The data volume of the acquired three-dimensional point cloud data is huge, the data needs to be reconstructed, and the application requirements of a scanning track planning algorithm are met while the data is simplified. Specifically, the point cloud is sliced according to a preset direction, the direction of the human body is taken as a reference, slicing operation is mainly carried out along the transverse direction and the longitudinal direction of the body, and slicing is carried out in an equidistant mode under an optimal slicing constraint condition, so that sub-point clouds with equal width in a section are obtained, and the width of each sub-point cloud can be flexibly adjusted according to actual conditions. As a possible implementation manner, the ultrasound probe adopts a bar scanning manner, and the bar scanning direction is along the longitudinal direction of the body, so that point cloud slicing is performed along the transverse direction of the body, the scanning manner has low requirements on a motion mechanism, and the quality of an ultrasound image can be ensured.
Transversely slicing the three-dimensional point cloud data to obtain a plurality of sub-point clouds; and performing curve fitting on each section of the sub-point cloud by using a Bezier curve to obtain a curve skeleton.
As shown in fig. 10, the reconstructed curved skeleton is a more stable and reliable representation of the structure of the chest region, which facilitates the post-processing of the algorithm. In this step, the fitting operation of the bezier curve may refer to the detailed description about this aspect in the prior art, which is not described herein again.
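The Bezier fitting of each sub-point-cloud slice can be sketched as a least-squares fit in the Bernstein basis; the degree, the uniform parametrization in t, and the function names are assumptions for illustration.

```python
import numpy as np
from math import comb

def fit_bezier(points, degree=3):
    """Least-squares Bezier fit to an ordered (N, d) point slice;
    returns the (degree+1, d) control points."""
    n = len(points)
    t = np.linspace(0.0, 1.0, n)
    # Bernstein basis: B[k, i] = C(degree, i) * t_k^i * (1 - t_k)^(degree - i)
    B = np.column_stack([comb(degree, i) * t**i * (1 - t)**(degree - i)
                         for i in range(degree + 1)])
    ctrl, *_ = np.linalg.lstsq(B, points, rcond=None)
    return ctrl

def eval_bezier(ctrl, t):
    """Evaluate the fitted Bezier curve at parameter values `t`."""
    degree = len(ctrl) - 1
    t = np.asarray(t, dtype=float)
    B = np.column_stack([comb(degree, i) * t**i * (1 - t)**(degree - i)
                         for i in range(degree + 1)])
    return B @ ctrl
```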
And step S24, segmenting each curve in the curve skeleton according to preset curve segmentation conditions, and taking all segmentation points on each curve.
In this step, taking the selected longitudinal strip scanning mode as an example, each transversely distributed curve is divided into segments of equal arc length, with the segment interval set according to the coverage footprint of the ultrasonic probe, so that the probe covers the complete area to be scanned while the overlapped area is reduced. The division points obtained in this stage are denoted {S_ij, 0 ≤ i < A, 0 ≤ j < B_i}, where A is the number of curves in the curve skeleton, B_i is the number of division points on the i-th curve, and i and j are non-negative integers; the XYZ coordinates of each division point in the motion coordinate system of the ultrasonic probe are obtained by coordinate transformation of the point cloud.
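The equal-arc-length division of a skeleton curve can be sketched by resampling the fitted polyline at fixed arc-length intervals; the spacing corresponds to the probe coverage mentioned above, and the function name is hypothetical.

```python
import numpy as np

def segment_equal_arclength(curve, spacing):
    """Resample an ordered (N, d) polyline at (approximately) equal
    arc-length intervals, returning the division points."""
    seg = np.linalg.norm(np.diff(curve, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative arc length
    targets = np.arange(0.0, s[-1] + 1e-12, spacing)
    out = np.empty((len(targets), curve.shape[1]))
    for d in range(curve.shape[1]):
        out[:, d] = np.interp(targets, s, curve[:, d])
    return out
```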
And step S25, selecting a plurality of groups of segmentation points from the segmentation point set according to the preset ultrasonic scanning direction, and connecting each group of segmentation points into a scanning track curve.
In this step, several groups of segmentation points that can be combined into scanning track curves are selected from the segmentation point set according to the preset ultrasonic scanning direction. Taking longitudinal bar scanning as an example, the simplest grouping mode is to select the segmentation points with the same serial number j on each curve of the curve skeleton as one group, so that a complete trajectory {S_0j, S_1j, S_2j, …, S_(A-1)j} is obtained. Besides the exemplary combination described above, the segmentation points may be grouped in any other suitable manner.
And step S26, extracting a plurality of track points from the scanned track curve, and calculating the attitude angle of each track point.
In this step, as a preferred embodiment, the track points are the aforementioned segmentation points; extracting the track points in this way simplifies data processing. Of course, besides the segmentation points, one or more additional points may be extracted between adjacent segmentation points as track points, in which case the motion parameters of the scanning mechanism need to be taken into account to avoid data redundancy. Taking the extracted segmentation points as track points and a scanning mechanism with five degrees of freedom as an example, the coordinate values and the corresponding attitude angles of each track point need to be obtained, and a track point is expressed as P_i = [X_i, Y_i, Z_i, R_i, P_i], where the five quantities are the XYZ coordinate values of P_i and its Roll and Pitch attitude angles. Since the XYZ coordinate values of P_i are calculated from the point cloud data described above, this step mainly calculates the two attitude angles of the track points. However, if the extracted track points are not the aforementioned segmentation points, the XYZ coordinate values of these unknown track points also need to be calculated. Once the five coordinate quantities of each track point are determined, the motion control program can control the ultrasonic probe to move to a specific position of the area to be scanned according to the XYZ coordinate values, and can control the angular posture to which the ultrasonic probe should be adjusted according to the Roll and Pitch attitude angles, so that the probe surface fits tightly against the surface of the area to be scanned.
In a preferred embodiment, the attitude angle of each track point is calculated mainly by the following algorithm, the specific steps of which include:

extracting a neighborhood point set of the track point, and obtaining the unit direction vector Vz of the track point on the Z axis by performing PCA on the neighborhood point set;

calculating the unit direction vectors Vx and Vy of the track point on the X and Y axes according to the formulas Vy = Vz × [0 0 1]^T and Vx = Vy × Vz;

converting the unit direction vectors of the XYZ coordinate axes of the track point into the representation form of Euler angles, and extracting the attitude angles.
The boundary radius of the neighborhood point set extracted around the track point can be selected according to the expected calculation precision; the range of the neighborhood point set is not limited here. After the extraction range of the neighborhood point set is set, the unit direction vector Vz of the track point on the Z axis can be obtained by performing PCA on the neighborhood point set.
After the unit direction vectors are converted into the representation form of Euler angles, attitude angles in three directions are actually obtained. Which attitude angles are extracted can be decided in combination with the degrees of freedom of motion that the ultrasonic probe can provide; this embodiment takes the extraction of the Roll and Pitch attitude angles as an example.
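The attitude-angle computation might be sketched as follows. The surface normal Vz is taken as the eigenvector of the neighborhood covariance with the smallest eigenvalue, then Vy = Vz × [0 0 1]^T and Vx = Vy × Vz as above; the ZYX Euler convention used to extract Roll and Pitch is an assumption of this sketch.

```python
import numpy as np

def track_point_attitude(neighborhood):
    """Estimate Roll and Pitch (degrees) at a track point from its neighborhood set."""
    centered = neighborhood - neighborhood.mean(axis=0)
    # PCA: the eigenvector with the smallest eigenvalue approximates the surface normal
    eigval, eigvec = np.linalg.eigh(np.cov(centered.T))
    vz = eigvec[:, 0]                      # eigh returns eigenvalues in ascending order
    if vz[2] < 0:                          # orient the normal consistently toward +Z
        vz = -vz
    vy = np.cross(vz, [0.0, 0.0, 1.0])
    if np.linalg.norm(vy) < 1e-9:          # degenerate case: normal parallel to Z
        vy = np.array([0.0, 1.0, 0.0])
    vy /= np.linalg.norm(vy)
    vx = np.cross(vy, vz)
    R = np.column_stack([vx, vy, vz])      # rotation matrix with columns X, Y, Z
    # ZYX Euler extraction (assumed convention): Pitch about Y, Roll about X
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return roll, pitch
```

On a horizontal patch both angles come out as zero, and tilting the patch shows up in Pitch, which is the behaviour the probe-fitting step relies on.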
In addition, after the step of calculating the attitude angle of each track point, each track point needs to be verified, considering that some track points may be located outside the range of motion of the distal end of the ultrasonic probe. The stroke limit data of the end of the ultrasonic probe is imported, and the points that the end of the ultrasonic probe cannot reach are filtered out of the track points according to the stroke limit data.
Usually, the motion limits of the ultrasonic probe can be calibrated in advance and stored in the form of a data table for later use; verifying the track points against this data table avoids equipment accidents during scanning. Meanwhile, after some track points have been filtered out, each scanning track curve is smoothed by filtering, so that the motion of the ultrasonic probe during scanning is smoother and local pressure on the human body is reduced.
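The stroke-limit check and the subsequent smoothing might look like the following sketch; the limit values and the moving-average window (assumed odd) are illustrative stand-ins for the calibrated data table, not values given by the method.

```python
import numpy as np

# hypothetical calibrated travel limits of the probe end (mm), per axis
LIMITS = {"X": (0.0, 400.0), "Y": (0.0, 600.0), "Z": (-50.0, 120.0)}

def filter_reachable(track_pts):
    """Drop track points whose XYZ lies outside the probe end's travel limits."""
    lo = np.array([LIMITS[a][0] for a in "XYZ"])
    hi = np.array([LIMITS[a][1] for a in "XYZ"])
    mask = np.all((track_pts[:, :3] >= lo) & (track_pts[:, :3] <= hi), axis=1)
    return track_pts[mask]

def smooth_track(track_pts, window=5):
    """Moving-average smoothing of each coordinate (window assumed odd),
    softening probe motion after points have been filtered out."""
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(track_pts, ((pad, pad), (0, 0)), mode="edge")
    return np.column_stack([np.convolve(padded[:, k], kernel, mode="valid")
                            for k in range(track_pts.shape[1])])
```

Filtering before smoothing matters: removing an unreachable point leaves a discontinuity in the path, and the smoothing pass is what prevents that discontinuity from translating into a jerky probe motion pressing on the body.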
And step S30, generating a motion control code according to the scanning track, and inputting the motion control code into the scanning mechanism so as to control the scanning mechanism to drive the ultrasonic probe to perform ultrasonic scanning on the breast area of the user.
After the scanning track is determined, the motion control code can be generated from the position information represented by the points on the scanning track; for example, the motion control code adopts the G-code representation. Inputting the motion control code into the scanning mechanism can be realized through a configured multi-axis linkage motion control card, thereby controlling the scanning mechanism to drive the ultrasonic probe to perform ultrasonic scanning on the breast area of the user. The process of converting the coordinate information of the points into motion control code is well known to those skilled in the art and is therefore not described here. The ultrasonic probe is controlled to contact the breast surface with a certain pressure along the planned scanning track, and the posture of the sound wave emitting surface of the ultrasonic probe is adjusted on each scanning path according to the curved-surface characteristics of the breast surface, thereby ensuring that a high-quality ultrasonic image is obtained.
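For illustration, rendering track points as G code might be sketched as follows; mapping Roll and Pitch to rotary A and B axes, and the feed rate, are assumptions of this sketch, and a real system would drive the multi-axis linkage motion control card through its own interface.

```python
def track_to_gcode(track_points, feed=1200):
    """Render [X, Y, Z, Roll, Pitch] track points as G-code G1 moves.
    Roll/Pitch are mapped to the rotary A/B axes (an assumed convention)."""
    lines = ["G21 ; millimetres", "G90 ; absolute coordinates"]
    for x, y, z, roll, pitch in track_points:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} Z{z:.3f} A{roll:.2f} B{pitch:.2f} F{feed}")
    return "\n".join(lines)
```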
In step S40, the acquired ultrasound image is analyzed to generate a diagnosis result.
As shown in fig. 14, the step S40 specifically includes:
step S41, inputting the obtained ultrasonic image into an AI diagnostic algorithm model for analysis processing to obtain diagnostic data;
and step S42, grading the diagnostic data according to the BI-RADS grading to generate a diagnostic result.
Based on deep learning technology, a convolutional neural network (namely, the AI diagnostic algorithm model) is established to analyze and process the ultrasonic images; the convolutional neural network can be obtained by training on samples of various lesions, and random test samples are used to verify its reliability. Rapid lesion detection and tracking adopts a target detection and tracking algorithm based on a convolutional neural network to detect benign/malignant lesion targets in the ultrasonic image in real time and track them. The diagnosis result can be presented in text or image-text form, for example, sent to a user terminal via a WeChat official account, a mini program, an APP, an SMS or MMS message, and the like, so that the user can check it conveniently.
Lesion grading and identification is based on the extracted basic lesion features; a classification algorithm gives a benign/malignant classification or a more detailed grading. The ultrasonic screening results of the user can be classified into different grades according to BI-RADS (Breast Imaging Reporting and Data System), so that a more standard and understandable diagnosis report is provided for the user. The meanings of the grades are as follows:
Category 0: recall is needed; evaluation is made in combination with other examinations;

Category 1: no abnormality found;

Category 2: benign change; regular follow-up (e.g., once a year) is recommended;

Category 3: probably benign, but the follow-up interval should be shortened (e.g., once every 3-6 months);

Category 4: abnormality present; the possibility of a malignant lesion cannot be completely ruled out, and biopsy is required for clarification;

Category 4a: low likelihood of malignancy;

Category 4b: moderate likelihood of malignancy;

Category 4c: high likelihood of malignancy;

Category 5: highly suggestive of a malignant lesion (almost certainly malignant); surgical excision biopsy is required;

Category 6: malignancy already confirmed by pathology.
In this embodiment, the acquired ultrasonic image may be analyzed by a local data processing device such as a host, or sent via a wired/wireless network to a remote data processing device such as a server or a remote diagnosis terminal.
In addition, for better health management, the diagnosis result of each user is stored in a database and associated with the account information, and subsequent screening schedules and breast-related medical information are pushed to the user according to the diagnosis result. In one example, if the user's screening results are BI-RADS 1 and BI-RADS 2, the user's breasts can be considered currently normal, but this does not guarantee that no breast disease will occur subsequently. In this case, 11 months after the examination, the system sends reminder information to the user, reminding the user to undergo breast cancer screening in time the following year.
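The 11-month reminder could be scheduled with calendar-correct month arithmetic, for example as in the sketch below; the helper names are hypothetical, and the clamping rule for short months is an implementation choice.

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Calendar-correct month arithmetic for scheduling the screening reminder."""
    idx = d.month - 1 + months
    year, month = d.year + idx // 12, idx % 12 + 1
    # clamp the day for shorter target months (e.g., Jan 31 + 1 month -> Feb 28/29)
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def reminder_date(exam_date: date) -> date:
    """Per the workflow above: remind 11 months after a BI-RADS 1/2 screening."""
    return add_months(exam_date, 11)
```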
As shown in fig. 12, in order to ensure that the finally output ultrasound image meets the requirements of the analysis diagnosis, before the acquired ultrasound image is subjected to the analysis processing, the breast ultrasound screening method further includes:
and step S50, carrying out effectiveness analysis on the obtained ultrasonic image, and adjusting the scanning posture of the ultrasonic probe according to the result of the effectiveness analysis.
In this step, a validity analysis is mainly performed on the ultrasonic image acquired in real time, where validity refers to whether the ultrasonic image is complete. Image loss is mainly caused by the ultrasonic probe not being in close contact with the breast surface, and an invalid image manifests as large areas of black pixels. The scanning posture of the ultrasonic probe is therefore adjusted by an image evaluation feedback strategy to ensure that the sound wave emitting surface of the ultrasonic probe fits against the breast surface as closely as possible. This embodiment mainly identifies the position of the invalid region in the ultrasonic image and deflects the ultrasonic probe according to that position, thereby realizing the posture adjustment.
As shown in fig. 13, the step S50 specifically includes:
step S51, the acquired ultrasound image is divided into a plurality of sub-regions, and the number ratio of black pixels in each sub-region is calculated.
It should be understood that the pixel value of a black pixel is zero, and the number ratio of black pixels is the ratio of the number of black pixels in a single divided sub-region to the total number of pixels in that sub-region.

Specifically, the ultrasonic image is divided into a plurality of continuously distributed rectangular regions (sub-regions), each containing a certain number of black pixels; counting the black pixels and the total pixels in each rectangular region yields the number ratio of black pixels for that region.
And step S52, judging whether the corresponding sub-area is an invalid imaging area according to the number ratio of the black pixel points.
It is easy to understand that the more black pixels a sub-region contains, the more likely it is to form an invalid imaging region (a black region); when the number ratio of black pixels in a sub-region exceeds a certain threshold, that sub-region can be determined to be an invalid imaging region. Specifically, the threshold of the number ratio of black pixels for an invalid imaging region provided by the invention is 75% to 88%, and whether each sub-region is an invalid imaging region is determined on this basis. For example, suppose the ultrasonic image is divided into continuously distributed rectangular regions of the same size, each containing 15960 pixels; if 12000 to 14000 black pixels are distributed in a rectangular region, that rectangular region is an invalid imaging region.
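The sub-region division and black-pixel thresholding might be sketched as follows, using the lower bound (75%) of the threshold range above; the grid size is an illustrative choice, and the image is assumed to be a 2-D grayscale array.

```python
import numpy as np

def invalid_region_mask(image, rows=6, cols=8, threshold=0.75):
    """Split a grayscale ultrasound image into rows x cols rectangles and flag
    each rectangle whose fraction of black (zero-valued) pixels exceeds the
    threshold as an invalid imaging region."""
    h, w = image.shape
    mask = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            block = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            mask[r, c] = (block == 0).mean() > threshold
    return mask
```

The boolean mask is what the following left/right counting step would consume: columns of the mask left of the image's centre line belong to the left region, the rest to the right region.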
Step S53, dividing left and right regions with the central line of the ultrasound image as a reference, counting the number of the ineffective imaging regions in the left and right regions, respectively, and calculating the area ratio of all the ineffective imaging regions in the left and right regions in the ultrasound image.
In this embodiment, with the center line of the ultrasonic image as a reference, the ultrasonic image is divided into a left region and a right region, the number of invalid imaging regions in each is counted, and the area ratio of each invalid imaging region to the ultrasonic image is calculated. It will be understood that dividing the obtained ultrasonic image into left and right regions serves to locate the invalid imaging areas.
And step S54, calculating the pose compensation amount of the ultrasonic probe according to the area ratio so as to adjust the scanning posture of the ultrasonic probe.
In this embodiment, the pose compensation amount of the ultrasonic probe is calculated from the area ratios of the invalid imaging regions in the left and right regions of the ultrasonic image; it is actually used to compensate the pose of the multi-degree-of-freedom mechanical arm, so that the mechanical arm adjusts the pose of the ultrasonic probe in real time. Specifically, the rotation and/or pressing down of the ultrasonic probe is controlled according to the calculated compensation amount.
It should be noted that, when the area ratio of the ineffective imaging region in the left region is equal to the area ratio of the ineffective imaging region in the right region, it indicates that rotation compensation is not required for the ultrasonic probe, and only the ultrasonic probe needs to be pressed down for compensation. Specifically, the depression compensation amount of the ultrasonic probe can be obtained by multiplying the area ratio of the ineffective imaging region in the left region or the area ratio of the ineffective imaging region in the right region by a preset depression coefficient, and the ultrasonic probe can be attached to the skin of a human body by the depression compensation of the ultrasonic probe, so that the ineffective imaging region in the ultrasonic image is reduced, and the imaging quality of the ultrasonic image is improved.
In the embodiment, the ultrasonic probe is controlled to be pressed down by the preset height to reach the preset position according to the calculated pressing compensation amount, so that the ultrasonic probe can be attached to the surface of the breast, a clear ultrasonic image is obtained, and the accuracy of a diagnosis result is improved.
When the area ratio of the invalid imaging area in the left area is not equal to that in the right area, the difference between the area ratio of the invalid imaging area in the left area and that in the right area is taken, and the difference is multiplied by a preset rotation coefficient to obtain the rotation compensation amount of the ultrasonic probe.
In this embodiment, the ultrasound probe is controlled to rotate by a certain angle according to the rotation compensation amount obtained by calculation, so that the area ratio of the ineffective imaging region in the left region is the same as the area ratio of the ineffective imaging region in the right region. If the area ratio of the ineffective imaging area in the left area is greater than that of the ineffective imaging area in the right area, the ultrasonic probe is controlled to rotate to the left side of the human body by a certain angle; and if the area ratio of the invalid imaging area in the left area is smaller than that in the right area, controlling the ultrasonic probe to rotate to the right side of the human body by a certain angle.
And after the ultrasonic probe rotates by a certain angle according to the rotation compensation amount, calculating the area ratio of all invalid imaging areas in the ultrasonic image in the left area or the right area at the moment so as to be used for calculating the subsequent pressing compensation amount.
And finally, multiplying the area ratio of the invalid imaging area in the left area or the area ratio of the invalid imaging area in the right area after rotation by a preset press-down coefficient to obtain the press-down compensation amount of the ultrasonic probe, and accordingly controlling the press-down of the ultrasonic probe.
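The compensation arithmetic described in the preceding paragraphs might be condensed into the following sketch. The rotation and press-down coefficients are illustrative, and taking the smaller of the two ratios as the post-rotation residual is an assumption of this sketch, standing in for re-measuring the image after the rotation.

```python
def pose_compensation(left_ratio, right_ratio, k_rot=30.0, k_press=8.0):
    """Compute (rotation_deg, press_mm) for the probe from the invalid-area
    ratios of the image's left and right halves. Positive rotation is taken
    as toward the body's left, matching the rule that the probe turns toward
    the side with the larger invalid area."""
    rotation = (left_ratio - right_ratio) * k_rot
    # after rotation the two halves are assumed equalised; press down in
    # proportion to the remaining invalid-area ratio
    residual = min(left_ratio, right_ratio)
    press = residual * k_press
    return rotation, press
```

With equal ratios the rotation term vanishes and only the press-down compensation remains, which reproduces the equal-ratio case described above.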
Therefore, by means of automation and artificial intelligence technology, the breast ultrasonic screening method makes low-cost, large-scale breast cancer screening of the population possible, can greatly increase the proportion of Chinese women of the appropriate age who participate in breast cancer screening, and contributes to the prevention and control of breast cancer.
In addition, the present invention also provides a breast ultrasound screening apparatus, as shown in fig. 16, including:
an image acquisition module 100 for acquiring a depth image of a user's chest region;
the track generation module 200 is configured to perform model reconstruction according to the depth image to obtain a three-dimensional structure model of the region to be scanned, and generate a scanning track of the ultrasonic probe according to the three-dimensional structure model;
the scanning control module 300 is configured to generate a motion control code according to the scanning track, and input the motion control code into the scanning mechanism, so as to control the scanning mechanism to drive the ultrasound probe to perform ultrasound scanning on the breast area of the user;
the diagnosis module 400 is configured to analyze the acquired ultrasound image to generate a diagnosis result.
The modules in the breast ultrasound screening apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a computer device in hardware form, or stored in software form in the memory of a server, so that the computer device can call and execute the operations corresponding to each module. The computer device may be a central processing unit (CPU), a microcomputer device, a single-chip microcomputer, or the like. For the working principle and function of each functional module, reference may be made to the implementation of the breast ultrasound screening method shown in figs. 11-15, which is not repeated here.
The present invention also provides a computer program storage medium having computer program code stored therein, which when executed by a processor, performs the steps of:
acquiring a depth image of a chest region of a user;
performing model reconstruction according to the depth image to obtain a three-dimensional structure model of the region to be scanned, and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model;
generating a motion control code according to the scanning track, and inputting the motion control code into a scanning mechanism so as to control the scanning mechanism to drive an ultrasonic probe to perform ultrasonic scanning on the breast area of the user;
and analyzing and processing the acquired ultrasonic image to generate a diagnosis result.
When being executed by the processor, the computer program further realizes other steps of the breast ultrasound screening method, which can be specifically referred to the description of the above embodiments of the breast ultrasound screening method and will not be described herein again.
The present invention also provides a computer apparatus, as shown in fig. 17, which includes a processor 40, a memory 50 and computer program code stored in the memory 50, wherein the processor 40, when calling the computer program code, implements the steps of a breast ultrasound screening method provided in the above embodiments.
In particular, the computer device may be a personal computer or a server. The computer device includes a processor 40, a memory 50, and a communication interface (not shown) connected by a system bus. The processor 40 is used to provide computing and control capabilities, among other things, to support the operation of the overall computer device. The memory 50 includes a nonvolatile storage medium and an internal memory. The non-volatile storage medium has stored therein an operating system and a computer program that, when executed by the processor 40, implements a breast ultrasound screening method. The internal memory provides an environment for the operating system and the computer program to run in the non-volatile storage medium. The communication interface is used for connecting and communicating with an external server or terminal through a network.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (7)
1. A breast ultrasound screening method, comprising:
acquiring a depth image of the chest region by a photographing device;
carrying out model reconstruction according to the depth image to obtain a three-dimensional structure model of a region to be scanned, and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model;
generating a motion control code according to the scanning track, and inputting the motion control code into a scanning mechanism so as to control the scanning mechanism to drive an ultrasonic probe to perform ultrasonic scanning on the breast area of the user;
analyzing and processing the acquired ultrasonic image to generate an analysis result;
the model reconstruction according to the depth image to obtain a three-dimensional structure model of the region to be scanned, and the generation of the scanning track of the ultrasonic probe according to the three-dimensional structure model comprises the following steps:
carrying out coordinate transformation on the point cloud data of the plurality of depth images under different viewing angles to obtain three-dimensional point clouds of the chest region under the same base coordinate system;
segmenting the three-dimensional point cloud of the chest area according to a preset point cloud segmentation algorithm to obtain a point cloud of a breast scanning area;
performing skeleton model reconstruction on the breast area structure according to the breast scanning area point cloud to obtain a curve skeleton;
dividing each curve in the curve skeleton according to a preset curve dividing condition, and taking all dividing points on each curve;
selecting a plurality of groups of segmentation points from the segmentation point set according to a preset ultrasonic scanning direction, and connecting each group of segmentation points into a scanning track curve;
and extracting a plurality of track points from the scanning track curve, and calculating the attitude angle of each track point.
2. The breast ultrasound screening method of claim 1 wherein prior to the step of acquiring a depth image of the breast region by a capture device, the method further comprises:
inputting personal information of a user;
and generating a screening serial number according to the personal information of the user and adding the screening serial number into a screening waiting queue.
3. The breast ultrasound screening method of claim 1 wherein prior to the step of analytically processing the acquired ultrasound images to generate an analysis result, the method further comprises:
and carrying out effectiveness analysis on the obtained ultrasonic image, and adjusting the scanning posture of the ultrasonic probe according to the result of the effectiveness analysis.
4. The breast ultrasound screening method according to claim 3, wherein the analyzing validity of the acquired ultrasound image and adjusting the scanning posture of the ultrasound probe according to the result of the validity analysis comprises:
dividing the obtained ultrasonic image into a plurality of sub-regions, and calculating the number ratio of black pixel points of each sub-region;
judging whether the corresponding sub-area is an invalid imaging area or not according to the number ratio of the black pixel points;
dividing a left region and a right region by taking a central line of the ultrasonic image as a reference, counting the number of the invalid imaging regions in the left region and the right region respectively, and calculating the area ratio of all the invalid imaging regions in the left region and the right region in the ultrasonic image;
and calculating the pose compensation amount of the ultrasonic probe according to the area ratio so as to adjust the scanning posture of the ultrasonic probe.
5. A breast ultrasound screening device, comprising:
the image acquisition module is used for acquiring a depth image of the chest area of the user;
the track generation module is used for carrying out model reconstruction according to the depth image so as to obtain a three-dimensional structure model of a region to be scanned and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model;
the scanning control module is used for generating a motion control code according to the scanning track and inputting the motion control code into the scanning mechanism so as to control the scanning mechanism to drive the ultrasonic probe to carry out ultrasonic scanning on the breast area of the user;
the diagnosis module is used for analyzing and processing the acquired ultrasonic image to generate a diagnosis result;
the model reconstruction according to the depth image to obtain a three-dimensional structure model of the region to be scanned, and the generation of the scanning track of the ultrasonic probe according to the three-dimensional structure model comprises the following steps:
carrying out coordinate transformation on the point cloud data of the plurality of depth images under different viewing angles to obtain three-dimensional point clouds of the chest region under the same base coordinate system;
segmenting the three-dimensional point cloud of the chest area according to a preset point cloud segmentation algorithm to obtain a point cloud of a breast scanning area;
performing skeleton model reconstruction on the breast area structure according to the breast scanning area point cloud to obtain a curve skeleton;
dividing each curve in the curve skeleton according to a preset curve dividing condition, and taking all dividing points on each curve;
selecting a plurality of groups of segmentation points from the segmentation point set according to a preset ultrasonic scanning direction, and connecting each group of segmentation points into a scanning track curve;
and extracting a plurality of track points from the scanning track curve, and calculating the attitude angle of each track point.
6. The breast ultrasonic screening system is characterized by comprising a host, a shooting device, a scanning mechanism and an ultrasonic probe, wherein:
the shooting equipment is used for acquiring a depth image of the chest area of the user;
the host is used for carrying out model reconstruction on the depth image to obtain a three-dimensional structure model of a region to be scanned, and generating a scanning track of the ultrasonic probe according to the three-dimensional structure model; the host is also used for generating a motion control code according to the scanning track;
the scanning mechanism is used for receiving the motion control code output by the host and driving the ultrasonic probe to carry out ultrasonic scanning on the breast area of the user according to the motion control code;
the host is also used for analyzing and processing the acquired ultrasonic image to generate a diagnosis result;
the model reconstruction of the depth image to obtain a three-dimensional structure model of the region to be scanned, and the generation of the scanning track of the ultrasonic probe according to the three-dimensional structure model comprises the following steps:
carrying out coordinate transformation on the point cloud data of the plurality of depth images under different viewing angles to obtain three-dimensional point clouds of the chest region under the same base coordinate system;
segmenting the three-dimensional point cloud of the chest area according to a preset point cloud segmentation algorithm to obtain a point cloud of a breast scanning area;
performing skeleton model reconstruction on the breast area structure according to the breast scanning area point cloud to obtain a curve skeleton;
dividing each curve in the curve skeleton according to a preset curve dividing condition, and taking all dividing points on each curve;
selecting a plurality of groups of segmentation points from the segmentation point set according to a preset ultrasonic scanning direction, and connecting each group of segmentation points into a scanning track curve;
and extracting a plurality of track points from the scanning track curve, and calculating the attitude angle of each track point.
7. The breast ultrasound screening system of claim 6, further comprising a user information entry device comprising an information entry module and a number calling module, wherein:
the information input module is used for inputting personal information of a user;
and the number calling module is used for generating a screening serial number according to the personal information of the user and adding the screening serial number into a screening waiting queue.
Priority Applications (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911007859.4A (CN110786887B) | 2019-10-22 | 2019-10-22 | Mammary gland ultrasonic screening method, device and system |
| PCT/CN2020/121237 (WO2021078066A1) | 2019-10-22 | 2020-10-15 | Breast ultrasound screening method, apparatus and system |

Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911007859.4A (CN110786887B) | 2019-10-22 | 2019-10-22 | Mammary gland ultrasonic screening method, device and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110786887A (en) | 2020-02-14 |
CN110786887B (en) | 2021-11-26 |
Family
ID=69440928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911007859.4A Active CN110786887B (en) | 2019-10-22 | 2019-10-22 | Mammary gland ultrasonic screening method, device and system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110786887B (en) |
WO (1) | WO2021078066A1 (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110786887B (en) * | 2019-10-22 | 2021-11-26 | 深圳瀚维智能医疗科技有限公司 | Mammary gland ultrasonic screening method, device and system |
CN111528902B (en) * | 2020-04-16 | 2024-08-09 | 深圳瀚维智能医疗科技有限公司 | Automatic mammary gland scanning equipment |
CN111603199B (en) * | 2020-04-24 | 2023-03-14 | 中国人民解放军总医院第二医学中心 | Three-dimensional reconstruction ultrasonic diagnosis system based on body surface positioning measuring instrument |
CN112085698A (en) * | 2020-07-27 | 2020-12-15 | 深圳瀚维智能医疗科技有限公司 | Method and device for automatically analyzing left and right breast ultrasonic images |
CN112419415B (en) * | 2020-12-08 | 2022-06-17 | 浙江德尚韵兴医疗科技有限公司 | Ultrasonic scanning method for realizing pose planning based on CRS curve fitting |
CN112472137B (en) * | 2020-12-15 | 2022-09-16 | 深圳市德力凯医疗设备股份有限公司 | Ultrasonic scanning method and system based on sliding operation track of touch screen |
CN112603342B (en) * | 2020-12-23 | 2022-12-06 | 达影医疗(中山)有限公司 | Compression device, mammary gland X-ray imaging machine and mammary gland detection imaging method |
CN112767237B (en) * | 2020-12-30 | 2024-06-25 | 无锡祥生医疗科技股份有限公司 | Annular pose control method and device based on point cloud data and ultrasonic equipment |
CN112716522A (en) * | 2020-12-30 | 2021-04-30 | 无锡祥生医疗科技股份有限公司 | Probe tail end trajectory tracking method and device, electronic equipment and storage medium |
EP4309588A4 (en) * | 2021-03-17 | 2024-07-31 | Fujifilm Corp | Ultrasonic diagnosis device and method for controlling ultrasonic diagnosis device |
CN113057678A (en) * | 2021-04-09 | 2021-07-02 | 哈尔滨理工大学 | Mammary gland ultrasonic scanning method and system based on binocular vision and robot |
CN113344028A (en) * | 2021-05-10 | 2021-09-03 | 深圳瀚维智能医疗科技有限公司 | Breast ultrasound sequence image classification method and device |
CN113538665B (en) * | 2021-07-21 | 2024-02-02 | 无锡艾米特智能医疗科技有限公司 | Organ three-dimensional image reconstruction compensation method |
CN113558660B (en) * | 2021-08-03 | 2024-09-13 | 深圳瀚维智能医疗科技有限公司 | Ultrasonic scanning assembly and 4D ultrasonic imaging system |
CN113838210A (en) * | 2021-09-10 | 2021-12-24 | 西北工业大学 | Method and device for converting ultrasonic image into 3D model |
CN113925529A (en) * | 2021-10-14 | 2022-01-14 | 武汉库柏特科技有限公司 | Ultrasonic scanning control method, device, equipment and storage medium |
CN113975661B (en) * | 2021-11-16 | 2024-03-01 | 神州医疗科技股份有限公司 | Quality control method, device and system for monitoring treatment equipment and storage medium |
CN114224381B (en) * | 2021-12-16 | 2024-02-27 | 中国人民解放军联勤保障部队北戴河康复疗养中心 | Auxiliary supporting device in ultrasonic inspection and application method thereof |
CN114041828B (en) * | 2022-01-13 | 2022-04-29 | 深圳瀚维智能医疗科技有限公司 | Ultrasonic scanning control method, robot and storage medium |
CN116250863B (en) * | 2023-03-01 | 2023-09-08 | 山东大学 | Multi-mode multi-parameter breast cancer screening intervention system |
CN117258168B (en) * | 2023-10-16 | 2024-03-22 | 广州驰狐科技有限公司 | Dynamic intelligent control method and system for ultrasonic beauty instrument |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080021317A1 (en) * | 2006-07-24 | 2008-01-24 | Siemens Medical Solutions Usa, Inc. | Ultrasound medical imaging with robotic assistance for volume imaging |
CN101097635A (en) * | 2007-08-02 | 2008-01-02 | 郑志豪 | Bank queuing method and management system based on identity identification |
US20110301461A1 (en) * | 2010-06-04 | 2011-12-08 | Doris Nkiruka Anite | Self-administered breast ultrasonic imaging systems |
TWI476403B (en) * | 2011-04-22 | 2015-03-11 | Pai Chi Li | Automated ultrasonic scanning system and scanning method thereof |
WO2013101562A2 (en) * | 2011-12-18 | 2013-07-04 | Metritrack, Llc | Three dimensional mapping display system for diagnostic ultrasound machines |
CN103690191B (en) * | 2013-12-03 | 2016-03-02 | 华南理工大学 | A kind of ultrasonic probe intelligence continuous sweep device and scan method thereof |
CN103908258A (en) * | 2013-12-09 | 2014-07-09 | 天津天视科技有限公司 | Method for measuring volume of dairy cow mammary tissue |
CN103750864B (en) * | 2014-01-13 | 2015-12-02 | 华南理工大学 | A kind of scanning means of ultrasonic elastograph imaging and scan method thereof |
CN104856720B (en) * | 2015-05-07 | 2017-08-08 | 东北电力大学 | A kind of robot assisted ultrasonic scanning system based on RGB D sensors |
CN105997149A (en) * | 2016-06-16 | 2016-10-12 | 深圳市前海安测信息技术有限公司 | Automatic breast screening system and method |
CN106923862B (en) * | 2017-03-17 | 2020-11-27 | 苏州佳世达电通有限公司 | Ultrasonic scanning guide device and ultrasonic scanning guide method |
CN107397557A (en) * | 2017-07-13 | 2017-11-28 | 深圳市前海博志信息技术有限公司 | Breast ultrasound ripple audit report generates system and method |
CN108670305B (en) * | 2018-06-25 | 2024-01-16 | 深圳瀚维智能医疗科技有限公司 | Automatic breast scanning device |
CN209107408U (en) * | 2018-06-25 | 2019-07-16 | 深圳瀚维智能医疗科技有限公司 | Breast automatic scanning device |
CN109637630A (en) * | 2018-11-15 | 2019-04-16 | 上海联影医疗科技有限公司 | Self-service medical imaging system and control method |
CN109674494B (en) * | 2019-01-29 | 2021-09-14 | 深圳瀚维智能医疗科技有限公司 | Ultrasonic scanning real-time control method and device, storage medium and computer equipment |
CN110664438B (en) * | 2019-10-22 | 2021-09-10 | 深圳瀚维智能医疗科技有限公司 | Ultrasonic scanning track planning method and device, storage medium and computer equipment |
CN110675398B (en) * | 2019-10-22 | 2022-05-17 | 深圳瀚维智能医疗科技有限公司 | Mammary gland ultrasonic screening method and device and computer equipment |
CN110786887B (en) * | 2019-10-22 | 2021-11-26 | 深圳瀚维智能医疗科技有限公司 | Mammary gland ultrasonic screening method, device and system |
2019
- 2019-10-22: CN application CN201911007859.4A granted as patent CN110786887B (status: Active)
2020
- 2020-10-15: WO application PCT/CN2020/121237 published as WO2021078066A1 (Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2021078066A1 (en) | 2021-04-29 |
CN110786887A (en) | 2020-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110786887B (en) | Mammary gland ultrasonic screening method, device and system | |
CN110675398B (en) | Mammary gland ultrasonic screening method and device and computer equipment | |
WO2021078064A1 (en) | Ultrasonic scanning track planning method and apparatus, and storage medium and computer device | |
CN110751719B (en) | Breast three-dimensional point cloud reconstruction method, device, storage medium and computer equipment | |
US8634622B2 (en) | Computer-aided detection of regions of interest in tomographic breast imagery | |
EP2961324B1 (en) | Systems and methods for ultrasound imaging | |
CN107170031B (en) | Thoracic tomography with flexible compression paddle | |
EP2212859B1 (en) | Method and apparatus for volume rendering of data sets | |
EP1718206B1 (en) | Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements | |
CN110969245B (en) | Target detection model training method and device for medical image | |
US10105120B2 (en) | Methods of, and apparatuses for, producing augmented images of a spine | |
CN110766704A (en) | Breast point cloud segmentation method, device, storage medium and computer equipment | |
CN105027163A (en) | Scan region determining apparatus | |
CN105678746A (en) | Positioning method and apparatus for the liver scope in medical image | |
US10548564B2 (en) | System and method for ultrasound imaging of regions containing bone structure | |
CN107106128B (en) | Ultrasound imaging apparatus and method for segmenting an anatomical target | |
CN116869652B (en) | Surgical robot based on ultrasonic image and electronic skin and positioning method thereof | |
CN110738633A (en) | organism tissue three-dimensional image processing method and related equipment | |
CN112386282B (en) | Ultrasonic automatic volume scanning imaging method and system | |
CN116777893A (en) | Segmentation and identification method based on characteristic nodules of breast ultrasound transverse and longitudinal sections | |
CN116258736A (en) | System and method for segmenting an image | |
EP3381010B1 (en) | Process for processing medical images of a face for recognition of facial dysmorphisms | |
CN112515705B (en) | Method and system for projection profile enabled computer-aided detection | |
US20220130048A1 (en) | System and method for estimating motion of target inside tissue based on surface deformation of soft tissue | |
JP2019118694A (en) | Medical image generation apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||