CN114782537A - Human carotid artery positioning method and device based on 3D vision - Google Patents
- Publication number: CN114782537A
- Application number: CN202210527495.8A
- Authority: CN (China)
- Legal status: Pending (as listed by Google Patents; the status is an assumption, not a legal conclusion)
Classifications
- G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
- A61B 8/06 — Diagnosis using ultrasonic waves; measuring blood flow
- A61B 8/0891 — Detecting organic movements or changes for diagnosis of blood vessels
- A61B 8/40 — Positioning of patients, e.g. means for holding or immobilising parts of the patient's body
- A61B 8/5261 — Data or image processing for diagnosis; combining images from different diagnostic modalities, e.g. ultrasound and X-ray
- A61B 8/54 — Control of the diagnostic device
- G06F 18/24 — Pattern recognition; classification techniques
- G06T 7/0012 — Biomedical image inspection
- G06T 7/50 — Depth or shape recovery
- G06T 2207/10024 — Color image
- G06T 2207/10132 — Ultrasound image
- G06T 2207/30004 — Biomedical image processing
Abstract
A human carotid artery positioning method and device based on 3D vision, relating to the technical field of intelligent medical treatment. The method comprises: acquiring ultrasound image data; detecting the ultrasound image data by deep learning to generate characteristic point parameters of specific positions of the human body; determining the projected position of the carotid artery from these characteristic point parameters using a radial basis function; and projecting the position of the carotid artery in the ultrasound image onto the neck surface to guide an autonomous ultrasound scan. Features are identified from the face and neck images captured by a 3D camera, and the projected location of the carotid artery is modeled from these features; this projection can serve as the initial planned path for robotic autonomous ultrasound scanning, guiding automated autonomous ultrasound scanning of the carotid artery.
Description
Technical Field
The disclosure relates to the technical field of intelligent medical treatment, in particular to a human carotid artery positioning method and device based on 3D vision.
Background
In medical imaging systems such as magnetic resonance imaging (MRI) systems or computed tomography (CT) systems, a 3D camera is sometimes needed to assist in acquiring auxiliary information such as the body-position information of a patient. A 3D camera typically comprises a two-dimensional color (RGB) camera and a depth camera.
When installing the 3D camera, it is generally mounted directly above the patient bed, so that it can detect both the patient and the bed and thereby capture the patient's state and related information to the maximum extent.
Disclosure of Invention
The purpose of the present disclosure is to provide a human carotid artery positioning method and device based on 3D vision, for use in robotic ultrasound scanning of the carotid artery.
The first aspect of the present disclosure provides a human carotid artery positioning method based on 3D vision, including:
acquiring ultrasonic image data;
detecting ultrasonic image data by utilizing deep learning to generate characteristic point parameters of a specific position of a human body;
determining the projection position of the carotid artery by adopting a radial basis function according to the characteristic point parameters of the specific position of the human body;
the position of the carotid artery in the ultrasound image is projected to the neck surface to guide the autonomous ultrasound scan.
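The four steps above can be read as a single pipeline. The sketch below is purely illustrative: the stage functions are hypothetical stand-ins for the deep-learning detector and the radial-basis-function model described in the embodiments, not the patented implementation.

```python
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]

def locate_carotid(
    acquire: Callable[[], Dict],                        # step 1: acquire aligned image data
    detect_features: Callable[[Dict], List[Point]],     # step 2: feature points (deep learning)
    project_rbf: Callable[[List[Point]], List[Point]],  # step 3: RBF projection on the color map
    to_surface: Callable[[List[Point], Dict], list],    # step 4: lift pixels to the neck surface
) -> list:
    """Chain the four method steps into one scan-planning call."""
    data = acquire()
    feats = detect_features(data)
    pixels = project_rbf(feats)
    return to_surface(pixels, data)
```

Each stage is swappable, which mirrors the device decomposition (acquisition, generation, determining, guiding modules) given below.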
The present disclosure provides in a second aspect a human carotid artery positioning device based on 3D vision, comprising:
an acquisition module for acquiring ultrasound image data;
the generation module is used for detecting ultrasonic image data by utilizing deep learning to generate characteristic point parameters of a specific position of a human body;
the determining module is used for determining the projection position of the carotid artery by adopting a radial basis function according to the characteristic point parameter of the specific position of the human body;
and the guiding module is used for projecting the position of the carotid artery in the ultrasonic image to the surface of the neck so as to guide the autonomous ultrasonic scanning.
A third aspect of the present disclosure provides an electronic device, comprising: a memory and one or more processors;
the memory to store one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the 3D-vision-based human carotid artery positioning method provided in any of the embodiments.
A fourth aspect of the present disclosure provides a storage medium containing computer-executable instructions which, when executed by a computer processor, implement the 3D-vision-based human carotid artery positioning method provided in any of the embodiments.
The present disclosure provides a human carotid artery positioning method and device based on 3D vision. Features are identified from the face and neck images captured by a 3D camera, and the projected position of the carotid artery is modeled from these features; this projection can serve as the initial planned path for robotic autonomous ultrasound scanning, guiding automated autonomous ultrasound scanning of the carotid artery.
Drawings
FIG. 1 is a flow chart of a method for locating a human carotid artery based on 3D vision in an embodiment of the present disclosure;
FIG. 2 is another flow chart of a method for locating a human carotid artery based on 3D vision in an embodiment of the present disclosure;
FIG. 3 is another flow chart of a method for locating a carotid artery of a human body based on 3D vision in an embodiment of the present disclosure;
FIG. 4 is a schematic view of the feature points in FIG. 1;
FIG. 5 is a schematic view of a human carotid artery positioning device based on 3D vision in an embodiment of the present disclosure;
FIG. 6 is a schematic view of a human carotid artery positioning device based on 3D vision in an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Because a 3D camera cannot directly sense arterial blood vessels, features must be identified from the face and neck images captured by the 3D camera, and the projected position of the carotid artery modeled from these features; this projection can then serve as the initial planned path for robotic autonomous ultrasound scanning, guiding the automated autonomous ultrasound scanning of the carotid artery. The method therefore comprises two stages: feature identification and projection-position modeling.
As shown in fig. 1, an embodiment of the present disclosure provides a method for positioning a human carotid artery based on 3D vision, including:
s101, obtaining ultrasonic image data;
the ultrasound image data includes: aligned color map data and depth map data;
the acquiring of ultrasound image data includes: photographing with a 3D camera to acquire aligned color image data and depth image data.
In the embodiment of the disclosure, a fixedly mounted 3D camera captures RGB-D color images and depth images covering the complete face and neck regions, and the projected position of the carotid artery on the neck surface is calculated from these images.
S102, detecting ultrasonic image data by utilizing deep learning, and generating characteristic point parameters of a specific position of a human body, wherein the specific position of the human body in the embodiment of the disclosure comprises the following steps: at least one of a left chin profile, a right chin profile, a left neck profile, and a right neck profile.
As shown in fig. 2 and 4, detecting ultrasonic image data by using deep learning, and generating the characteristic point parameters of the specific position of the human body includes:
s201, detecting characteristic points of a left lower jaw contour, a right lower jaw contour, a left neck contour and a right neck contour by utilizing deep learning, and generating corresponding fitting curves;
s202, uniformly sampling each fitted curve according to the set sampling number;
S203, calculating the characteristic point parameters from the set sampling number N, and computing the corresponding sampling point coordinates for i from 0 to N−1 according to the following formula to complete the sampling; the characteristic point parameters include the sampling step length and the sampling point coordinates:
x_i = x_0 + i·t
y_i = L(x_i)
where t is the sampling step length and L(x) is the fitted curve;
s204, performing curve fitting on the extracted characteristic points by adopting a Lagrange interpolation method according to the characteristic point parameters to generate corresponding fitting curves; or,
and S205, if any one of the four curves of the left lower jaw contour, the right lower jaw contour, the left neck contour and the right neck contour cannot be successfully extracted in the step S201, marking all coordinates of sampling points on the curve which fails to be extracted as a preset numerical value. For example, if there is a large angle deflection of the head, it will cause the failure of feature point extraction on a part of the curve, and the coordinates of the sampling points on the curve with failed extraction are all marked as (-1, -1).
Curve fitting of the extracted characteristic points by the Lagrange interpolation method, generating the corresponding fitted curve, uses the following formula:
L(x) = Σ_{i=0}^{k−1} y_i · l_i(x), with l_i(x) = Π_{j=0, j≠i}^{k−1} (x − x_j) / (x_i − x_j)
where L(x) denotes the analytical expression of the fitted curve, l_i(x) is one basis term of L(x), (x_i, y_i) are the coordinates of a feature point (x_i the abscissa, y_i the ordinate), and k is the number of feature points.
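As an illustration only, the Lagrange curve fitting and uniform sampling of steps S201–S205 can be sketched in Python as follows; the function names are assumptions, and the sentinel value (-1, -1) mirrors the extraction-failure marking described above.

```python
import numpy as np

def lagrange_curve(xs, ys):
    """Return L(x) = sum_i y_i * l_i(x), the Lagrange interpolating
    polynomial through the detected feature points (x_i, y_i)."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)

    def L(x):
        total = 0.0
        for i in range(len(xs)):
            # basis term l_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j)
            mask = np.arange(len(xs)) != i
            li = np.prod((x - xs[mask]) / (xs[i] - xs[mask]))
            total += ys[i] * li
        return total
    return L

def sample_curve(xs, ys, n_samples):
    """Uniformly sample the fitted curve: x_i = x_0 + i*t, y_i = L(x_i).
    If contour extraction failed (no feature points, e.g. due to a
    large-angle head deflection), return sentinel points (-1, -1)."""
    if len(xs) == 0:
        return [(-1.0, -1.0)] * n_samples
    L = lagrange_curve(xs, ys)
    t = (max(xs) - min(xs)) / (n_samples - 1)  # sampling step length
    return [(min(xs) + i * t, L(min(xs) + i * t)) for i in range(n_samples)]
```

In practice high-degree Lagrange interpolation oscillates, so the sampling number and feature-point count would be kept small, as the fixed per-contour sampling in the embodiment suggests.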
S103, determining the projection position of the carotid artery by adopting a radial basis function according to the characteristic point parameters of the specific position of the human body;
and S104, projecting the position of the carotid artery in the ultrasonic image to the surface of the neck to guide autonomous ultrasonic scanning.
Projecting the carotid artery position onto the neck surface to guide the autonomous ultrasound scan uses the following pinhole-camera equations:
X = (u − c_x) · D(u, v) / f_x
Y = (v − c_y) · D(u, v) / f_y
Z = D(u, v)
where u and v are the horizontal and vertical coordinates of the projected position point in the picture, D is the depth map data, f_x and f_y are the focal lengths of the 3D camera in the x and y directions respectively, and c_x and c_y are the offsets of the optical axis from the center of the projection-plane coordinates.
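A minimal sketch of this back-projection, assuming the depth map D is aligned with the color image and indexed as D[v, u]; the intrinsic parameters fx, fy, cx, cy used in the test are illustrative values, not taken from the patent.

```python
import numpy as np

def backproject(u, v, D, fx, fy, cx, cy):
    """Lift pixel (u, v) of the aligned depth map D to a 3D point
    (X, Y, Z) in the camera frame via the pinhole model:
    X = (u - cx) * d / fx,  Y = (v - cy) * d / fy,  Z = d = D[v, u]."""
    d = float(D[v, u])
    return ((u - cx) * d / fx, (v - cy) * d / fy, d)
```

Applying this to every pixel of the carotid projection region yields the 3D scan path on the neck surface.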
As shown in fig. 3, the determining the projection position of the carotid artery by using the radial basis function according to the characteristic point parameter of the specific position of the human body includes:
s301, calculating the probability distribution of the projection positions of the carotid artery on a color map according to the characteristic point parameters of the specific positions of the human body;
The probability distribution is modeled with radial basis functions:
p(x | F) = Σ_{i=1}^{n} λ_i · exp(−‖x − μ_i‖² / (2σ_i²))
where F represents the set of feature points, n is the number of feature points, μ_i is the center of the i-th radial basis function and is determined by the coordinates of the feature points, and σ_i and λ_i are the variance and coefficient of the radial basis function, which are the parameters to be trained.
S302, according to the projection position probability distribution, (or parameters of the model are obtained through pre-calculation by utilizing a maximum likelihood estimation model and stored), specifically, training data { X ] are collected and labeled in advancei,FiContains different deflection postures, FiRepresenting the set of extracted feature points, XiIs the set of picture projection location points for the corresponding carotid artery. Estimating model parameters lambda and sigma;
S303, extracting the region where the probability reaches the set threshold, together with the preset region, according to the characteristic point parameters of the specific positions of the human body and the parameters of the pre-constructed maximum likelihood estimation model, so as to obtain the projection position of the carotid artery on the color map.
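As a hedged illustration of steps S301–S303, the sketch below uses a Gaussian radial-basis mixture for the probability map and a simple threshold filter; the exact basis function and the maximum-likelihood training procedure are not fully specified in the text, so this form is an assumption.

```python
import numpy as np

def carotid_probability(x, centers, sigmas, lams):
    """Illustrative Gaussian radial-basis model of the carotid projection:
    p(x) = sum_i lam_i * exp(-||x - mu_i||^2 / (2 * sigma_i^2)).
    centers (mu_i) come from the detected feature points; sigmas and lams
    stand in for the variance/coefficient parameters the patent estimates
    offline by maximum likelihood."""
    x = np.asarray(x, dtype=float)
    centers = np.asarray(centers, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    lams = np.asarray(lams, dtype=float)
    d2 = np.sum((centers - x) ** 2, axis=1)  # squared distance to each center
    return float(np.sum(lams * np.exp(-d2 / (2.0 * sigmas ** 2))))

def candidate_region(pixels, centers, sigmas, lams, threshold):
    """Step S303 analogue: keep only pixels whose modeled probability
    reaches the set threshold."""
    return [p for p in pixels
            if carotid_probability(p, centers, sigmas, lams) >= threshold]
```

The surviving pixels form the carotid projection region on the color map, which is then lifted to the neck surface via the depth map.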
In summary, a human carotid artery positioning method based on 3D vision is provided. The position of the carotid artery is modeled from facial and neck features, which overcomes the inability of a 3D camera to sense the carotid artery directly, avoids the manual path planning and searching otherwise needed before autonomous ultrasound scanning, and automates the whole process.
As shown in fig. 5 and 6, a human carotid artery positioning device 600 based on 3D vision in the embodiment of the present disclosure includes:
an obtaining module 601, configured to obtain ultrasound image data;
a generating module 602, configured to detect ultrasound image data by using deep learning, and generate a parameter of a feature point at a specific position of a human body;
the determining module 603 is configured to determine the projection position of the carotid artery by using the radial basis function according to the characteristic point parameter of the specific position of the human body;
a guidance module 604 for projecting the position of the carotid artery in the ultrasound image to the neck surface to guide the autonomous ultrasound scan.
The ultrasound image data includes: aligned color map data and depth map data;
the obtaining module 601 is configured to take a picture with a 3D camera to obtain aligned color image data and depth image data.
The specific position of the human body comprises: at least one of a left chin profile, a right chin profile, a left neck profile, and a right neck profile;
the generating module 602 is configured to detect feature points of the left lower jaw contour, the right lower jaw contour, the left neck contour, and the right neck contour by using deep learning, and generate corresponding fitting curves;
uniformly sampling each fitted curve according to the set sampling number;
calculating the characteristic point parameters according to the sampling number to finish sampling; the characteristic point parameters include: sampling step length and sampling point coordinates;
and performing curve fitting on the extracted characteristic points by adopting a Lagrange interpolation method according to the characteristic point parameters to generate corresponding fitting curves.
Curve fitting of the extracted characteristic points by the Lagrange interpolation method, generating the corresponding fitted curve, uses the following formula:
L(x) = Σ_{i=0}^{k−1} y_i · l_i(x), with l_i(x) = Π_{j=0, j≠i}^{k−1} (x − x_j) / (x_i − x_j)
where L(x) denotes the analytical expression of the fitted curve, l_i(x) is one basis term of L(x), (x_i, y_i) are the coordinates of a feature point, and k is the number of feature points.
The determining module 603 is configured to calculate a projection position probability distribution of the carotid artery on the color map according to the characteristic point parameter of the specific position of the human body;
calculating to obtain parameters of a pre-constructed maximum likelihood estimation model according to the projection position probability distribution;
and extracting an area and a preset area with the probability reaching a set threshold value according to the characteristic point parameters of the specific position of the human body and the parameters of the pre-constructed maximum likelihood estimation model so as to obtain the projection position of the carotid artery on the color map.
The determining module 603 is further configured to mark all sampling-point coordinates on a curve whose extraction failed with the preset value when the extracted region does not reach the set probability threshold and/or does not lie within the preset region.
Projecting the carotid artery position onto the neck surface to guide the autonomous ultrasound scan uses the following pinhole-camera equations:
X = (u − c_x) · D(u, v) / f_x
Y = (v − c_y) · D(u, v) / f_y
Z = D(u, v)
where u and v are the horizontal and vertical coordinates of the picture projection position point, D is the depth map data, f_x and f_y are the focal lengths of the 3D camera in the x and y directions respectively, and c_x and c_y are the offsets of the optical axis from the coordinate center of the projection plane.
The human carotid artery positioning device based on 3D vision provided by the embodiment of the disclosure can execute the human carotid artery positioning method based on 3D vision provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method.
Fig. 7 is a schematic view of an electronic device provided by an embodiment of the disclosure. As shown in fig. 7, the electronic apparatus of this embodiment includes: a processor 701, a memory 702, and a computer program 703 stored in the memory 702 and executable on the processor 701. The steps in the various method embodiments described above are implemented when the processor 701 executes the computer program 703. Alternatively, the processor 701 implements the functions of each module/unit in each device embodiment described above when executing the computer program 703.
Illustratively, the computer program 703 may be partitioned into one or more modules/units, which are stored in the memory 702 and executed by the processor 701 to accomplish the present disclosure. One or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 703 in the electronic device.
The electronic device may be a desktop computer, a notebook, a palm computer, a cloud server, or other electronic devices. The electronic device may include, but is not limited to, a processor 701 and a memory 702. Those skilled in the art will appreciate that fig. 7 is merely an example of an electronic device and is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the electronic device may also include input-output devices, network access devices, buses, etc.
The Processor 701 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 702 may be an internal storage unit of the electronic device, for example, a hard disk or a memory of the electronic device. The memory 702 may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device. Further, the memory 702 may also include both internal storage units and external storage devices of the electronic device. The memory 702 is used to store computer programs and other programs and data required by the electronic device. The memory 702 may also be used to temporarily store data that has been output or is to be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the device is divided into different functional units or modules, so as to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the above-described apparatus/electronic device embodiments are merely illustrative, and for example, a module or a unit may be divided into only one type of logical function, another division may be made in an actual implementation, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the methods in the above embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of the above method embodiments. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic diskette, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be subject to appropriate additions or subtractions according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals.
The above examples are only intended to illustrate the technical solution of the present disclosure, not to limit it; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and they should be construed as being included in the scope of the present disclosure.
Claims (9)
1. A human carotid artery positioning method based on 3D vision is characterized by comprising the following steps:
acquiring ultrasonic image data;
detecting the ultrasonic image data by deep learning to generate characteristic point parameters of specific positions of the human body;
determining the projection position of the carotid artery by a radial basis function according to the characteristic point parameters of the specific positions of the human body;
projecting the position of the carotid artery in the ultrasonic image to the neck surface to guide an autonomous ultrasound scan.
2. The method of claim 1, wherein the ultrasound image data comprises: aligned color map data and depth map data;
the acquiring ultrasound image data includes: and (5) photographing by using a 3D camera to acquire aligned color image data and depth image data.
3. The method of claim 1, wherein the specific positions of the human body comprise at least one of a left lower jaw contour, a right lower jaw contour, a left neck contour, and a right neck contour;
wherein detecting the ultrasonic image data by deep learning to generate the characteristic point parameters of the specific positions of the human body comprises:
detecting characteristic points of the left lower jaw contour, the right lower jaw contour, the left neck contour, and the right neck contour by deep learning to generate corresponding fitting curves;
uniformly sampling each fitting curve according to a set number of samples;
calculating the characteristic point parameters according to the number of samples to complete the sampling, the characteristic point parameters comprising the sampling step and the sampling point coordinates;
performing curve fitting on the extracted characteristic points by the Lagrange interpolation method according to the characteristic point parameters to generate the corresponding fitting curves; or,
if any one of the four curves of the left lower jaw contour, the right lower jaw contour, the left neck contour, and the right neck contour cannot be successfully extracted, recording the coordinates of the sampling points on the failed curve as preset values.
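The uniform sampling described in claim 3 (a set number of samples, a derived step, recorded coordinates, and a preset value on extraction failure) could look like the following sketch; the function names, signatures, and the preset value are assumptions for illustration:

```python
def uniform_sample(curve, x_start, x_end, n_samples):
    """Uniformly sample a fitted contour curve over [x_start, x_end].
    Returns the sampling step and the (x, y) sampling-point coordinates,
    i.e. the 'characteristic point parameters' of claim 3."""
    step = (x_end - x_start) / (n_samples - 1)
    points = [(x_start + i * step, curve(x_start + i * step))
              for i in range(n_samples)]
    return step, points

# Extraction-failure fallback: record every sampling point as a preset value
# (the concrete value -1.0 is an assumption, not taken from the patent).
PRESET = -1.0

def failed_curve(n_samples):
    """Placeholder sampling result for a contour that could not be extracted."""
    return 0.0, [(PRESET, PRESET)] * n_samples
```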
4. The method according to claim 3, wherein the curve fitting is performed on the extracted characteristic points by the Lagrange interpolation method, and the corresponding fitting curve is generated using the following formula:
L(x) = Σ_{i=0}^{k−1} y_i · l_i(x), where l_i(x) = Π_{j=0, j≠i}^{k−1} (x − x_j)/(x_i − x_j),
wherein L(x) represents the analytical expression of the fitting curve, l_i(x) is one basis term of the expression L(x), (x_i, y_i) are the coordinates of a feature point, x_i is the abscissa, y_i is the ordinate, and k is the number of feature points.
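The Lagrange interpolating polynomial over k feature points (x_i, y_i) named in claim 4 can be evaluated directly from its definition. A minimal sketch (the function name is an assumption):

```python
def lagrange_fit(points):
    """Return the Lagrange interpolating polynomial L(x) = sum_i y_i * l_i(x)
    through the given feature points (x_i, y_i)."""
    def L(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            li = 1.0                       # basis term l_i(x)
            for j, (xj, _) in enumerate(points):
                if j != i:
                    li *= (x - xj) / (xi - xj)
            total += yi * li
        return total
    return L
```

For example, fitting the three points (0, 0), (1, 1), (2, 4) reproduces the parabola y = x², so the curve passes exactly through every feature point, as Lagrange interpolation guarantees.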
5. The method according to claim 3, wherein determining the projection position of the carotid artery by the radial basis function according to the characteristic point parameters of the specific positions of the human body comprises:
calculating the probability distribution of the projection position of the carotid artery on the color map according to the characteristic point parameters;
calculating the parameters of a pre-constructed maximum likelihood estimation model according to the probability distribution of the projection position;
extracting the region where the probability reaches a set threshold and a preset region according to the characteristic point parameters and the parameters of the pre-constructed maximum likelihood estimation model, so as to obtain the projection position of the carotid artery on the color map.
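Claim 5 leaves the model details open. As one illustration, a mixture of Gaussian radial basis functions centered on the neck feature points yields a probability map that can be thresholded; the centers, weights, and kernel width here are assumptions, not the patent's trained model:

```python
import math

def rbf_probability(pixel, centers, weights, sigma):
    """Probability that `pixel` lies on the carotid projection, modeled as a
    weighted sum of Gaussian radial basis functions on feature points."""
    u, v = pixel
    return sum(w * math.exp(-((u - cu) ** 2 + (v - cv) ** 2) / (2 * sigma ** 2))
               for w, (cu, cv) in zip(weights, centers))

def extract_region(grid, centers, weights, sigma, threshold):
    """Keep the pixels whose probability reaches the set threshold (claim 5)."""
    return [p for p in grid
            if rbf_probability(p, centers, weights, sigma) >= threshold]
```

In practice the weights would be fitted to the probability distribution, e.g. by the maximum likelihood estimation the claim refers to.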
6. The method of claim 3, wherein the projection of the carotid artery to the neck surface to guide the autonomous ultrasound scan employs the following equations:
X = (u − c_x) · D / f_x, Y = (v − c_y) · D / f_y, Z = D,
wherein u and v are the horizontal and vertical coordinates of the projected position point in the picture, D is the corresponding depth map value, f_x and f_y are the focal lengths of the 3D camera in the x and y directions respectively, and c_x and c_y are the offsets of the optical axis from the center of the projection plane coordinates.
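With the intrinsics named in claim 6 (f_x, f_y, c_x, c_y) and the depth value D at pixel (u, v), the standard pinhole camera model lifts the pixel to a 3D point on the neck surface. A sketch under that assumption (back-projection into the camera frame):

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Standard pinhole back-projection: pixel (u, v) with depth value D to a
    camera-frame 3D point (X, Y, Z). Parameter names follow claim 6; the
    concrete function is an illustration, not the patent's implementation."""
    X = (u - cx) * depth / fx   # horizontal offset from the optical axis
    Y = (v - cy) * depth / fy   # vertical offset from the optical axis
    return (X, Y, depth)        # Z is the depth itself
```

A pixel at the principal point (u = c_x, v = c_y) maps onto the optical axis, i.e. X = Y = 0.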
7. A human carotid artery positioning device based on 3D vision, comprising:
the acquisition module is used for acquiring ultrasonic image data;
the generating module is used for detecting the ultrasonic image data by deep learning to generate characteristic point parameters of specific positions of the human body;
the determining module is used for determining the projection position of the carotid artery by a radial basis function according to the characteristic point parameters of the specific positions of the human body;
and the guiding module is used for projecting the position of the carotid artery in the ultrasonic image to the neck surface to guide the autonomous ultrasound scan.
8. An electronic device, comprising: a memory and one or more processors;
the memory to store one or more programs;
the one or more programs, when executed by the one or more processors, causing the one or more processors to implement the method according to any one of claims 1-6.
9. A storage medium containing computer-executable instructions which, when executed by a computer processor, implement the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210527495.8A CN114782537A (en) | 2022-05-16 | 2022-05-16 | Human carotid artery positioning method and device based on 3D vision |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114782537A true CN114782537A (en) | 2022-07-22 |
Family
ID=82437650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210527495.8A Pending CN114782537A (en) | 2022-05-16 | 2022-05-16 | Human carotid artery positioning method and device based on 3D vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114782537A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117898769A (en) * | 2024-02-06 | 2024-04-19 | 哈尔滨库柏特科技有限公司 | Autonomous ultrasonic robot carotid artery scanning method and device based on three-dimensional reconstruction |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102013866B1 (en) | Method and apparatus for calculating camera location using surgical video | |
US10318839B2 (en) | Method for automatic detection of anatomical landmarks in volumetric data | |
CN107909622B (en) | Model generation method, medical imaging scanning planning method and medical imaging system | |
JP6573354B2 (en) | Image processing apparatus, image processing method, and program | |
US10535136B2 (en) | Registration of a magnetic tracking system with an imaging device | |
CN109176512A (en) | A kind of method, robot and the control device of motion sensing control robot | |
US20160199147A1 (en) | Method and apparatus for coordinating position of surgery region and surgical tool during image guided surgery | |
EP2104921B1 (en) | A method, an apparatus and a computer program for data processing | |
CN110363817B (en) | Target pose estimation method, electronic device, and medium | |
US20140003738A1 (en) | Method and apparatus for gaze point mapping | |
US20220414291A1 (en) | Device for Defining a Sequence of Movements in a Generic Model | |
WO2022217794A1 (en) | Positioning method of mobile robot in dynamic environment | |
CN109255801B (en) | Method, device and equipment for tracking edges of three-dimensional object in video and storage medium | |
CN112382359A (en) | Patient registration method and device, electronic equipment and computer readable medium | |
CN110070529A (en) | A kind of Endovascular image division method, system and electronic equipment | |
CN114782537A (en) | Human carotid artery positioning method and device based on 3D vision | |
CN108597589B (en) | Model generation method, target detection method and medical imaging system | |
CN113240638B (en) | Target detection method, device and medium based on deep learning | |
CN110348351A (en) | Image semantic segmentation method, terminal and readable storage medium | |
JP2006113832A (en) | Stereoscopic image processor and program | |
CN115880428A (en) | Animal detection data processing method, device and equipment based on three-dimensional technology | |
CN113140031B (en) | Three-dimensional image modeling system and method and oral cavity scanning equipment applying same | |
KR102577964B1 (en) | Alignment system for liver surgery | |
CN114495199B (en) | Organ positioning method, organ positioning device, electronic equipment and storage medium | |
Savii | Camera calibration using compound genetic-simplex algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||