CN114399499A - Organ volume determination method, device, equipment and storage medium - Google Patents

Organ volume determination method, device, equipment and storage medium

Info

Publication number
CN114399499A
CN114399499A (application CN202210065840.0A)
Authority
CN
China
Prior art keywords
target
organ
probe
contour
ultrasonic image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210065840.0A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chison Medical Technologies Co ltd
Original Assignee
Chison Medical Technologies Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chison Medical Technologies Co ltd filed Critical Chison Medical Technologies Co ltd
Priority to CN202210065840.0A priority Critical patent/CN114399499A/en
Publication of CN114399499A publication Critical patent/CN114399499A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10132: Ultrasound image
    • G06T 2207/10136: 3D ultrasound image

Abstract

The application relates to a method, a device, equipment and a storage medium for determining organ volume, in the technical field of computer vision. The method comprises the following steps: acquiring each ultrasonic image together with the probe pose at which the target probe acquired it; converting the organ contour points in each ultrasonic image into a world coordinate system based on each probe pose to obtain the organ contour coordinates; and fitting the target organ in the world coordinate system according to the organ contour coordinates to obtain the volume of the target organ. In this scheme, the target organ is scanned ultrasonically at different probe poses so that organ contours in different directions can be computed, allowing the target organ to be fitted more accurately for volume estimation; the dependence on personal experience is removed and the accuracy of organ volume detection is improved.

Description

Organ volume determination method, device, equipment and storage medium
Technical Field
The invention relates to the field of medical image processing, in particular to a method, a device, equipment and a storage medium for determining organ volume.
Background
Ultrasound images reflect differences in the acoustic parameters of the medium and can yield information that optical, X-ray, γ-ray and other modalities cannot. Ultrasound resolves human soft tissue well, can recover useful signals over a dynamic range of more than 120 dB, and is therefore helpful for identifying small lesions in biological tissue. When ultrasound images display living tissue, the required image can be obtained without staining.
In clinical practice, when the volume of a human organ needs to be measured, a doctor usually acquires and processes an ultrasound image of the patient, and then judges from experience, by comparison with ultrasound images of organs of standard size, whether the organ volume in the acquired image of the target human body is abnormal.
In the above scheme, the judgment of organ volume depends on personal experience, the apparent size of the organ in the acquired ultrasound image depends on the pose of the acquisition probe, and manual estimation of organ volume therefore carries a large error.
Disclosure of Invention
The application provides a method, a device, equipment and a storage medium for determining organ volume, which improve the accuracy of organ volume detection.
In one aspect, there is provided a method of organ volume determination, the method comprising:
acquiring each ultrasonic image and the probe pose when the target probe acquires the ultrasonic image;
based on the position and the posture of each probe, converting the organ contour points in each ultrasonic image into a world coordinate system to obtain contour coordinates of each organ;
and fitting the target organ in the world coordinate system according to the contour coordinates of each organ to obtain the volume of the target organ.
In a further aspect, there is provided an organ volume determination apparatus, the apparatus comprising:
the ultrasonic image acquisition module is used for acquiring each ultrasonic image and the probe pose when the target probe acquires the ultrasonic image;
the coordinate conversion module is used for converting the organ contour points in each ultrasonic image into a world coordinate system based on the pose of each probe to obtain the contour coordinates of each organ;
and the organ volume acquisition module is used for fitting the target organ in the world coordinate system according to the contour coordinates of each organ so as to acquire the volume of the target organ.
In one possible implementation manner, the probe pose includes position information and posture information of the target probe when acquiring the ultrasound image;
the ultrasonic image acquisition module is also used for,
acquiring the ultrasonic images and the time information when the target probe acquires the ultrasonic images;
acquiring, according to the time information at which the target probe acquired each ultrasonic image, the position information of the target probe collected by a depth camera at those times;
and calculating attitude information of the target probe when acquiring each ultrasonic image according to the gyroscope sensor data of the target probe.
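The application does not give the attitude computation in formula form. Purely as an illustrative sketch (the function names and the simple dead-reckoning scheme are assumptions, not taken from the application), the gyroscope's angular-rate samples can be integrated step by step into a rotation matrix via Rodrigues' formula:

```python
import numpy as np

def rotation_from_rate(omega, dt):
    """Rodrigues' formula: the rotation produced by angular rate `omega` (rad/s) over `dt` s."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return np.eye(3)
    k = omega / np.linalg.norm(omega)                  # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])                 # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def integrate_gyro(rate_samples, dt):
    """Accumulate the probe attitude from successive gyroscope angular-rate samples."""
    R = np.eye(3)
    for omega in rate_samples:
        R = R @ rotation_from_rate(np.asarray(omega, float), dt)
    return R
```

A practical system would also correct gyroscope drift (e.g. with accelerometer or camera data); this sketch shows only the bare integration.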
In one possible implementation, the apparatus further includes:
the human body image acquisition module is used for acquiring a target human body image;
and the target position determining module is used for performing deformation fitting on the target human body image and a standard anatomical map, and determining the position of the target organ so as to indicate the target probe to scan the position of the target organ.
In a possible implementation manner, the coordinate transformation module further includes:
the intermediate coordinate acquisition unit is used for acquiring the intermediate coordinates, in a probe coordinate system, of the organ contour points in each ultrasonic image according to the depth information of the ultrasonic images;
and the contour coordinate acquisition unit is used for converting each intermediate coordinate to the world coordinate system based on the probe pose to obtain each organ contour coordinate.
In a possible implementation manner, the contour coordinate obtaining unit is further configured to,
generating a position conversion matrix according to the target pose information and the initial pose information of the target probe; the position conversion matrix comprises rotation information and position information of the target probe; the target pose information is used for indicating the probe pose when the target probe acquires a target ultrasonic image;
and converting the target intermediate coordinate into a world coordinate system according to the position conversion matrix so as to obtain the target organ contour coordinate indicated by the target ultrasonic image.
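As an illustrative sketch of this step (the application only states that the position conversion matrix contains rotation and position information; the 4x4 homogeneous-matrix form and all names here are assumptions), the conversion matrix and the probe-to-world mapping can be written as:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Assemble the 4x4 position conversion matrix from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, float)
    T[:3, 3] = np.asarray(t, float)
    return T

def probe_to_world(points_probe, R, t):
    """Map Nx3 intermediate (probe-frame) contour coordinates into the world frame."""
    T = pose_to_matrix(R, t)
    pts = np.asarray(points_probe, float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])    # homogeneous coordinates
    return (T @ homo.T).T[:, :3]
```

With the identity rotation and translation (1, 2, 3), the probe origin maps to world point (1, 2, 3); chaining such matrices composes successive pose changes.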
In one possible implementation, the organ volume obtaining module is further configured to,
performing spline fitting, according to the contour coordinates of each organ, on sections perpendicular to a target contour interface, so as to obtain the shape of the target organ;
and carrying out integral processing on the shape of the target organ to obtain the volume of the target organ.
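A minimal sketch of the volume computation (assuming, for illustration only, that each fitted section is available as a closed planar contour; the shoelace formula and trapezoidal integration stand in for the unspecified integral processing):

```python
import numpy as np

def contour_area(xy):
    """Shoelace formula: area enclosed by a planar contour given as an Nx2 vertex array."""
    x, y = np.asarray(xy, float).T
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def volume_from_slices(slice_contours, dz):
    """Trapezoidal integration of per-slice contour areas along the slicing axis."""
    areas = np.array([contour_area(c) for c in slice_contours])
    return float(np.sum((areas[:-1] + areas[1:]) * 0.5 * dz))
```

For example, eleven identical unit-square slices spaced 0.1 apart integrate to a volume of 1.0, matching the analytic prism volume.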
In one possible implementation, the organ volume obtaining module is further configured to,
acquiring contour coordinates of each target organ on the target contour interface;
and respectively taking the contour coordinates of the target organ in the direction of the specified line segment as a starting point and an end point, and fitting the contour coordinates of each organ into each closed graph through splines so as to form the shape of the target organ.
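As a sketch of forming a closed figure from contour coordinates (the application fits splines between a start and an end point on the specified line segment; this illustration uses closing and arc-length linear resampling as a simpler stand-in, with a periodic cubic spline being the natural refinement — all names are assumptions):

```python
import numpy as np

def close_contour(points):
    """Append the starting point so that the polyline forms a closed figure."""
    pts = np.asarray(points, float)
    if not np.allclose(pts[0], pts[-1]):
        pts = np.vstack([pts, pts[0]])
    return pts

def resample_closed(points, n):
    """Resample a closed contour at n equally spaced arc-length positions."""
    pts = close_contour(points)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # edge lengths
    s = np.concatenate([[0.0], np.cumsum(seg)])         # cumulative arc length
    t = np.linspace(0.0, s[-1], n, endpoint=False)
    return np.column_stack([np.interp(t, s, pts[:, 0]),
                            np.interp(t, s, pts[:, 1])])
```

Resampling each closed contour to the same point count also makes it easier to stack contours into the target organ shape for the subsequent integration.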
In yet another aspect, a computer device is provided, which comprises a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by the processor to implement the above organ volume determination method.
In yet another aspect, a computer-readable storage medium is provided having at least one instruction stored therein, the at least one instruction being loaded and executed by a processor to implement the above-described organ volume determination method.
In yet another aspect, a computer program product or computer program is provided, comprising computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the above-described organ volume determination method.
The technical scheme provided by the application can comprise the following beneficial effects:
When the volume of a target organ needs to be detected, the target probe can perform multiple ultrasonic scans of the region where the organ is located, yielding ultrasonic images acquired under different probe poses. The computer device then transforms the organ contour points obtained under the different probe poses into a world coordinate system, obtaining the organ contour coordinates; the target organ can be fitted from the contour coordinates obtained at the different poses, and the computer can integrate the resulting shape to obtain the volume of the target organ. In this scheme, the target organ is scanned ultrasonically at different probe poses so that organ contours in different directions can be computed, allowing the target organ to be fitted more accurately for volume estimation; the dependence on personal experience is removed and the accuracy of organ volume detection is improved.
Drawings
In order to illustrate the embodiments of the present application or the prior-art solutions more clearly, the drawings needed in the detailed description are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram illustrating the structure of an organ volume determination system according to an exemplary embodiment.
Fig. 2 is a method flow diagram illustrating an organ volume determination method according to an exemplary embodiment.
Fig. 3 is a method flow diagram illustrating an organ volume determination method according to an exemplary embodiment.
FIG. 4 is a method flow diagram illustrating a bladder volume determination method according to an exemplary embodiment.
Fig. 5 is a block diagram showing the structure of an organ volume determination apparatus according to an exemplary embodiment.
FIG. 6 is a schematic diagram of a computer device provided in accordance with an exemplary embodiment of the present application.
Detailed Description
The technical solutions of the present application will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be understood that the "indication" mentioned in the embodiments of the present application may be a direct indication, an indirect indication, or the indication of an association relationship. For example, "A indicates B" may mean that A directly indicates B (e.g., B can be obtained from A); that A indirectly indicates B (e.g., A indicates C, and B can be obtained from C); or that there is an association between A and B.
In the description of the embodiments of the present application, the term "correspond" may indicate a direct or indirect correspondence between two items, an association between them, or relations such as indicating and being indicated, or configuring and being configured.
In the embodiments of the present application, "predefining" may be implemented by saving in advance, in a device (including, for example, a terminal device and a network device), a corresponding code, a table, or other means usable to indicate the related information; the present application does not limit the specific implementation.
Before describing the various embodiments shown herein, several concepts related to the present application will be described.
1) AI (Artificial Intelligence)
Artificial Intelligence (AI) is a technical science that studies and develops theories, methods, technologies and application systems for simulating, extending and expanding human intelligence. It is a branch of computer science that attempts to understand the essence of intelligence and to produce intelligent machines that can react in a manner similar to human intelligence; research in the field includes robotics, speech recognition, image recognition, natural language processing and expert systems, among others. Since the birth of artificial intelligence, its theory and technology have steadily matured and its fields of application have continually expanded. Artificial intelligence can simulate the information processes of human consciousness and thinking; it is not human intelligence itself, but it can think like a human and may even exceed human intelligence.
The main material basis for studying artificial intelligence, and the platform on which it is implemented, is the computer. Besides computer science, artificial intelligence also involves information theory, cybernetics, automation, bionics, biology, psychology, mathematical logic, linguistics, medicine, philosophy and other disciplines. The main topics of artificial intelligence research include knowledge representation, automated reasoning and search methods, machine learning and knowledge acquisition, knowledge processing systems, natural language understanding, computer vision, intelligent robots and automatic programming, among others.
2) CV (Computer Vision)
Computer vision is the science of how to make machines "see": cameras and computers are used in place of human eyes to identify, track and measure targets, and the resulting images are further processed so that they become more suitable for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies theories and techniques for building artificial intelligence systems that can acquire "information" from images or multidimensional data, where information means, in Shannon's sense, that which can be used to help make a "decision". Because perception can be viewed as extracting information from sensory signals, computer vision can also be viewed as the science of making artificial systems "perceive" from images or multidimensional data.
Fig. 1 is a schematic diagram illustrating the structure of an organ volume determination system according to an exemplary embodiment. The organ volume determination system comprises a computer device 110, a depth camera 120 and an ultrasound probe 130. The computer device 110 and the depth camera 120, and the computer device 110 and the ultrasound probe 130 may be communicatively connected through a communication network, which may be a wired network or a wireless network, to implement transmission of image data or control signals.
Optionally, an application having an image processing function is installed in the computer device 110, and the application may be a professional image processing application, a medical application, or an AI application having an image processing function, which is not limited in this embodiment of the present application.
Alternatively, the computer device 110 may be a terminal device having a data transmission interface for receiving image data captured by an image capturing device having an image capturing component. That is, the computer device 110 may acquire the image information acquired by the depth camera 120 through the data transmission interface, and the computer device 110 may also acquire the ultrasound image of the target human body acquired by the ultrasound probe through the data transmission interface.
Optionally, the computer device 110 may further send a data acquisition instruction to the depth camera through a data transmission interface to instruct the depth camera to acquire image information of the target human body and the target probe at a specified time; the computer device 110 may also send data acquisition instructions to the ultrasound probe via the data transmission interface to instruct the ultrasound probe to acquire an ultrasound image of the target organ in a certain direction at a specified time.
When the computer device 110 receives the image information of the ultrasound probe acquired by the depth camera and the ultrasound image read by the ultrasound probe, the image information of the ultrasound probe and the ultrasound image may be subjected to data processing, so as to fit the shape and size of the target organ.
Optionally, the computer device 110 may itself be an ultrasound device with a data processing module. In that case, for each ultrasound image obtained by scanning the target human body, the ultrasound device obtains the pose information of its own probe from the depth camera, and its data processing module processes the ultrasound images and probe pose information directly to determine the shape and volume of the target organ in the target human body. The method of the present application may thus be implemented directly by the ultrasound device, or by another external computer device.
Alternatively, the computer device 110 may be a terminal or a server, or the computer device 110 may be implemented as a data processing system in which a terminal and a server are combined.
Optionally, the terminal may be a mobile terminal such as a smart phone, a tablet computer or a laptop, a terminal such as a desktop or projection computer, or another intelligent terminal with a data processing component, which is not limited in the embodiments of the present application.
The server may be implemented as one server, or may be implemented as a server cluster formed by a group of servers, which may be physical servers, or may be implemented as a cloud server. In one possible implementation, the server is a backend server of the application in the terminal.
In a possible implementation manner of the embodiment of the present application, in a medical scenario, when a target organ of a target human body needs to be subjected to volume detection, a user (e.g., medical staff) may input a detection operation to a computer device, so as to preset a type of the target organ that needs to be detected. The computer equipment determines the region of the target human body required to be subjected to ultrasonic detection according to the setting, sends an instruction to the ultrasonic probe and carries out ultrasonic detection on the target organ in the region under different postures; meanwhile, the computer device also sends an image acquisition instruction to the depth camera so as to acquire the position information and the depth information of the ultrasonic probe at different moments.
After the information acquisition process of the target organ is completed, the computer equipment calculates the shape and the volume of the target organ according to the ultrasonic image acquired by the ultrasonic probe, the posture of the probe and the position of the probe acquired by the depth camera, and stores the shape and the volume into a background medical server for filing.
Optionally, the server may be an independent physical server, a server cluster formed by a plurality of physical servers, or a distributed system, and may also be a cloud server that provides technical operation and computation services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, and a big data and artificial intelligence platform.
Optionally, the system may further include a management device, which is configured to manage the system (e.g., to manage the connection states between the modules and the server); the management device is connected to the server through a communication network. Optionally, the communication network is a wired network or a wireless network.
Optionally, the wireless or wired network described above uses standard communication techniques and/or protocols. The network is typically the internet, but may be any other network, including but not limited to a local area network, a metropolitan area network, a wide area network, a mobile, wired or wireless network, a private network, or any combination of virtual private networks. In some embodiments, data exchanged over the network is represented using techniques and/or formats such as hypertext markup language and extensible markup language. All or some of the links may also be encrypted using conventional encryption techniques such as secure sockets layer, transport layer security, virtual private networks and internet protocol security. In other embodiments, custom and/or dedicated data communication techniques may be used in place of, or in addition to, those described above.
Fig. 2 is a method flow diagram illustrating an organ volume determination method according to an exemplary embodiment. The method is performed by a computer device, which may be a server or a terminal in an organ volume determination system as shown in fig. 1. As shown in fig. 2, the organ volume determination method may include the steps of:
step 201, obtaining each ultrasonic image and the probe pose when the target probe acquires the ultrasonic image.
Each ultrasonic image is obtained by the target probe scanning the target region of the target human body in a different probe pose.
In one possible implementation, the computer device determines in advance a target region to be acquired in a target human body, and controls the target probe to acquire images of the target region of the target human body in different poses, thereby obtaining ultrasound images acquired at different probe poses.
Optionally, the probe pose includes position information and posture information of the target probe; according to the position information and the posture information of the target probe, the direction corresponding to the probe in each ultrasonic image acquired by the target probe and the distance between the probe and the target area can be determined.
And 202, converting the organ contour points in each ultrasonic image into a world coordinate system based on the pose of each probe to obtain the contour coordinates of each organ.
After acquiring each probe pose and the ultrasound images acquired in the probe pose, the computer device may acquire organ contour points in each ultrasound image, for example, by using an image detection algorithm to obtain an organ contour of each ultrasound image and using each coordinate point of the contour as an organ contour point.
At this time the organ contour points are coordinates in the ultrasound image acquired by the probe, that is, coordinates in the coordinate system established by the probe pose. The computer device can therefore derive, from each probe pose, the transformation between the coordinate system of each ultrasound image and the world coordinate system, and transform the organ contour points of each ultrasound image into organ contour coordinates in the world coordinate system accordingly.
And step 203, fitting the target organ in the world coordinate system according to the contour coordinates of each organ to obtain the volume of the target organ.
The computer device performs fitting operation on the contour coordinates of each organ in a world coordinate system to obtain a three-dimensional solid figure of the target organ, and the computer device performs integration on the three-dimensional solid figure to obtain the volume of the target organ.
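The integration of the fitted three-dimensional figure can be illustrated by slicing it perpendicular to one axis and integrating the cross-sectional areas. As a self-contained check (all names and the midpoint rule are illustrative assumptions, not from the application), a unit sphere is used here because its slice areas are known in closed form:

```python
import numpy as np

def sphere_slice_area(z, radius=1.0):
    """Area of the circular cross-section of a radius-`radius` sphere at height z."""
    return np.pi * max(radius * radius - z * z, 0.0)

def integrate_solid(area_fn, z0, z1, n=10000):
    """Midpoint-rule integration of the cross-sectional area along the slicing axis."""
    h = (z1 - z0) / n
    z = z0 + (np.arange(n) + 0.5) * h                  # slice midpoints
    return float(sum(area_fn(zi) for zi in z) * h)
```

For a real fitted organ, `area_fn` would be replaced by the area of the fitted contour at each height, computed from the organ contour coordinates.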
In summary, when the volume of the target organ needs to be detected, the target probe can perform multiple ultrasonic scans of the region where the organ is located, yielding ultrasonic images acquired under different probe poses. The computer device transforms the organ contour points acquired under the different probe poses into a world coordinate system to obtain the organ contour coordinates, fits the target organ from the contour coordinates obtained at the different poses, and integrates the resulting shape to obtain the volume of the target organ. In this scheme, the target organ is scanned ultrasonically at different probe poses so that organ contours in different directions can be computed, allowing the target organ to be fitted more accurately for volume estimation; the dependence on personal experience is removed and the accuracy of organ volume detection is improved.
Fig. 3 is a method flow diagram illustrating an organ volume determination method according to an exemplary embodiment. The method is performed by a computer device, which may be a server or a terminal in an organ volume determination system as shown in fig. 1. As shown in fig. 3, the organ volume determination method may include the steps of:
step 301, acquiring a target human body image.
In the embodiment of the present application, in order to detect a target organ in a target human body, it is necessary to determine in advance where the target organ is located in the target human body.
Optionally, in the organ volume determination system, a complete image of the target human body may be acquired through an image acquisition device (e.g., a depth camera).
Optionally, in the organ volume determination system, a partial image of the target human body may also be acquired by the image acquisition device.
Optionally, the image acquisition device may vertically face downward or form a certain angle with the vertical direction to acquire an image of the target human body, and when the target human body lies flat, the image acquisition device may accurately acquire a part or all of the image of the target human body.
And 302, performing deformation fitting on the target human body image and a standard anatomical map, and determining the position of the target organ so as to instruct the target probe to scan the position of the target organ.
After the target human body image is obtained, the computer device adapts it to a pre-stored standard anatomical map: for example, it scales the target human body image, or the standard anatomical map, in equal proportion, compares the two, and determines the position of the target organ in the target human body image from its position in the standard anatomical map.
In one possible implementation, the product of the respective contour coordinate positions of the target organ in the standard anatomical map and the magnification factor is determined as the position of the target organ in the target human body image.
That is, when comparing the standard anatomical map with the target human body image, the region where the target organ may be located needs to be enlarged to account for the difference between the organ position of the actual human body and its theoretical position, ensuring that the complete target organ is covered during the subsequent ultrasonic detection of the target human body.
After the position of the target organ in the target human body image is determined, the computer device can determine the position information of the target organ in an actual coordinate system (namely, a world coordinate system constructed in a target direction) according to the proportional relation between the target human body image and the actual human body, so as to determine the position required to be acquired by the ultrasonic probe.
Step 303, obtaining the ultrasound images and the time information when the ultrasound images are acquired.
In a possible implementation manner, when the position of the human body to be acquired by the target probe (i.e., the ultrasonic probe) is determined, the computer device may randomly set the acquisition direction and acquisition position of the target probe within a selectable range based on the human body region to be acquired; for example, the target probe is controlled to rotate randomly by a certain angle about the center of the target human body region, so as to acquire each ultrasonic image.
In this case, although each ultrasonic image contains ultrasonic image information of the target organ in the target human body region, the contour of the target organ displayed in each image differs with the acquisition angle; the more acquisition angles are used, the more contour information of the target organ at different positions is captured across the ultrasonic images.
Step 304, acquiring the position information of the target probe when acquiring each ultrasonic image, as collected by the depth camera, according to the time information when the target probe acquired each ultrasonic image.
When the target probe acquires ultrasonic images of the target human body region at different positions, the computer device can also control the depth camera to acquire the position of the target probe, so as to determine the position information of the target probe when acquiring each ultrasonic image. When the computer device needs the position information for a given ultrasonic image, it obtains the time information of that ultrasonic image, queries all the position information acquired by the depth camera, and takes the position information whose acquisition time matches the time information of the ultrasonic image as the position information of the target probe when acquiring that image.
The position information of the target probe when acquiring the ultrasonic images comprises plane coordinate information and depth information, and coordinate values of the target probe on a world coordinate system when acquiring each ultrasonic image can be obtained through the position information.
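The timestamp matching described in step 304 can be sketched as follows; the data layout and the `tolerance` value are assumptions for illustration, not details fixed by the text:

```python
def probe_position_at(timestamp, camera_samples, tolerance=0.02):
    """Look up the probe position recorded by the depth camera whose capture
    time matches an ultrasonic frame's timestamp.

    camera_samples -- list of (time_s, (x, y, depth)) tuples from the camera
    tolerance      -- assumed maximum clock difference (seconds) still
                     treated as "the same moment"
    """
    # Pick the camera sample whose capture time is closest to the frame time.
    best_time, best_pos = min(camera_samples,
                              key=lambda s: abs(s[0] - timestamp))
    if abs(best_time - timestamp) > tolerance:
        raise LookupError("no camera sample close enough to frame timestamp")
    return best_pos
```

In practice the camera and the ultrasound system sample at different rates, so a nearest-neighbour match within a small tolerance is one simple way to realise "acquisition time the same as the time information of the ultrasonic image".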
Step 305, calculating attitude information of the target probe when acquiring each ultrasonic image according to the gyroscope sensor data of the target probe.
In the embodiment of the application, a gyroscope sensor is also deployed in the target probe. The gyroscope sensor records the rotation posture of the target probe during its rotation, and once the initial posture of the target probe is determined, the posture information of the target probe at each moment can be determined from the gyroscope data.
Optionally, a magnetic sensor is also deployed in the target probe, and the computer device performs kalman filtering on the angle acquired by the gyroscope sensor and the angle acquired by the magnetic sensor to obtain the actual rotation information of the probe.
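The gyroscope/magnetometer fusion can be illustrated with a scalar Kalman filter on the heading angle. This is a simplified stand-in for whatever filter the device actually runs, and the noise variances `q` and `r` are assumed example values:

```python
def fuse_heading(gyro_increments, mag_angles, q=0.01, r=0.5):
    """Fuse gyroscope-integrated heading with magnetometer heading via a
    scalar Kalman filter (simplified illustration; q and r are assumed
    process/measurement noise variances).

    gyro_increments -- per-step heading increments (rad) from the gyroscope
    mag_angles      -- absolute headings (rad) from the magnetic sensor
    """
    angle, p = mag_angles[0], 1.0        # initialise from the magnetometer
    fused = [angle]
    for d_gyro, z_mag in zip(gyro_increments[1:], mag_angles[1:]):
        # Predict: propagate with the gyro increment (low drift short-term).
        angle += d_gyro
        p += q
        # Update: correct toward the magnetometer (drift-free long-term).
        k = p / (p + r)
        angle += k * (z_mag - angle)
        p *= (1 - k)
        fused.append(angle)
    return fused
```

The prediction step trusts the gyroscope over short intervals, while the update step pulls the estimate back toward the magnetometer, which does not drift; this is the usual motivation for fusing the two sensors.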
At this time, the position information and the posture information of the target probe when acquiring each ultrasonic image constitute the probe pose of the target probe.
In a possible implementation manner, an acceleration sensor is further present in the target probe, and when determining the world coordinate corresponding to the initial state of the target probe, the world coordinate (i.e., position information) of the target probe at each time may also be acquired according to acceleration data acquired by the acceleration sensor.
Step 306, acquiring the intermediate coordinates of the organ contour points in each ultrasonic image in the probe coordinate system according to the depth information of the ultrasonic image.
In the embodiment of the present application, depth information also exists in the ultrasonic image. For example, when the ultrasonic probe sends an ultrasonic wave to the target region, it generates a corresponding ultrasonic image from the returned wave; the time interval between sending the ultrasonic wave and receiving its return represents the depth information of the ultrasonic image, that is, the distance between the ultrasonic probe and the plane corresponding to the ultrasonic image.
Since the ultrasonic image is a plane perpendicular to the ultrasonic probe, if a probe coordinate system is constructed with the head of the ultrasonic probe as the origin, the X-axis and Y-axis coordinates of each intermediate coordinate in the probe coordinate system can be obtained from the organ contour points in the ultrasonic image, and the Z-axis coordinate of each intermediate coordinate is the depth of the ultrasonic image.
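The pixel-to-probe-frame conversion can be sketched as follows. Parameter names are illustrative, and whether the image plane sits at z = 0 (as in the worked formula later in this section) or at the measured image depth depends on how the probe frame is defined, so the plane depth is passed explicitly:

```python
def contour_point_to_probe_coords(px, py, pixel_size_mm, image_width_px,
                                  plane_depth_mm=0.0):
    """Convert an organ contour point from ultrasonic-image pixel coordinates
    to the probe coordinate system (origin at the probe head).

    px, py         -- pixel column / row of the contour point
    pixel_size_mm  -- assumed physical size of one pixel (image resolution)
    image_width_px -- image width, used to centre x on the probe axis
    plane_depth_mm -- z of the image plane in the probe frame (0 if the
                     image plane is taken to contain the probe's x/y axes)
    """
    x = (px - image_width_px / 2) * pixel_size_mm  # lateral offset from centre
    y = py * pixel_size_mm                         # depth below the probe face
    return (x, y, plane_depth_mm)
```

With a 640-pixel-wide image at 0.1 mm/pixel, a contour point in the centre column at row 100 maps to roughly (0, 10, 0) mm in the probe frame.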
Step 307, converting each intermediate coordinate into the world coordinate system based on the probe pose to obtain each organ contour coordinate.
After the coordinates of each organ contour point in the probe coordinate system are acquired, note that each ultrasonic image is acquired under a different probe pose; that is, the organ contour points of different ultrasonic images lie in different coordinate systems and are difficult to compare directly. Therefore, the relation between the probe coordinate system and the world coordinate system can be constructed from the probe pose, and each intermediate coordinate converted into the world coordinate system based on that relation, yielding the organ contour coordinates of the target organ as collected from different directions.
In one possible implementation, a position transformation matrix is generated according to the target pose information and the initial pose information of the target probe; the position conversion matrix comprises rotation information and position information of the target probe; and converting the target intermediate coordinate into a world coordinate system according to the position conversion matrix so as to obtain the target organ contour coordinate indicated by the target ultrasonic image.
And the target pose information is used for indicating the probe pose when the target probe acquires the target ultrasonic image.
When the target ultrasonic images in the ultrasonic images are acquired, the computer equipment simultaneously reads the target pose information when the target probe acquires the target ultrasonic images so as to subsequently calculate the data in the target ultrasonic images.
When acquiring the target pose information of the target probe for the target ultrasonic image, the target pose information comprises the position information and the posture information of the target probe. Assuming that the world coordinate system is constructed from the initial position and direction of the target probe, the rotation information of the target probe when acquiring the target ultrasonic image can be determined from its initial posture information; in the embodiment of the application, the rotation information can be expressed by a rotation matrix. Similarly, the position information when the target ultrasonic image is acquired can be determined from the initial position information of the target probe and the target position information at the time of acquisition.
For example, the target organ contour coordinates may be calculated as follows.
During actual scanning, the depth information of the probe can be obtained, so the specific x and y coordinates of the bladder contour in the probe coordinate system can be obtained from the pixel coordinates on the ultrasonic image and the image depth resolution. Meanwhile, the ultrasonic image is always coplanar with the x and y axes of the probe coordinate system, so a contour point can be expressed as (x, y, 0) in the probe coordinate system. In steps 304 and 305, the rotation angles (α, β, γ) of the probe about the x, y and z axes in the world coordinate system and its actual coordinate position (x1, y1, z1) are obtained. Each point on the ultrasonic image can therefore be converted to world coordinates (x2, y2, z2), where A' is the vector [x2, y2, z2], R⁻¹ is the inverse of the rotation matrix formed from (α, β, γ) (i.e., the rotation from the probe coordinate system to the world coordinate system), and t = [x1, y1, z1] is the position of the probe in the world coordinate system, so that the conversion from the probe coordinate system to the world coordinate system can be expressed by the following formula:

A' = R⁻¹ · [x, y, 0]ᵀ + t
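The conversion A' = R⁻¹·[x, y, 0]ᵀ + t can be sketched in Python. The x→y→z Euler rotation order is an assumption, since the text does not fix a convention:

```python
import math

def rotation_matrix(alpha, beta, gamma):
    """Rotation about x, then y, then z (an assumed Euler convention)."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    rx = [[1, 0, 0], [0, ca, -sa], [0, sa, ca]]
    ry = [[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]]
    rz = [[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]]
    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]
    return matmul(rz, matmul(ry, rx))

def probe_to_world(point, angles, probe_pos):
    """A' = R^{-1} p + t: rotate the contour point out of the probe frame
    (a rotation matrix's inverse is its transpose) and translate by the
    probe's world position t = (x1, y1, z1)."""
    r = rotation_matrix(*angles)
    rinv = [[r[j][i] for j in range(3)] for i in range(3)]  # transpose
    rotated = [sum(rinv[i][j] * point[j] for j in range(3)) for i in range(3)]
    return tuple(rotated[i] + probe_pos[i] for i in range(3))
```

With zero rotation the transform reduces to a pure translation by the probe position; with a 90° rotation about z, a point on the probe's x axis lands on the world's negative y axis before translation.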
Step 308, performing spline fitting from the interface perpendicular to the target contour according to the contour coordinates of each organ to obtain the shape of the target organ.
After the organ contour coordinates in each ultrasonic image are converted into a world coordinate system, each organ contour coordinate actually represents each point coordinate on the contour of the target organ, and at the moment, each point coordinate is fitted, so that the three-dimensional shape of the target organ can be obtained.
In one possible implementation, obtaining contour coordinates of each target organ on the target contour interface; and respectively taking the contour coordinates of the target organ in the direction of the specified line segment as a starting point and an end point, and fitting the contour coordinates of each organ into each closed graph through splines so as to form the shape of the target organ.
That is, in the embodiment of the present application, spline fitting may be adopted: a contour interface is randomly selected as the target contour interface, and a direction is arbitrarily selected; two organ contour coordinates exist along that direction and serve as the starting point and the end point, respectively; and the organ contour coordinates on the plane perpendicular to the target contour interface are fitted into a closed graph. This closed graph may be regarded as a cross-sectional view of the target organ perpendicular to the target contour interface.
Repeating the steps to obtain each section of the target organ, and fitting each section to obtain the target shape of the target organ.
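The section-fitting step can be illustrated as follows. An angle-sorted closed polyline stands in for the spline fit (a spline library would smooth the same point ordering), and the shoelace area is one way to quantify each fitted section; both function names are illustrative:

```python
import math

def close_section(points):
    """Order the organ contour points of one cross-section into a closed
    curve by sorting them by polar angle about their centroid."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    ordered = sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    return ordered + [ordered[0]]   # repeat the first point to close the curve

def section_area(closed):
    """Area enclosed by a closed section (shoelace formula)."""
    return 0.5 * abs(sum(x0 * y1 - x1 * y0
                         for (x0, y0), (x1, y1) in zip(closed, closed[1:])))
```

Angle sorting works for the roughly convex, star-shaped sections typical of organs such as the bladder; a genuine spline fit would interpolate smoothly through the same ordered points.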
Step 309, integrating the shape of the target organ to obtain the volume of the target organ.
After the target shape of the target organ is obtained, the three-dimensional stereo graph of the target organ is subjected to integration processing, and the volume size of the target organ can be obtained.
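The final integration of the fitted shape into a volume can be sketched as a trapezoidal sum over parallel cross-section areas; uniform slice spacing and the function name are assumptions for illustration:

```python
def organ_volume(section_areas, slice_spacing):
    """Integrate parallel cross-section areas into a volume using the
    trapezoidal rule.

    section_areas -- areas of the fitted sections, ordered along the axis
    slice_spacing -- assumed uniform distance between adjacent sections
    """
    # Trapezoidal rule: full weight for interior slices, half for the ends.
    return slice_spacing * (sum(section_areas)
                            - 0.5 * (section_areas[0] + section_areas[-1]))
```

Three sections of area 2 spaced 1 apart integrate to a volume of 4, matching two trapezoids of area 2 each.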
In summary, when the volume of the target organ needs to be detected, the target probe can perform multiple ultrasonic detections on the region where the organ is located to obtain ultrasonic images acquired under different probe poses. The computer device transforms the organ contour points acquired under the different probe poses into a world coordinate system to obtain the contour coordinates of each organ, and the contour coordinates obtained under the different poses can be fitted to recover the target organ, so that the computer can integrate the shape of the target organ to obtain its volume. In this scheme, the target organ is ultrasonically detected from different probe poses to calculate organ contours in different directions, so that the target organ is better fitted for volume calculation; the influence of personal experience is avoided, and the accuracy of organ volume detection is improved.
FIG. 4 is a method flow diagram illustrating a bladder volume determination method according to an exemplary embodiment. The method is performed by a computer device, which may be a server or a terminal in the organ volume determination system shown in fig. 1; in the embodiment of the present application, the computer device ultrasonically examines the urine volume in the bladder, thereby avoiding the susceptibility to personal experience and the errors inherent in estimating bladder volume. As shown in fig. 4, the bladder volume determination method may include the following steps:
Step 401, the bladder position is determined. The patient in the target region is imaged by a designated device installed at a designated position (such as a camera installed directly above the human body; optionally, the camera may be a depth camera), and a standard anatomical map of the human body is deformed and adapted via key points on the body surface. The fitted human anatomical map is then segmented by region, and the bladder position obtained by segmentation serves as the final approximate position of the bladder.
Step 402, the bladder region is scanned. Starting from the navel, the probe is moved along the midline of the body through the bladder region determined in step 401.
Step 403, acquiring ultrasonic images and posture information. During scanning, the bladder contour is extracted synchronously from each ultrasonic image (the bladder image is segmented from the ultrasonic image; no specific method is required as long as the segmentation succeeds), the probe posture is recorded by the IMU, and the body-surface position of the probe is recorded by the depth camera above the human body.
Step 404, the probe rotation and displacement information is calculated. The rotation and position information of the probe is resolved from the depth camera (directly above the human body) and the IMU posture data. A 9-axis IMU is arranged in the probe; the angle acquired by the gyroscope sensor and the angle acquired by the magnetic sensor are fused by Kalman filtering to obtain the real rotation information of the probe, and the acceleration sensor data is integrated in real time to obtain the movement displacement of the probe. Meanwhile, the probe is detected in the RGB image of the depth camera (any image detection algorithm may be adopted; no specific algorithm is required), and the depth value at the detected probe position, obtained from the depth camera, is converted into specific probe position information.
Step 405, coordinate transformation. Through the probe information calculated in step 404, the bladder contour information extracted in step 403 is transformed into a world coordinate system, and the specific implementation manner of step 405 is similar to that shown in the embodiment of fig. 3, and is not described again here.
Step 406, calculate bladder volume. Fitting the scanned bladder contour sections to obtain a bladder three-dimensional form, and calculating the bladder volume according to the three-dimensional form.
According to the extracted contour points, spline fitting is performed on all scanned contour points in the direction perpendicular to the contour interface to obtain the three-dimensional shape of the bladder. The bladder volume is then calculated by integrating the fitted bladder in the vertical direction.
Fig. 5 is a block diagram showing the structure of an organ volume determination apparatus according to an exemplary embodiment. The device comprises:
an ultrasound image obtaining module 501, configured to obtain each ultrasound image and a probe pose when a target probe acquires the ultrasound image;
a coordinate transformation module 502, configured to transform the organ contour points in each ultrasound image into a world coordinate system based on the pose of each probe, so as to obtain contour coordinates of each organ;
an organ volume obtaining module 503, configured to fit the target organ in the world coordinate system according to the contour coordinates of each organ to obtain a volume of the target organ.
In one possible implementation manner, the probe pose includes position information and posture information of the target probe when acquiring the ultrasound image;
the ultrasonic image acquisition module is also used for,
acquiring the ultrasonic images and the time information when the target probe acquires the ultrasonic images;
acquiring position information, which is acquired by a depth camera and acquired by the target probe when the target probe acquires each ultrasonic image, according to the time information when the target probe acquires each ultrasonic image;
and calculating attitude information of the target probe when acquiring each ultrasonic image according to the gyroscope sensor data of the target probe.
In one possible implementation, the apparatus further includes:
the human body image acquisition module is used for acquiring a target human body image;
and the target position determining module is used for performing deformation fitting on the target human body image and a standard anatomical map, and determining the position of the target organ so as to indicate the target probe to scan the position of the target organ.
In a possible implementation manner, the coordinate transformation module further includes:
the middle coordinate acquisition unit is further used for acquiring the middle coordinates of the organ contour points in each ultrasonic image in a probe coordinate system according to the depth information of the ultrasonic images;
and the contour coordinate acquisition unit is used for converting each intermediate coordinate to the world coordinate system based on the probe pose to obtain each organ contour coordinate.
In a possible implementation manner, the contour coordinate obtaining unit is further configured to,
generating a position conversion matrix according to the target pose information and the initial pose information of the target probe; the position conversion matrix comprises rotation information and position information of the target probe;
and converting the target intermediate coordinate into a world coordinate system according to the position conversion matrix so as to obtain the target organ contour coordinate indicated by the target ultrasonic image.
In one possible implementation, the organ volume obtaining module is further configured to,
according to the contour coordinates of each organ, spline fitting is carried out from an interface vertical to a target contour, and the shape of the target organ is obtained;
and carrying out integral processing on the shape of the target organ to obtain the volume of the target organ.
In one possible implementation, the organ volume obtaining module is further configured to,
acquiring contour coordinates of each target organ on the target contour interface;
and respectively taking the contour coordinates of the target organ in the direction of the specified line segment as a starting point and an end point, and fitting the contour coordinates of each organ into each closed graph through splines so as to form the shape of the target organ.
In summary, when the volume of the target organ needs to be detected, the target probe can perform multiple ultrasonic detections on the region where the organ is located to obtain ultrasonic images acquired under different probe poses. The computer device transforms the organ contour points acquired under the different probe poses into a world coordinate system to obtain the contour coordinates of each organ, and the contour coordinates obtained under the different poses can be fitted to recover the target organ, so that the computer can integrate the shape of the target organ to obtain its volume. In this scheme, the target organ is ultrasonically detected from different probe poses to calculate organ contours in different directions, so that the target organ is better fitted for volume calculation; the influence of personal experience is avoided, and the accuracy of organ volume detection is improved.
Referring to fig. 6, a schematic diagram of a computer device according to an exemplary embodiment of the present application is shown. The computer device includes a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the above method.
The processor may be a Central Processing Unit (CPU). The Processor may also be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, or a combination thereof.
The memory, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the methods of the embodiments of the present invention. The processor executes various functional applications and data processing of the processor by executing non-transitory software programs, instructions and modules stored in the memory, that is, the method in the above method embodiment is realized.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor, and the like. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be coupled to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In an exemplary embodiment, a computer readable storage medium is also provided for storing at least one computer program, which is loaded and executed by a processor to implement all or part of the steps of the above method. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A method of organ volume determination, the method comprising:
acquiring each ultrasonic image and the probe pose when the target probe acquires the ultrasonic image;
based on the position and the posture of each probe, converting the organ contour points in each ultrasonic image into a world coordinate system to obtain contour coordinates of each organ;
and fitting the target organ in the world coordinate system according to the contour coordinates of each organ to obtain the volume of the target organ.
2. The method of claim 1, wherein the probe pose comprises position information and pose information of a target probe at the time of acquiring the ultrasound image;
the probe position and posture when acquiring each ultrasonic image and the ultrasonic image is collected by the target probe comprises:
acquiring the ultrasonic images and the time information when the target probe acquires the ultrasonic images;
acquiring position information, which is acquired by a depth camera and acquired by the target probe when the target probe acquires each ultrasonic image, according to the time information when the target probe acquires each ultrasonic image;
and calculating attitude information of the target probe when acquiring each ultrasonic image according to the gyroscope sensor data of the target probe.
3. The method of claim 2, further comprising:
acquiring a target human body image;
and performing deformation adaptation on the target human body image and a standard anatomical map, and determining the position of the target organ so as to instruct the target probe to scan the position of the target organ.
4. The method according to any one of claims 1 to 3, wherein the transforming the organ contour points in each ultrasound image into a world coordinate system based on each probe pose to obtain each organ contour coordinate comprises:
acquiring the middle coordinates of the organ contour points in each ultrasonic image under a probe coordinate system according to the depth information of the ultrasonic images;
and converting each intermediate coordinate into the world coordinate system based on the probe pose to obtain each organ contour coordinate.
5. The method of claim 4, wherein said transforming said respective intermediate coordinates into said world coordinate system based on said probe pose to obtain respective organ contour coordinates comprises:
generating a position conversion matrix according to the target pose information and the initial pose information of the target probe; the position conversion matrix comprises rotation information and position information of the target probe; the target pose information is used for indicating the probe pose when the target probe acquires a target ultrasonic image;
and converting the target intermediate coordinate into a world coordinate system according to the position conversion matrix so as to obtain the target organ contour coordinate indicated by the target ultrasonic image.
6. The method of any one of claims 1 to 3, wherein said fitting said target organ to obtain a volume of the target organ based on respective organ contour coordinates comprises:
according to the contour coordinates of each organ, spline fitting is carried out from an interface vertical to a target contour, and the shape of the target organ is obtained;
and carrying out integral processing on the shape of the target organ to obtain the volume of the target organ.
7. The method of claim 6, wherein said obtaining the shape of the target organ by spline fitting from perpendicular to the target contour interface according to each organ contour coordinate comprises:
acquiring contour coordinates of each target organ on the target contour interface;
and respectively taking the contour coordinates of the target organ in the direction of the specified line segment as a starting point and an end point, and fitting the contour coordinates of each organ into each closed graph through splines so as to form the shape of the target organ.
8. An organ volume determination apparatus, the apparatus comprising:
the ultrasonic image acquisition module is used for acquiring each ultrasonic image and the probe pose when the target probe acquires the ultrasonic image;
the coordinate conversion module is used for converting the organ contour points in each ultrasonic image into a world coordinate system based on the pose of each probe to obtain the contour coordinates of each organ;
and the organ volume acquisition module is used for fitting the target organ in the world coordinate system according to the contour coordinates of each organ so as to acquire the volume of the target organ.
9. A computer device comprising a processor and a memory, said memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by the processor to implement the organ volume determination method according to any one of claims 1 to 7.
10. A computer-readable storage medium having stored therein at least one instruction, which is loaded and executed by a processor, to implement the organ volume determination method according to any one of claims 1 to 7.
CN202210065840.0A 2022-01-20 2022-01-20 Organ volume determination method, device, equipment and storage medium Pending CN114399499A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210065840.0A CN114399499A (en) 2022-01-20 2022-01-20 Organ volume determination method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210065840.0A CN114399499A (en) 2022-01-20 2022-01-20 Organ volume determination method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114399499A true CN114399499A (en) 2022-04-26

Family

ID=81232142

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210065840.0A Pending CN114399499A (en) 2022-01-20 2022-01-20 Organ volume determination method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114399499A (en)

Similar Documents

Publication Publication Date Title
AU2017292642B2 (en) System and method for automatic detection, localization, and semantic segmentation of anatomical objects
KR101864380B1 (en) Surgical image data learning system
Droste et al. Automatic probe movement guidance for freehand obstetric ultrasound
KR102298412B1 (en) Surgical image data learning system
EP3477589B1 (en) Method of processing medical image, and medical image processing apparatus performing the method
US10881353B2 (en) Machine-guided imaging techniques
CN112308932B (en) Gaze detection method, device, equipment and storage medium
US20200029941A1 (en) Articulating Arm for Analyzing Anatomical Objects Using Deep Learning Networks
US11945125B2 (en) Auxiliary photographing device for dyskinesia analysis, and control method and apparatus for auxiliary photographing device for dyskinesia analysis
US10078906B2 (en) Device and method for image registration, and non-transitory recording medium
CA3110581C (en) System and method for evaluating the performance of a user in capturing an image of an anatomical region
US11475568B2 (en) Method for controlling display of abnormality in chest x-ray image, storage medium, abnormality display control apparatus, and server apparatus
KR102213412B1 (en) Method, apparatus and program for generating a pneumoperitoneum model
US11766234B2 (en) System and method for identifying and navigating anatomical objects using deep learning networks
CN114399499A (en) Organ volume determination method, device, equipment and storage medium
WO2022059539A1 (en) Computer program, information processing method, and information processing device
CN114067422A (en) Sight line detection method and device for driving assistance and storage medium
CN115019396A (en) Learning state monitoring method, device, equipment and medium
CN114886459A (en) System, method, device, equipment and medium for collecting ultrasonic operation manipulation data
CN111598904A (en) Image segmentation method, device, equipment and storage medium
CN116687452B (en) Early pregnancy fetus ultrasonic autonomous scanning method, system and equipment
EP4321101A1 (en) Patient motion detection in diagnostic imaging
EP4198997A1 (en) A computer implemented method, a method and a system
Song et al. HoloCV: A Head-Mounted Mixed Reality System for Contactless Vital Signs Monitoring in Medical Emergency Situations
Jingjing et al. Research on multidimensional expert system based on facial expression and physiological parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination